Gabriel Rositto, Developer in Buenos Aires, Argentina

Gabriel Rositto

Verified Expert in Engineering

Systems Engineer and Developer

Location
Buenos Aires, Argentina
Toptal Member Since
October 20, 2020

Gabriel is a systems engineer who, since 2011, has held a range of roles in business intelligence and data integration. His expertise spans languages (SQL, Python), paradigms (business intelligence, ETL), and storage technologies (Redshift, PostgreSQL), among others. Driven to understand how things work at the most granular level, Gabriel thrives on solving challenging technical problems and investigating issues that most people dismiss as "too complex" or "not worth it."

Portfolio

Lucky App
PostgreSQL, Linux, Data Visualization, ETL, Data Engineering...
Charles Taylor InsureTech
SQL Server Integration Services (SSIS), Visual Studio, SQL Server 2019...
OLX Group
Linux, Data Visualization, ETL, Data Engineering, Data Warehousing...

Experience

Availability

Part-time

Preferred Environment

Amazon Web Services (AWS), REST, Tableau, Python, SQL

The most amazing...

...solution I've developed was an end-to-end BI setup built single-handedly on AWS, from provisioning the infrastructure all the way to reporting.

Work Experience

Freelance BI Lead

2018 - PRESENT
Lucky App
  • Set up the data warehouse from scratch, including creating and configuring an AWS EC2 instance, writing the pipelines in Python, loading the data into Redshift, and creating the datasets in AWS QuickSight.
  • Provided ongoing support for new requirements and existing processes.
  • Integrated additional sources (Amplitude, AppsFlyer, Zoho CRM, and others) after the initial setup and consolidated them into a single user view in the data warehouse.
  • Developed executive reporting for business investors, illustrating progress on the main KPIs across all business areas.
  • Designed a working prototype of a financing installments model, including payment and user blocking.
Technologies: PostgreSQL, Linux, Data Visualization, ETL, Data Engineering, Data Warehouse Design, Data Warehousing, Google Sheets, Amplitude, Zoho CRM, JSON, REST, Python, Amazon QuickSight, Redshift, Amazon S3 (AWS S3), Amazon EC2, Unix Shell Scripting, Amazon Web Services (AWS), Jira, SQL, Business Intelligence (BI), Database Development, Apache Airflow, Dashboards, Dashboard Development
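The extract-stage-load pattern described above can be sketched in Python. This is a minimal illustration, not the production code: the event fields, table name, S3 URI, and IAM role below are hypothetical, and the real pipelines ran on an EC2 instance against live source APIs.

```python
import csv
import io
import json


def flatten_events(raw_json):
    """Flatten raw API events (e.g., from an Amplitude export) into rows."""
    events = json.loads(raw_json)
    return [
        {
            "user_id": e["user_id"],
            "event_type": e["event_type"],
            "event_time": e["event_time"],
        }
        for e in events
    ]


def to_csv(rows):
    """Serialize rows to CSV for staging in S3 before a Redshift COPY."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["user_id", "event_type", "event_time"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()


def copy_statement(table, s3_uri, iam_role):
    """Build the Redshift COPY command that loads the staged file."""
    return (
        f"COPY {table} FROM '{s3_uri}' "
        f"IAM_ROLE '{iam_role}' FORMAT CSV IGNOREHEADER 1;"
    )
```

QuickSight then reads the loaded tables directly from the Redshift cluster.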

ETL Developer

2020 - 2021
Charles Taylor InsureTech
  • Developed, tested, and reviewed SSIS packages to implement the required ETL setup for the project.
  • Built client-facing metadata objects in Microsoft's Master Data Services and enforced business rules and validations.
  • Worked as part of the development team using Visual Studio, collaborating through Git, and organizing the workload in Azure DevOps and the Microsoft Cloud Suite.
Technologies: SQL Server Integration Services (SSIS), Visual Studio, SQL Server 2019, Azure DevOps, ETL, SQL, Business Intelligence (BI), Data Warehouse Design, Data Warehousing, Data Engineering

Lead Data Engineer

2019 - 2020
OLX Group
  • Migrated existing ETL jobs in Pentaho to a Python and SQL configuration orchestrated by Jenkins jobs.
  • Migrated the visualization layer from MicroStrategy to Tableau as part of a corporate reporting-tool change.
  • Managed the BI team's workload with one direct report by working with Agile (Jira) and providing space for feedback.
  • Developed integration scripts to source from Salesforce, Google Ad Manager, Google Sheets, and other internal systems.
  • Maintained the existing BI setup (AWS EC2 servers, AWS S3 buckets, and Amazon Redshift as the corporate data warehouse) by developing new scripts and fixing existing ones as required.
Technologies: Linux, Data Visualization, ETL, Data Engineering, Data Warehousing, Data Warehouse Design, Jira, Pentaho, MicroStrategy, Amazon S3 (AWS S3), REST, Python, Salesforce, Jenkins, Tableau, Redshift, Unix Shell Scripting, Amazon Web Services (AWS), SQL, Business Intelligence (BI), Database Development, Dashboards, Dashboard Development
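A Pentaho transformation migrated to plain Python and SQL often reduces to an ordered list of statements run under a small job wrapper that Jenkins can invoke. A minimal sketch of that pattern, using SQLite in place of the real warehouse connection; the function and log format are illustrative only:

```python
import sqlite3


def run_job(conn, statements, log):
    """Run a sequence of SQL statements as one job, logging each step.

    Mirrors the Pentaho-to-Python/SQL migration pattern: each former
    transformation becomes an ordered list of SQL statements executed
    against the warehouse connection, so a scheduler like Jenkins only
    needs to call one entry point per job.
    """
    for i, stmt in enumerate(statements, start=1):
        conn.execute(stmt)
        log.append(f"step {i}: ok")
    conn.commit()
```

In production, the same wrapper would catch failures and surface a nonzero exit code so the Jenkins job is marked as failed.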

BI Specialist

2018 - 2019
OLX Group
  • Worked on a project to offload large datasets from Redshift onto a file-based solution built on Hive, Presto, Parquet, and AWS Athena; the project was shelved just before implementation.
  • Developed the two-way integration between Salesforce and Redshift via REST APIs in Python, updating the main Salesforce objects with BI data and reporting sales data back.
  • Enhanced custom fields, formulas, Apex triggers, and Visualforce pages in Salesforce for the sales and operations teams.
  • Assisted the local Latin American BI team with developing and maintaining ETL jobs in Pentaho and reporting changes in MicroStrategy.
Technologies: Linux, Data Visualization, ETL, Data Engineering, Data Warehousing, Data Warehouse Design, JSON, Parquet, Apache Hive, AWS Glue, Amazon Athena, Amazon S3 (AWS S3), Amazon WorkSpaces, Jenkins, Pentaho, MicroStrategy, Redshift, REST, Visualforce, Salesforce Apex, Presto DB, Python, Unix Shell Scripting, Amazon Web Services (AWS), SQL, Business Intelligence (BI), Database Development, Dashboards, Dashboard Development
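The Salesforce side of such an integration updates records through the platform's REST API, a PATCH against an sObject endpoint. Below is a hedged sketch of the request-building step only; the instance URL, API version, object, and field names are hypothetical, and the actual authenticated HTTP call is omitted:

```python
import json


def sf_update_request(instance_url, api_version, sobject, record_id, fields):
    """Build the URL and JSON body for a Salesforce sObject PATCH update.

    Salesforce's REST API updates a single record via:
      PATCH {instance}/services/data/vXX.X/sobjects/<SObject>/<Id>
    """
    url = (f"{instance_url}/services/data/v{api_version}"
           f"/sobjects/{sobject}/{record_id}")
    return url, json.dumps(fields)


def batch_updates(instance_url, api_version, sobject, rows, id_field="sf_id"):
    """Turn warehouse rows into one PATCH request per Salesforce record.

    Each row carries the Salesforce record ID plus the BI fields to push.
    """
    requests_out = []
    for row in rows:
        fields = {k: v for k, v in row.items() if k != id_field}
        requests_out.append(
            sf_update_request(instance_url, api_version, sobject,
                              row[id_field], fields))
    return requests_out
```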

BI Manager

2017 - 2018
Dubizzle
  • Led a team of two, organizing work in Agile sprints (Jira), providing performance feedback, following up on issues, and creating development plans.
  • Met frequently with business stakeholders to gather requirements and plan BI involvement in major business initiatives.
  • Developed quality checks for main business KPIs, including data gaps, variation vs. history, and structured process logging via auxiliary tables.
  • Provided a dedicated dashboard suite for revenue reporting with automatic comparisons against source data and several business rules, guaranteeing 100% accuracy and availability for critical needs.
Technologies: Linux, Data Visualization, ETL, Data Engineering, Data Warehouse Design, Data Warehousing, Amazon EC2, Amazon S3 (AWS S3), Pentaho, Jira, SQL, Python, MicroStrategy, Redshift, AWS Glue, Unix Shell Scripting, Amazon Web Services (AWS), Business Intelligence (BI), Database Development, Dashboards, Dashboard Development
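The quality checks described above, data gaps and variation versus history, can be sketched as two small Python functions. The 30% threshold is an illustrative value, not the one used in production:

```python
from datetime import timedelta


def find_date_gaps(dates):
    """Return the missing days between the min and max of a daily KPI series."""
    present = set(dates)
    day, end = min(dates), max(dates)
    gaps = []
    while day <= end:
        if day not in present:
            gaps.append(day)
        day += timedelta(days=1)
    return gaps


def variation_alert(today_value, history, threshold=0.3):
    """Flag today's KPI if it deviates from the historical mean by more
    than the given relative threshold (here an assumed 30%)."""
    mean = sum(history) / len(history)
    return abs(today_value - mean) / mean > threshold
```

In practice, the results of checks like these were written to auxiliary logging tables for structured follow-up.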

Data Engineer

2015 - 2017
Dubizzle
  • Developed, scheduled, and maintained ETL jobs in Pentaho that fetched data from the local MENA product and loaded it into the global data warehouse solution in Amazon Redshift.
  • Validated the data existing in the global solution against source data and interacted with local teams when issues arose.
  • Worked on fulfilling BI-related requests for local business stakeholders, including product, finance, and marketing teams.
  • Documented the main processes in the local infrastructure and all the related details in Confluence.
Technologies: Linux, ETL, Data Engineering, Data Warehouse Design, Data Warehousing, Confluence, Jira, Analytics, Amazon S3 (AWS S3), Redshift, SQL, MicroStrategy, Pentaho, Unix Shell Scripting, Business Intelligence (BI), Database Development, Dashboards, Dashboard Development, Data Visualization

ETL Developer

2014 - 2015
Citi
  • Developed ETL jobs using Ab Initio and Unix shell scripting.
  • Validated ETL job statuses and results using control tables stored in an Oracle database, and controlled job execution by updating those tables.
  • Deployed the ETL jobs from development into the UAT environment through Unix scripts.
  • Performed validations by querying a test Teradata server before deployment on a live environment.
Technologies: Linux, ETL, Data Engineering, Data Warehouse Design, Data Warehousing, Bash, Unix, Teradata, Oracle, Ab Initio, Unix Shell Scripting, SQL, Business Intelligence (BI), Database Development
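The control-table pattern can be sketched as follows. SQLite stands in for the Oracle database here, and the table, column, and status names are hypothetical:

```python
import sqlite3


def init_control(conn):
    """Create the control table that tracks one status row per ETL job."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS etl_control (job TEXT PRIMARY KEY, status TEXT)")
    conn.commit()


def should_run(conn, job):
    """A job may start only if it has never run or finished its last run."""
    row = conn.execute(
        "SELECT status FROM etl_control WHERE job = ?", (job,)).fetchone()
    return row is None or row[0] in ("SUCCESS", "FAILED")


def mark(conn, job, status):
    """Record a job's status; the scheduler consults this before launching."""
    conn.execute(
        "INSERT OR REPLACE INTO etl_control (job, status) VALUES (?, ?)",
        (job, status))
    conn.commit()
```

Updating the status row by hand is also how an operator can block or release a job without touching the scheduler.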

Teradata Database Administrator (DBA)

2012 - 2013
Visa
  • Collaborated with a team that managed the enterprise data warehouse (DW) set up in Teradata, including user and workload management.
  • Developed ETL jobs using Perl and later migrated them to PowerCenter.
  • Implemented query tuning and performance optimization on data warehouse objects, including indices, explain plans, and space usage.
Technologies: ETL, Data Warehouse Design, Data Warehousing, Control-M, Perl, Informatica PowerCenter, Teradata, SQL, Business Intelligence (BI), Database Development

Programmer Analyst

2011 - 2012
Visa
  • Developed ad-hoc queries in Teradata to solve business requirements and segmentations.
  • Automated recurring requirements via DTS projects and SQL Server 2000.
  • Created MicroStrategy reports for sharing results when required and provided reporting as Microsoft Excel files.
Technologies: Data Analysis, Data Warehouse Design, Data Warehousing, MicroStrategy, Microsoft Excel, Teradata, DTS, Analytics, SQL

Freelance Projects

https://github.com/gabrielrositto217/freelance-work
Over the last two years, I've built a repository of various freelance projects, written primarily in SQL, with company names removed for privacy. All of the code is exclusively authored and owned by me.

Lucky App | Business Executive Dashboard

A set of visualizations brought together in an Amazon QuickSight dashboard, showcasing the evolution of the main KPIs of Lucky App, an offer provider in Egypt.

Some Features of the Setup:
• Data used in the visualizations is pre-aggregated at the day and month levels, with the ability to dynamically change the slice from the dashboard
• Business targets are also displayed when applicable for the user to visualize target achievement quickly
• Results go through thorough quality checks and business reviews before being shared monthly with investors

I developed the dashboard and the underlying database by integrating all the available sources using Python, loading them into an Amazon Redshift cluster, and connecting QuickSight to use the data.
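The day/month pre-aggregation behind the dashboard can be illustrated with a small Python function; the record shape (ISO date string plus a numeric KPI value) is a hypothetical stand-in for the real tables:

```python
from collections import defaultdict


def pre_aggregate(records, grain):
    """Aggregate (date_string, value) KPI records to 'day' or 'month' grain.

    Dates are ISO 'YYYY-MM-DD' strings, so the month key is the first
    seven characters. The dashboard picks the grain dynamically.
    """
    totals = defaultdict(float)
    for day, value in records:
        key = day if grain == "day" else day[:7]  # 'YYYY-MM-DD' -> 'YYYY-MM'
        totals[key] += value
    return dict(totals)
```

In the real setup, the equivalent aggregation ran as SQL in Redshift, and QuickSight read the resulting datasets.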

BI | Salesforce Integration Project

In one of my previous jobs, the requirement was to build a daily load of data from the BI setups of several countries and regions into a unified Salesforce instance. The goal was to give the sales teams timely, accurate stats about their users for upselling and support activities.

The project was delivered successfully while I also handled ongoing BAU requests, which made it challenging and demanded long hours.

Some Challenges and Details:
• Most countries had different business rules and data sources, which required several calls with different regional owners to understand the data model and regional particularities
• Time zone differences (active regions spanned from Buenos Aires to Jakarta) made collaboration difficult, so we kept calls to a minimum and handled most communication over email; the team building the Salesforce instance was a consultancy firm based in India
• The database extract was implemented in Python, orchestrated using Jenkins, and involved sending information via SFTP to a server running the load into Salesforce using Talend
• The data validation and adjustments in Salesforce, together with the business stakeholders, were also part of my responsibility

Languages

SQL, Python, Bash, UML, Perl

Paradigms

Business Intelligence (BI), ETL, Database Development, REST, Object-oriented Programming (OOP), Azure DevOps

Storage

Redshift, Amazon S3 (AWS S3), JSON, Teradata, Databases, Apache Hive, PostgreSQL, SQL Server Integration Services (SSIS)

Other

Data Warehousing, Data Engineering, Data Warehouse Design, MicroStrategy, Data Visualization, Unix Shell Scripting, Dashboards, Dashboard Development, System Design, Data Architecture, Artificial Intelligence (AI), Salesforce Apex, Parquet, Analytics, Amplitude, Data Analysis, SQL Server 2019, FTP

Tools

Microsoft Excel, Amazon QuickSight, Tableau, Jenkins, Jira, Amazon WorkSpaces, Amazon Athena, AWS Glue, Confluence, Ab Initio, Informatica PowerCenter, Control-M, DTS, Google Sheets, Visual Studio, Apache Airflow

Platforms

Pentaho, Amazon EC2, Unix, Linux, AWS Lambda, Salesforce, Oracle, Zoho CRM, Amazon Web Services (AWS)

Frameworks

Presto DB, Visualforce

2007 - 2011

Bachelor's Degree in Systems Engineering

Universidad Tecnológica Nacional - Buenos Aires, Argentina
