Mabrouk Gadri

Verified Expert in Engineering

Data Warehouse Architect and Developer

Dubai, United Arab Emirates

Toptal member since July 12, 2022

Bio

Mabrouk is a data warehouse architect with twelve years of experience, and he still enjoys his work as much as he did on day one. He has hands-on experience modeling and developing data warehouses on top of leading systems like SQL Server, Google Cloud BigQuery, Snowflake, and Databricks. Mabrouk uses ETL vendors like Informatica and Talend for data integration, as well as Python scripting and Apache Spark. He is skilled in using SQL and the data build tool (dbt) for data transformations.

Portfolio

Fivetran
Fivetran, Data Build Tool (dbt), SQL, Snowflake, BigQuery, Python, Databricks
Veolia
Google Cloud Platform (GCP), BigQuery, SQL, Python, Bash, CI/CD Pipelines...
Aryason
Snowflake, Amazon Web Services (AWS), Python, SQL, Data Modeling, APIs...

Experience

  • SQL - 12 years
  • SQL Server BI - 5 years
  • Spark - 4 years
  • Python - 4 years
  • Snowflake - 3 years
  • BigQuery - 3 years
  • Data Build Tool (dbt) - 3 years
  • Talend ETL - 3 years

Availability

Part-time

Preferred Environment

Google Cloud, Snowflake, SQL, Python, Amazon Web Services (AWS), Azure

The most amazing...

...experience I've had was at L'Oréal, where I designed and developed key parts of its big data platform using Google Cloud BigQuery and Cloud Run.

Work Experience

Solution Engineer

2022 - PRESENT
Fivetran
  • Helped customers develop and optimize their Snowflake SQL queries and dbt models after they had integrated their data with Fivetran.
  • Advised customers on architectural and modeling choices for their modern data platforms.
  • Ran enablement sessions on Fivetran's core offering, dbt, and Snowflake.
Technologies: Fivetran, Data Build Tool (dbt), SQL, Snowflake, BigQuery, Python, Databricks

Lead Data Engineer | Architect

2021 - 2022
Veolia
  • Designed a new event-driven data platform using Google Cloud technologies.
  • Developed ETL pipelines using Google Cloud Run, publish-subscribe (Pub/Sub) messaging, SQL, Dataform, and BigQuery.
  • Audited and optimized cost and query performance on BigQuery and made day-to-day recommendations to users on how to model and query their data.
  • Assisted the development team and power users in upskilling on Google Cloud Platform (GCP) and the new data platform.
  • Designed and implemented a rule evaluation engine that determines which ETL workflows to trigger depending on which data sources were refreshed (see the sketch below).
  • Designed and implemented a CI/CD workflow to secure code deployment from GitHub to Google Cloud.
Technologies: Google Cloud Platform (GCP), BigQuery, SQL, Python, Bash, CI/CD Pipelines, GitHub, Apache Airflow, Terraform, Big Data
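
Below is a minimal sketch of how such a rule evaluation engine can work, assuming each rule maps a workflow to the set of sources it depends on; the workflow and source names are illustrative, not Veolia's actual configuration.

    # Decide which ETL workflows to trigger based on which data
    # sources were refreshed; all names are illustrative assumptions.
    WORKFLOW_RULES = {
        "sales_daily_load": {"erp_orders", "erp_customers"},
        "inventory_refresh": {"warehouse_stock"},
        "finance_reporting": {"erp_orders", "gl_entries"},
    }

    def workflows_to_trigger(refreshed_sources: set[str]) -> list[str]:
        """Return workflows whose source dependencies are fully
        covered by the sources refreshed in the current batch."""
        return [name for name, deps in WORKFLOW_RULES.items()
                if deps <= refreshed_sources]

    # Example: only the ERP orders and customers feeds arrived.
    print(workflows_to_trigger({"erp_orders", "erp_customers"}))
    # -> ['sales_daily_load']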

Data Architect

2020 - 2021
Aryason
  • Designed and developed a packaged cloud data warehouse and BI solution for eCommerce businesses using Snowflake, dbt, and Looker.
  • Modeled the sales and inventory data warehouse using the star schema approach.
  • Developed data pipelines that extract data from the WooCommerce and Shopify eCommerce platforms, refresh the data warehouse, and update Looker dashboards (see the sketch below).
  • Defined and optimized resource allocation, including databases, accounts, and warehouses on Snowflake.
Technologies: Snowflake, Amazon Web Services (AWS), Python, SQL, Data Modeling, APIs, Data Build Tool (dbt), Looker, Apache Airflow, Terraform, Google BigQuery, Big Data
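
As an illustration of the extraction side, here is a sketch of pulling recently updated orders from the Shopify REST Admin API before staging them for the warehouse; the shop name, access token, and API version are placeholder assumptions rather than project details.

    # Pull orders updated since a given timestamp from Shopify's
    # REST Admin API; credentials and version are placeholders.
    import requests

    SHOP = "example-store"             # hypothetical shop
    TOKEN = "shppa_placeholder_token"  # hypothetical access token
    URL = f"https://{SHOP}.myshopify.com/admin/api/2023-10/orders.json"

    def fetch_orders(updated_at_min: str) -> list[dict]:
        resp = requests.get(
            URL,
            headers={"X-Shopify-Access-Token": TOKEN},
            params={"updated_at_min": updated_at_min, "status": "any"},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["orders"]

    orders = fetch_orders("2021-01-01T00:00:00Z")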

Senior Data Engineer

2019 - 2021
L'Oréal
  • Designed and developed data analytics solutions to normalize and unify L'Oréal product master data and optimize the product design process.
  • Developed key modules of the new cloud data platform, featuring a data flow synchronization service and an Airflow DAG generator that orchestrates BigQuery SQL jobs (see the sketch below).
  • Designed and developed semantic layers on Power BI using the DAX language.
  • Modeled the data warehouse and data marts for the L'Oréal product division on SQL Server and BigQuery.
  • Developed Apache Spark pipelines and SQL batches to feed the data warehouse.
Technologies: Apache Spark, Apache Hive, SQL Server 2016, Microsoft Power BI, Google Cloud Platform (GCP), BigQuery, SQL, Python, Apache Airflow, Terraform, PySpark, Big Data
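
A hedged sketch of what such a DAG generator can look like: a plain dictionary describing BigQuery SQL jobs and their dependencies is expanded into Airflow tasks. The job names, SQL, and dependency graph below are invented for illustration.

    # Generate one BigQueryInsertJobOperator per configured SQL job
    # and wire the declared dependencies; the config is illustrative.
    from datetime import datetime
    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    SQL_JOBS = {  # task name -> (SQL statement, upstream task names)
        "stg_products": ("SELECT * FROM raw.products", []),
        "dim_product": ("SELECT * FROM stg.products", ["stg_products"]),
    }

    with DAG("bq_sql_jobs", start_date=datetime(2021, 1, 1),
             schedule_interval="@daily", catchup=False):
        tasks = {
            name: BigQueryInsertJobOperator(
                task_id=name,
                configuration={"query": {"query": sql,
                                         "useLegacySql": False}},
            )
            for name, (sql, _) in SQL_JOBS.items()
        }
        for name, (_, upstream) in SQL_JOBS.items():
            for dep in upstream:
                tasks[dep] >> tasks[name]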

Senior Data Engineer | Architect

2017 - 2019
SFR Group
  • Migrated the group marketing data warehouse from Informatica, Teradata, and SQL Server Analysis Services (SSAS) to a Hadoop, Spark, and Hive-based solution.
  • Used the new tech stack to model data and develop ETL batches based on the new business requirements.
  • Resolved performance problems caused by large data volumes.
  • Maintained the existing legacy ETL workflows and developed Hive SQL batches for loading the Hive- and Hadoop-based data warehouse (see the sketch below).
Technologies: Teradata, SQL Server BI, Informatica ETL, Hadoop, Apache Hive, Spark, Python, Talend ETL, Bash, Big Data
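
A small illustration of the shape of such a batch, assuming PySpark with Hive support; the databases, tables, and columns are invented for the sketch.

    # Run a Hive SQL batch that loads one day of aggregates into a
    # partitioned warehouse table; schema names are assumptions.
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("marketing_dwh_daily_load")
             .enableHiveSupport()     # use the Hive metastore
             .getOrCreate())

    spark.sql("""
        INSERT OVERWRITE TABLE dwh.daily_offer_subscribers
        PARTITION (event_date = '2018-06-01')
        SELECT offer_id,
               COUNT(DISTINCT customer_id) AS subscribers
        FROM staging.subscriptions
        WHERE event_date = '2018-06-01'
        GROUP BY offer_id
    """)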

Senior Data Engineer

2015 - 2017
L'Oréal
  • Modeled the part of the data warehouse that addresses the BI needs of the supply chain business entity.
  • Developed the online analytical processing (OLAP) database for analyzing sales and purchase orders using SSAS and MDX. Trained analysts to query SSAS cubes.
  • Defined security rules for accessing SQL Server tables and OLAP cubes to be applied by BI analysts.
  • Developed and maintained ETL workflows that extract SAP ERP Central Component (ECC) data and load the data warehouse.
Technologies: SQL Server 2012, Informatica ETL, SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), SQL Server Reporting Services (SSRS)

Senior Data Engineer

2013 - 2015
Orange
  • Modeled different data marts for the group's main divisions based on the business requirements in terms of dashboards and an OLAP analysis.
  • Developed ETL workflows and stored procedures to feed the created data marts.
  • Assisted the team in developing and debugging ETL packages.
  • Validated data integrity and accuracy in the data warehouse.
Technologies: SQL Server BI, SQL Server 2012, SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), SQL

Data Engineer

2011 - 2013
Societe Generale
  • Wrote technical specifications and best practices for the investment bank's customer relationship management (CRM) migration solution after gathering business requirements.
  • Developed ETL pipelines that migrate data from each legacy application to the new centralized CRM.
  • Resolved critical production server performance issues caused by large data volumes.
  • Developed dashboards for controlling the integrity and quality of migrated data.
  • Validated the new CRM's data structures and functionality on its screens.
Technologies: SQL Server 2008 R2, SQL, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS)

Database and Dashboard Developer

2011 - 2011
Saur
  • Defined technical specifications for developing complex data-driven reports based on business needs.
  • Developed interactive dashboards and deployed them on a web portal.
  • Assisted business users in developing simple reports and developers in building complex ones, and led training sessions for both groups.
Technologies: SQL Server 2008 R2, SQL, SQL Server Reporting Services (SSRS)

Data Warehouse Developer

2010 - 2011
Manutan
  • Modeled and developed part of the company's analytical database and OLAP cubes for analyzing and controlling performance.
  • Wrote the user manual and technical specification documents for the business intelligence platform.
  • Implemented and maintained ETL pipelines to feed the company data warehouse.
Technologies: SQL Server BI, SQL Server 2008, SQL, SQL Server Analysis Services (SSAS)

Projects

Data Platform for Veolia Recycling Division

I designed and developed a brand-new cloud data platform for Veolia's recycling and waste management division, using BigQuery as the cloud database, SQL as the primary means of transforming data, Cloud Run and Pub/Sub to react to data-arrival events and orchestrate downstream pipelines, and Google Cloud Build to deploy new features following a CI/CD approach. I also trained fellow developers on Google Cloud Platform and shared tips and best practices for modeling and querying data in an OLAP context.
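
A minimal sketch of the event-driven entry point described above, assuming the documented Pub/Sub push-to-Cloud Run pattern: the service decodes a storage notification and starts a BigQuery load job. The bucket, dataset, and table names are placeholder assumptions.

    # Cloud Run service: react to a Pub/Sub push message that signals
    # a file's arrival, then load the file into BigQuery.
    import base64, json
    from flask import Flask, request
    from google.cloud import bigquery

    app = Flask(__name__)
    bq = bigquery.Client()

    @app.route("/", methods=["POST"])
    def handle_event():
        envelope = request.get_json()
        # Pub/Sub push wraps the payload in a base64-encoded envelope.
        msg = json.loads(base64.b64decode(envelope["message"]["data"]))
        uri = f"gs://{msg['bucket']}/{msg['name']}"
        job = bq.load_table_from_uri(
            uri,
            "analytics.raw_events",  # hypothetical target table
            job_config=bigquery.LoadJobConfig(
                source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
                write_disposition="WRITE_APPEND",
            ),
        )
        job.result()  # wait so Pub/Sub only acks successful loads
        return "", 204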

Data Warehouse Modeling on BigQuery

https://mabrouk-gadri.medium.com/rethinking-traditional-data-warehouse-design-with-bigquery-5cd030fdb149
This article rethinks traditional data warehouse design with BigQuery. I conducted an in-depth analysis of BigQuery's features and how to use them when modeling a data warehouse, and the extended piece highlights the advantages and disadvantages of popular data modeling choices and how to strike a balance between them.
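
As one concrete illustration of the trade-offs such an analysis weighs (my example, not taken from the article): BigQuery's nested and repeated fields let a fact table embed its line items, which are then flattened with UNNEST at query time. The dataset and column names are assumptions.

    # Query a fact table whose line items live in a repeated RECORD.
    from google.cloud import bigquery

    client = bigquery.Client()
    sql = """
        SELECT o.order_id,
               item.sku,
               item.quantity * item.unit_price AS line_revenue
        FROM `shop.orders` AS o,
             UNNEST(o.line_items) AS item  -- flatten the nested field
        WHERE o.order_date = '2022-01-01'
    """
    for row in client.query(sql).result():
        print(row.order_id, row.sku, row.line_revenue)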

Snowflake Data Integration Performance Evaluation

https://mabrouk-gadri.medium.com/bringing-data-into-snowflake-part-1-copy-statement-performance-tests-9aad97c96409
This article draws conclusions about Snowflake data loading performance from a series of tests and analyses. The tests ran COPY statements from an Amazon S3 bucket while varying parameters such as warehouse size, file count, and average file size.
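
The sketch below shows the shape of one such timed test, assuming the Snowflake Python connector, an external stage over the S3 bucket, and placeholder connection parameters; warehouse size, file count, and file size would be varied between runs.

    # Time a COPY from an external S3 stage into a Snowflake table;
    # account, credentials, stage, and table names are placeholders.
    import time
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="xy12345",
        user="loader",
        password="***",
        warehouse="LOAD_WH_MEDIUM",  # vary the size between runs
        database="RAW",
        schema="PUBLIC",
    )

    start = time.perf_counter()
    conn.cursor().execute("""
        COPY INTO raw_events
        FROM @s3_events_stage
        FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
    """)
    print(f"COPY took {time.perf_counter() - start:.1f}s")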

Education

2004 - 2009

Master's Degree in Computer Science

National School of Computer Sciences | University of Manouba - Manouba, Tunisia

Certifications

MARCH 2021 - MARCH 2023

Google Cloud Professional Cloud Architect

Google Cloud

Skills

Libraries/APIs

PySpark

Tools

BigQuery, SQL Server BI, Talend ETL, GitHub, Microsoft Power BI, Apache Airflow, Looker, Informatica ETL, Terraform

Languages

SQL, Snowflake, Python, Bash

Frameworks

Apache Spark, Hadoop

Platforms

Databricks, Amazon Web Services (AWS), Azure, Google Cloud Platform (GCP)

Storage

Google Cloud, Databases, PostgreSQL, Apache Hive, SQL Server 2016, Teradata, SQL Server 2012, SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), SQL Server Reporting Services (SSRS), SQL Server 2008 R2, SQL Server 2008

Other

Google BigQuery, Big Data, Software Engineering, Data Build Tool (dbt), Data Modeling, APIs, CI/CD Pipelines, Fivetran
