Moustafa Sadek, Developer in Eindhoven, Netherlands

Moustafa Sadek

Verified Expert in Engineering

Bio

Moustafa is a highly skilled data analytics engineer with 5+ years of experience in full-stack analytics, from business requirement gathering to data modeling in Databricks, dbt, and BigQuery. He's an expert in Power BI, Looker Studio, ETL, CI/CD with Azure DevOps, and Azure Data Factory pipeline orchestration. Moustafa is an Agile/Scrum practitioner with strong documentation and stakeholder onboarding skills.

Portfolio

SHIMANO EUROPE B.V.
Agile, Agile DevOps, Analytical Thinking, Data Engineering, Data Analysis...
Brightly - Main
Data Analysis, SQL, Excel 365, Python, ETL, Microsoft Power BI, Tableau...
Henkel
Azure Databricks, Data Analysis, Statistics, Microsoft Azure...

Experience

  • Spreadsheets - 7 years
  • Microsoft Power BI - 6 years
  • SQL - 6 years
  • Python - 6 years
  • Data Analytics - 5 years
  • Data Modeling - 4 years
  • Tableau - 3 years
  • Azure Databricks - 2 years

Availability

Part-time

Preferred Environment

Azure Databricks, SQL, Azure, Google BigQuery, Microsoft Power BI, Tableau, ADF, ETL, Business Intelligence (BI), Python

The most amazing...

...things I've developed were 15+ full-stack dashboards for inventory, demand planning, and logistics, enhancing operational efficiency and data visibility.

Work Experience

Senior Data Analytics Engineer

2024 - 2025
SHIMANO EUROPE B.V.
  • Created Azure Data Factory (ADF) pipelines to implement ETL workflows and developed Python data models for advanced analytics and data processing.
  • Built Power BI dashboards for data visualization and reporting.
  • Built SQL views to streamline data access and analysis (see the sketch after the technology list) and monitored data migration projects from JDE to SAP.
Technologies: Agile, Agile DevOps, Analytical Thinking, Data Engineering, Data Analysis, Data Modeling, ETL, Azure, Azure Data Factory (ADF), Python, C#, SQL, Microsoft SQL Server, Visual Studio, Microsoft Power BI, DAX, Looker Studio, Google Cloud, Google Analytics, BigMachines Query Language (BMQL), Business Intelligence (BI), Databricks, Stakeholder Management, Microsoft Office, Excel Expert, Data Visualization, Data Cleaning, Data Analytics, ETL Tools, Dashboards, Data Pipelines, Excel 365, Business Analysis, Azure Data Lake, ADF, Spyder, Data Warehousing, Data Migration, Data Migration Testing, dbt Cloud, Microsoft Power BI, ODBC, Single Sign-on (SSO), COBOL, PowerBuilder, Microsoft, Scala, Oracle, BigQuery, Jupyter, Fivetran, Microsoft Excel, Excel Macros, Financial Modeling
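
To illustrate the SQL view work referenced above, here is a minimal sketch in Python with pyodbc; the DSN, schema, table, and column names are hypothetical placeholders rather than the actual Shimano data model.

    # Illustrative only: creates a reporting view over hypothetical order tables.
    import pyodbc

    VIEW_SQL = """
    CREATE OR ALTER VIEW rpt.v_open_orders AS
    SELECT o.order_id,
           o.order_date,
           c.customer_name,
           SUM(l.quantity * l.unit_price) AS order_value
    FROM sales.orders o
    JOIN sales.order_lines l ON l.order_id = o.order_id
    JOIN sales.customers c ON c.customer_id = o.customer_id
    WHERE o.status = 'OPEN'
    GROUP BY o.order_id, o.order_date, c.customer_name
    """

    conn = pyodbc.connect("DSN=analytics_dw")  # hypothetical DSN
    conn.execute(VIEW_SQL)
    conn.commit()
    conn.close()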

Data Analyst

2024 - 2025
Brightly - Main
  • Specialized in building insightful dashboards using Power BI.
  • Collaborated closely with product owners to gather and refine requirements, translating business needs into actionable metrics.
  • Performed mathematical calculations, developed robust data models, and transformed complex datasets into clear, interactive visualizations.
  • Built over 25 ETL pipelines to extract and model data using dbt and Databricks (see the sketch below), and developed more than seven dashboards in Power BI.
Technologies: Data Analysis, SQL, Excel 365, Python, ETL, Microsoft Power BI, Tableau, Data Build Tool (dbt), Databricks, Azure, Azure Data Factory (ADF), Agile DevOps, Spark, Business Intelligence (BI), SAP, Salesforce Sales Cloud, Microsoft SQL Server, Jupyter Notebook, Microsoft Office, Excel Expert, Data Visualization, Data Cleaning, Data Analytics, ETL Tools, DAX, Dashboards, Data Pipelines, Business Analysis, Azure Data Lake, Stakeholder Management, ADF, Spyder, Data Warehousing, Visual Studio, Data Migration, Data Migration Testing, Looker, dbt Cloud, Microsoft Power BI, ODBC, Single Sign-on (SSO), COBOL, PowerBuilder, Microsoft, Scala, Oracle, BigQuery, Jupyter, Fivetran, Microsoft Excel, Excel Macros, Financial Modeling
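
As an illustration of what one of these pipelines can look like, here is a simplified PySpark transformation of the kind run on Databricks (in practice much of the modeling was expressed as dbt SQL models); all table and column names are hypothetical.

    # Illustrative staging-to-mart step; names are placeholders, not project data.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Read a hypothetical staging table and derive an order-month column.
    orders = (
        spark.read.table("staging.orders")
        .withColumn("order_month", F.date_trunc("month", F.col("order_date")))
    )

    # Aggregate to a reporting-level fact table consumed by Power BI.
    monthly_revenue = (
        orders.groupBy("order_month", "product_id")
        .agg(
            F.sum("net_amount").alias("revenue"),
            F.countDistinct("order_id").alias("order_count"),
        )
    )

    monthly_revenue.write.mode("overwrite").saveAsTable("mart.monthly_revenue")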

Senior Data Analytics Engineer

2020 - 2024
Henkel
  • Developed a data model and a dashboard to monitor the product lifecycle across the logistics network in terms of quantity and cost.
  • Planned features and product backlog items (PBIs) via Azure DevOps, created a data model in Azure Databricks, and presented the dashboard in Microsoft Power BI.
  • Reduced costs by optimizing factors such as truck utilization, which also decreased CO2 emissions.
  • Created a dashboard to monitor forecast accuracy at regional and monthly levels.
Technologies: Azure Databricks, Data Analysis, Statistics, Microsoft Azure, Microsoft Power BI, Tableau, Agile, PySpark, Python, SQL, SSAS Tabular, Big Data, Stakeholder Engagement, Sprint Planning, Agile DevOps, User Acceptance Testing (UAT), Dashboards, SAP, Communication, Diversity, Equity, & Inclusion (DEI), CI/CD Pipelines, Data Pipelines, Data Engineering, Business Intelligence (BI), Image Annotation, Data Transformation, Node.js, PostgreSQL, Machine Learning, Amazon Web Services (AWS), ETL, Data Extraction, Unit Testing, Databricks, Apache Airflow, Data Build Tool (dbt), Microsoft Excel, Excel Add-ins, REST APIs, ECharts, Product Management, SAP ERP, Data Analytics, Data Product Manager, Microsoft Power BI, Data Lakes, Snowflake, Microsoft Fabric, ELT, Financial Planning & Analysis (FP&A), SEC Financial Reporting, Financial Reporting, Jupyter Notebook, Microsoft Office, Excel Expert, Data Visualization, Data Cleaning, Data Analytics, ETL Tools, DAX, Excel 365, Business Analysis, Azure Data Lake, Stakeholder Management, Microsoft SQL Server, Azure Data Factory (ADF), ADF, Spyder, Supply Chain Operations, Optimization, Data Wrangling, Data Warehousing, Visual Studio, Data Migration, Data Migration Testing, Looker, dbt Cloud, ODBC, Single Sign-on (SSO), COBOL, PowerBuilder, Microsoft, Scala, Oracle, BigQuery, Jupyter, Excel Macros

Data Analyst

2019 - 2020
Noon
  • Developed a dashboard to track the number of damaged shipments, capturing each shipment's location and timestamps.
  • Set weekly and monthly KPIs and analyzed them statistically.
  • Created queries and views using SQL, BigQuery, and Data Studio (see the sketch below).
  • Analyzed distribution demand using Python and R to gain insights and facilitate decision-making.
  • Built data models using statistics and programming.
Technologies: SQL, Google BigQuery, Looker Studio, Tableau, Google Sheets, Python, Logistics, Agile DevOps, Google Cloud, Excel 365, Excel VBA, Business Analysis, Forecasting, Complex Problem Solving, eCommerce, Statistical Data Analysis, ETL, Teamwork, IT Project Management, QuickBase, BigQuery, Business Intelligence (BI), Image Annotation, Data Transformation, Google Cloud Platform (GCP), Amazon Web Services (AWS), Data Extraction, Apache Airflow, Data Build Tool (dbt), Microsoft Excel, Excel Add-ins, REST APIs, ECharts, Flask, Product Management, SAP ERP, Data Analytics, Data Product Manager, Microsoft Power BI, Data Lakes, Qlik, Data Engineering, ELT, Financial Planning & Analysis (FP&A), SEC Financial Reporting, Jupyter Notebook, Microsoft Office, Data Visualization, Data Cleaning, Data Analytics, ETL Tools, DAX, Dashboards, Data Pipelines, Azure Data Lake, Stakeholder Management, Google Analytics, Microsoft SQL Server, ADF, Spyder, Supply Chain Operations, Optimization, Data Wrangling, Data Warehousing, Visual Studio, Looker, dbt Cloud, ODBC, Single Sign-on (SSO), COBOL, PowerBuilder, Microsoft, Scala, Oracle, Jupyter, Excel Macros
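
A minimal sketch of the kind of BigQuery query behind these views and KPIs, executed through the Python client library; the dataset, table, and column names are hypothetical.

    # Illustrative only: weekly damaged-shipment counts per hub (placeholder names).
    from google.cloud import bigquery

    client = bigquery.Client()  # assumes application-default credentials

    sql = """
    SELECT
      hub,
      DATE_TRUNC(scan_date, WEEK) AS week,
      COUNTIF(status = 'DAMAGED') AS damaged_shipments,
      COUNT(*) AS total_shipments
    FROM `logistics.shipment_scans`
    GROUP BY hub, week
    ORDER BY week
    """

    weekly_kpis = client.query(sql).to_dataframe()
    print(weekly_kpis.head())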

Experience

Cost-to-Deliver Dashboard

The Cost-to-Deliver Dashboard is a comprehensive tool designed to monitor and manage the cost and quantities of goods within a logistics network. It provides real-time data on the cost of delivering goods, enabling businesses to make informed decisions to optimize their supply chain and reduce overall costs.

The dashboard also tracks the quantities of goods in transit, providing insights into inventory levels and helping to prevent overstocking or understocking. This feature aids in maintaining a balanced inventory, reducing storage costs and waste.

In addition, the dashboard tracks the CO2 emissions associated with the transportation of goods, encouraging businesses to adopt greener logistics practices. Reducing these emissions lowers environmental impact and can also yield cost savings in regions where carbon pricing is in effect.
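
A minimal pandas sketch of the core aggregation behind such a dashboard, rolling cost, quantity, and CO2 up per shipping lane; the input file and column names are hypothetical rather than the actual project data.

    # Illustrative cost-to-deliver rollup; all names are placeholders.
    import pandas as pd

    # Hypothetical shipment-level extract: one row per shipment.
    shipments = pd.read_parquet("shipments.parquet")

    cost_to_deliver = (
        shipments.groupby("lane", as_index=False)
        .agg(
            total_qty=("qty", "sum"),
            total_cost=("freight_cost", "sum"),
            total_co2_kg=("co2_kg", "sum"),
        )
        .assign(
            cost_per_unit=lambda d: d["total_cost"] / d["total_qty"],
            co2_per_unit=lambda d: d["total_co2_kg"] / d["total_qty"],
        )
    )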

Forecast Accuracy Dashboard

The Forecast Accuracy dashboard is a robust tool designed to monitor and analyze the precision of predictions on a regional and monthly basis. It provides a comprehensive view of forecasting performance, enabling users to identify trends, patterns, and deviations over time.

The dashboard is divided into two main sections: regional accuracy and monthly accuracy. The regional accuracy section provides data on the accuracy of forecasts for different regions, allowing for comparative analysis across various geographical areas. This helps identify regions where forecasts are consistently accurate or require further attention. The monthly accuracy section, on the other hand, tracks the accuracy of forecasts on a month-by-month basis. This allows users to observe how forecast accuracy fluctuates over time and can help identify specific months where forecasts tend to be more or less accurate.
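
One common way to express the accuracy figure such a dashboard surfaces is 1 - WMAPE (weighted mean absolute percentage error) per region and month. A minimal pandas sketch, assuming this definition and using hypothetical column names:

    # Illustrative forecast-accuracy calculation; column names are placeholders.
    import pandas as pd

    # Hypothetical extract with one row per region, month, and product.
    df = pd.read_csv("forecast_vs_actuals.csv")
    df["abs_error"] = (df["actual_qty"] - df["forecast_qty"]).abs()

    accuracy = (
        df.groupby(["region", "month"], as_index=False)
        .agg(abs_error=("abs_error", "sum"), actual=("actual_qty", "sum"))
        .assign(forecast_accuracy=lambda d: 1 - d["abs_error"] / d["actual"])
    )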

Data Analyst Work

At Brightly, I served as a data analyst specializing in building insightful dashboards with Power BI. I collaborated closely with product owners to gather and refine requirements, translating business needs into actionable metrics. My responsibilities included performing mathematical calculations, developing robust data models, and transforming complex datasets into clear, interactive visualizations. This work enabled stakeholders to make data-driven decisions and track key performance indicators effectively.

Education

2015 - 2020

Bachelor's Degree in Engineering

Alexandria University - Alexandria, Egypt

Certifications

MAY 2022 - PRESENT

Algorithms and Data Structures MicroMasters Program

Rochester Institute of Technology | via edX

JUNE 2021 - PRESENT

MITx MicroMasters Certificate in Data Analytics

Massachusetts Institute of Technology | via edX

MARCH 2020 - PRESENT

MITx MicroMasters Certificate in Supply Chain Management

Massachusetts Institute of Technology | via edX

Skills

Libraries/APIs

PySpark, NumPy, Pandas, Node.js, REST APIs, ODBC

Tools

Microsoft Power BI, Tableau, Spreadsheets, Google Sheets, BigQuery, Microsoft Excel, Looker, Jupyter, Spyder, Apache Airflow, Salesforce Sales Cloud, Visual Studio, Google Analytics, dbt Cloud

Languages

SQL, Python, Excel VBA, COBOL, PowerBuilder, Snowflake, R, C#, Scala, JavaScript, BigMachines Query Language (BMQL)

Paradigms

ETL, Business Intelligence (BI), Unit Testing, Agile, User Acceptance Testing (UAT)

Platforms

Databricks, Microsoft, Jupyter Notebook, Azure, Anaconda, Google Cloud Platform (GCP), Amazon Web Services (AWS), Microsoft Fabric, Qlik, Oracle, AWS IoT

Storage

QuickBase, PostgreSQL, Data Lakes, Microsoft SQL Server, SSAS Tabular, Data Pipelines, Google Cloud

Frameworks

ADF, Spark, Flask

Other

Azure Databricks, Google BigQuery, Supply Chain Management (SCM), Mathematics, Statistics, Data Modeling, Microsoft Office, Excel Expert, Data Cleaning, Data Analytics, ETL Tools, Data Analysis, Microsoft Azure, DAX, Dashboards, CI/CD Pipelines, Excel 365, Business Analysis, Data Visualization, Data Engineering, Image Annotation, Data Transformation, Data Extraction, Excel Add-ins, ECharts, Product Management, SAP ERP, Data Analytics, Data Product Manager, Microsoft Power BI, ELT, Stakeholder Management, Data Migration, Single Sign-on (SSO), Excel Macros, Industrial Engineering, IT Project Management, Statistical Modeling, Statistical Analysis, Communication, Statistical Programming, Analytical Thinking, Demand Planning, Logistics, Order to Cash (O2C), Supply Chain Operations, Optimization, Big Data, Data Wrangling, Data Warehousing, Stakeholder Engagement, Agile DevOps, SAP, Diversity, Equity, & Inclusion (DEI), Looker Studio, Forecasting, Complex Problem Solving, eCommerce, Statistical Data Analysis, Teamwork, ERD, Machine Learning, Genesys, Data Build Tool (dbt), Data Science, Financial Planning & Analysis (FP&A), SEC Financial Reporting, Financial Reporting, Azure Data Factory (ADF), Azure Data Lake, Data Migration Testing, Fivetran, Financial Modeling, Linear Algebra, Sprint Planning, Call Centers, Causal Inference
