Marcin Suszczyński, Developer in Warsaw, Poland

Marcin Suszczyński

Verified Expert in Engineering

Data Warehouse Developer

Location
Warsaw, Poland
Toptal Member Since
June 18, 2020

Marcin is an IT professional with 15+ years of experience in multinational work environments across different industries, using both Agile and Waterfall methods. He specializes in data integration, utilizing market-leading ETL tools as well as open-source and custom solutions. Skilled in very large data warehouse environments based on the Kimball and Data Vault 2.0 methodologies, Marcin has also worked on data integration projects involving semantic web data models.

Portfolio

IT house specializing in the financial sector
Azure, Python, OpenAI, FastAPI, Infrastructure as Code (IaC), Azure App Service...
US Digital Marketing Company
Python, ETL, Azure SQL, Machine Learning Operations (MLOps)...
Self-employed
Windows PowerShell, Microsoft SQL Server, Netezza, Informatica PowerCenter...

Experience

Availability

Full-time

Preferred Environment

Informatica PowerCenter, Python, Netezza, SQL, Data Engineering, Snowflake, Data Build Tool (dbt), Azure, Amazon Web Services (AWS), Google Cloud Platform (GCP)

The most amazing...

...project where I played a central role in creating an AI chatbot that enhances communication and productivity for MS Teams users at one of my clients.

Work Experience

Senior Data Engineer

2023 - 2024
IT house specializing in the financial sector
  • Played a central role in creating an intelligent chatbot that seamlessly interacts with Microsoft Teams channels. The app was written in Python using FastAPI and hosted as a standalone web app in Azure App Service.
  • Employed the infrastructure as code (IaC) approach to manage our Azure resources efficiently. Our IaC templates were automatically deployed and maintained through Azure DevOps pipelines, streamlining the process from development to production.
  • Integrated LangChain into the chatbot, significantly improving natural language understanding and communication capabilities. This enhancement allowed for more context-aware and accurate responses to user queries.
  • Utilized the Azure Bot Framework SDK to seamlessly integrate conversational AI capabilities into Microsoft Teams channels. Developed chatbot features, including natural language understanding, user interactions, and context-aware responses.
Technologies: Azure, Python, OpenAI, FastAPI, Infrastructure as Code (IaC), Azure App Service, Azure Bot Framework, LangChain, Data Pipelines, Azure Functions, Microsoft Azure, Azure Synapse
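
The context-aware behavior described above can be sketched as a small per-channel memory layer. This is an illustrative sketch only: the names (`ChannelMemory`, `generate_reply`, `handle_message`) are hypothetical, and `generate_reply` is a stub standing in for the LangChain/OpenAI call the real app used.

```python
# Hypothetical sketch of a chatbot's context-tracking layer.
# generate_reply is a stub standing in for the LLM call.
from collections import defaultdict, deque


class ChannelMemory:
    """Keeps a short rolling history per Teams channel so replies can be context-aware."""

    def __init__(self, max_turns: int = 10):
        self._history = defaultdict(lambda: deque(maxlen=max_turns))

    def add(self, channel_id: str, role: str, text: str) -> None:
        self._history[channel_id].append((role, text))

    def context(self, channel_id: str) -> list:
        return list(self._history[channel_id])


def generate_reply(context, user_text: str) -> str:
    # Stub in place of the real LLM integration (LangChain + OpenAI).
    return f"echo({len(context)} prior turns): {user_text}"


def handle_message(memory: ChannelMemory, channel_id: str, user_text: str) -> str:
    """Answer using the channel's history, then record both sides of the turn."""
    reply = generate_reply(memory.context(channel_id), user_text)
    memory.add(channel_id, "user", user_text)
    memory.add(channel_id, "assistant", reply)
    return reply
```

The bounded `deque` keeps only the most recent turns per channel, which mirrors the usual pattern of trimming conversation context before passing it to an LLM.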

Azure Data Engineer

2023 - 2023
US Digital Marketing Company
  • Assisted the client in evaluating and replacing their existing data pipelines, built with a typical Python data science stack hosted on Azure MLOps.
  • Developed a working proof-of-concept solution whose central component was a dbt data pipeline orchestrated through Apache Airflow.
  • Collaborated with the client's team to ensure a seamless transition and improved data processing efficiency.
Technologies: Python, ETL, Azure SQL, Machine Learning Operations (MLOps), Data Build Tool (dbt), Azure Functions, Apache Airflow, Data Pipelines, Microsoft Azure
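
The dbt-plus-Airflow orchestration above commonly takes the shape of a small DAG that shells out to the dbt CLI. The sketch below is illustrative, not taken from the engagement: the DAG id, schedule, and project path are assumptions, and it targets the Airflow 2.x API (`schedule`, `airflow.operators.bash`).

```python
# Hypothetical Airflow 2.x DAG running a dbt pipeline; all names/paths are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_run",
    start_date=datetime(2023, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Build the models, then run dbt's data tests against the results.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/project",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/project",
    )
    dbt_run >> dbt_test
```

Running `dbt test` as a downstream task is a common design choice: a failed test fails the DAG run, so bad data never silently flows to consumers.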

Senior ETL and DWH Engineer

2018 - 2023
Self-employed
  • Created a PowerShell tool for automatically generating ETL mappings and workflows for Informatica PowerCenter.
  • Converted ETL mappings and workflows to seamlessly transition from Microsoft SQL Server to Netezza technology for 50+ Scandinavian banks.
  • Developed a Python tool to speed up the data reconciliation process between complex Microsoft SQL Server and Netezza views.
  • Architected ETL pipelines in Azure Data Factory, loading data from different types of source systems (files, Azure SQL databases, and Azure Synapse).
  • Developed Azure Databricks notebooks using SQL and Python, loading data from Parquet files, enriching them, and storing them in Delta Lake.
Technologies: Windows PowerShell, Microsoft SQL Server, Netezza, Informatica PowerCenter, Data Warehousing, Azure Data Factory, Azure Databricks, Data Vaults, Python, SQL, Azure, Data Engineering, User-defined Functions (UDF), Stored Procedure, Microsoft Azure, Data Architecture, Databricks, Data Modeling, APIs, Data Pipelines, Azure Functions, Azure Synapse, Informatica ETL
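
A reconciliation tool like the one mentioned above typically starts by comparing cheap aggregates on both sides before diffing rows. This is a minimal sketch under assumptions: it takes any two DB-API connections, and the function names (`fetch_counts`, `reconcile`) are illustrative; sqlite3 stands in here for the SQL Server and Netezza sources.

```python
# Minimal sketch of cross-database view reconciliation using DB-API connections.
# sqlite3 is used as a stand-in for the real SQL Server / Netezza sources.
import sqlite3


def fetch_counts(conn, view: str, key_col: str):
    """Return (row_count, distinct_key_count) for a view -- a cheap first-pass check."""
    cur = conn.execute(f"SELECT COUNT(*), COUNT(DISTINCT {key_col}) FROM {view}")
    return cur.fetchone()


def reconcile(conn_a, conn_b, view: str, key_col: str) -> bool:
    """True when both sides agree on row count and distinct key count."""
    return fetch_counts(conn_a, view, key_col) == fetch_counts(conn_b, view, key_col)
```

Only views whose aggregates disagree would then need a full (and far more expensive) row-level comparison, which is what makes this staging approach fast on very large views.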

Senior ETL Developer

2017 - 2018
First Data
  • Created new ETL processes using Oracle Data Integrator.
  • Introduced a Git workflow with the usage of the Flyway tool to improve release to the production process.
  • Trained team members on Git and Flyway and how to leverage that for increased productivity.
Technologies: Flyway, Git, Oracle Data Integrator (ODI), Oracle, Data Warehousing, Kimball Methodology, SQL, Data Engineering, User-defined Functions (UDF), Stored Procedure, Data Architecture, Data Modeling, Data Pipelines

Senior ETL Developer and Data Modeler

2015 - 2017
IT Kontrakt
  • Contributed to a data warehouse project for the pharmaceutical industry, integrating (ETL) medical signal data from cloud sources (Salesforce) and a data mining database (Oracle Empirica) into a unified data warehouse. Authored the data model and ETL architecture.
  • Integrated multiple relational systems with a triplestore, a purpose-built database for an integrated document management system. Served as an RDF model contributor and ETL developer.
  • Created a web application for tracking different KPIs in a pharmaceutical company. Authored the data model and worked as a database developer.
Technologies: Talend, Linux, Informatica PowerCenter, Oracle, Data Warehousing, Kimball Methodology, SQL, Data Engineering, User-defined Functions (UDF), Stored Procedure, Data Architecture, Data Modeling, Data Pipelines

ETL Specialist and Data Modeler

2014 - 2015
Ascen
  • Completed the real-time integration of flight and passenger data, including data loads and transformations from complex XML sources into a common multi-dimensional model I authored.
  • Developed custom C# transformations to improve the performance of Microsoft SSIS pipelines.
  • Integrated booking data to a common multi-dimensional booking model I authored.
  • Created an SAP BO data foundation and business layer for reporting passenger and flight data.
  • Contributed to a data integration project for the aviation market. Integrated data from multiple operational sources (relational, XML, WebServices, and IATA TTY) into a single central operational database to support the flight planning management system.
Technologies: SAP BusinessObjects Data Service (BODS), Microsoft SQL Server, SQL Server Integration Services (SSIS), Data Warehousing, SQL, Data Engineering, User-defined Functions (UDF), Stored Procedure, Data Architecture, Data Modeling, Data Pipelines, C#
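
The XML-to-dimensional-model work above boils down to flattening nested source messages into denormalized fact rows. The sketch below shows the idea with Python's standard library; the element names (`flight`, `passenger`, `seat`) and output columns are purely illustrative, not the actual aviation schema.

```python
# Hedged sketch of flattening a nested flight/passenger XML message into
# fact-table rows; element and column names are illustrative only.
import xml.etree.ElementTree as ET


def flatten_flight(xml_text: str) -> list:
    """Emit one row per passenger, with flight attributes denormalized onto each."""
    root = ET.fromstring(xml_text)
    flight_no = root.findtext("number")
    departure = root.findtext("departure")
    return [
        {
            "flight_no": flight_no,
            "departure": departure,
            "passenger": pax.findtext("name"),
            "seat": pax.findtext("seat"),
        }
        for pax in root.iter("passenger")
    ]
```

Denormalizing the parent flight attributes onto every passenger row is the standard move when loading hierarchical XML into a multi-dimensional model, since fact tables need flat, uniformly keyed rows.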

ETL Specialist

2010 - 2014
Hewlett Packard
  • Served on the third-line support team responsible for an enterprise data warehouse for a large customer, one of the world's FMCG leaders.
  • Resolved incidents in daily batch processing for the data warehouse.
  • Optimized the performance of long-running queries.
  • Completed code reviews of new projects coming to the platform.
  • Extended a platform with new ETL processes.
  • Trained new team members at near- and offshore locations.
Technologies: Oracle Application Express (APEX), Control-M, Bash, Informatica PowerCenter, Oracle, Data Warehousing, SQL, Data Engineering, User-defined Functions (UDF), Stored Procedure, Data Architecture, Data Pipelines

Junior ETL Specialist

2008 - 2010
Hewlett Packard
  • Supported an enterprise data warehouse for a large customer, one of the world's FMCG leaders.
  • Analyzed data flows, found potential data quality issues, and recommended solutions for fixing them.
  • Managed knowledge transfer to the offshore location.
Technologies: Control-M, Bash, Oracle, Data Warehousing, SQL, Data Engineering, User-defined Functions (UDF), Stored Procedure, Data Pipelines

USOS BIRT Connector

I created a library that integrates the University Study-Oriented System (USOS), a student management information system used in 50 Polish universities, with Business Intelligence and Reporting Tools (BIRT), enabling the transition from an old system based on Oracle Reports to a modern BI tool.
2008 - 2011

Master of Science Degree in Informatics

University of Warsaw - Warsaw, Poland

2005 - 2008

Bachelor of Science Degree in Computer Science

Adam Mickiewicz University - Poznan, Poland

JANUARY 2024 - PRESENT

Microsoft Certified: Azure Fundamentals

Microsoft

SEPTEMBER 2022 - PRESENT

Data Vault 2.0

Data Vault Alliance

JANUARY 2014 - PRESENT

Oracle Database 11g Administrator Certified Professional (OCP)

Oracle

OCTOBER 2012 - PRESENT

Oracle Advanced PL/SQL Developer Certified Professional (OCP)

Oracle

JANUARY 2012 - PRESENT

Oracle Application Express Developer Certified Expert (OCE)

Oracle

FEBRUARY 2010 - PRESENT

Oracle Database SQL Certified Expert (OCE)

Oracle

OCTOBER 2008 - PRESENT

ITIL V3 Foundation

ITIL Certified

Libraries/APIs

Pandas

Tools

Informatica PowerCenter, Informatica ETL, Git, Oracle SQL Data Modeler, Control-M, Oracle Application Express (APEX), Flyway, Apache Airflow, Azure App Service

Languages

SQL, Bash Script, Stored Procedure, XML, Python, Bash, C#, Java, Snowflake, C++

Platforms

Oracle, Linux, Oracle Data Integrator (ODI), Azure Synapse, Azure, Unix, Talend, Databricks, Amazon Web Services (AWS), Google Cloud Platform (GCP), Azure Functions, BIRT

Paradigms

Database Design, Business Intelligence (BI), ETL, Kimball Methodology, Scrum

Storage

Netezza, Oracle RDBMS, SQL Server Integration Services (SSIS), PL/SQL, Database Modeling, Databases, Data Pipelines, PostgreSQL, Microsoft SQL Server, DBeaver, Azure SQL, Azure SQL Databases

Frameworks

Windows PowerShell, Azure Bot Framework

Other

Data Warehousing, Data Warehouse Design, Data Engineering, User-defined Functions (UDF), Data Architecture, Data Modeling, Oracle Performance Tuning, Fintech, FMCG, Azure Data Factory, Data Vaults, Microsoft Azure, SAP BusinessObjects Data Service (BODS), Azure Databricks, APIs, Data Build Tool (dbt), Machine Learning Operations (MLOps), OpenAI, FastAPI, Infrastructure as Code (IaC), LangChain, DLL
