
Marcin Szymanski

Verified Expert in Engineering

Data Engineer and Developer

London, United Kingdom

Toptal member since June 18, 2020

Bio

With 15 years in the data space across multiple industries and continents, Marcin has extensive experience building and managing data teams and creating and maintaining scalable, innovative data platforms. Skilled in data engineering, big data, ETL, BI, streaming, and Python, he is always looking for the most cutting-edge solutions. He is passionate about programming, contributes to open-source software, and thrives in the world of data.

Portfolio

Flatfair
Amazon Web Services (AWS), Looker, Python, Snowflake, Apache Airflow
Nor1
Amazon Web Services (AWS), Spark, SAS, Redshift, Apache Airflow
Yoyo Wallet
Amazon Web Services (AWS), Python, Apache Kafka, Spark, Redshift, Apache Airflow

Experience

  • SQL - 13 years
  • Data Engineering - 13 years
  • ETL - 12 years
  • Python - 6 years
  • Apache Airflow - 3 years
  • Redshift - 3 years
  • Apache Spark - 3 years
  • Hadoop - 3 years

Availability

Part-time

Preferred Environment

Docker, Linux, Git, IntelliJ IDEA

The most amazing...

...pieces of code I've written will still be powering management information systems at leading Polish financial institutions in 2025.

Work Experience

Senior Data Engineer

2019 - 2020
Flatfair
  • Designed the data platform and selected vendors and tools for all components: data warehouse, ETL, and BI.
  • Single-handedly delivered a complete solution comprising Snowflake, Looker, and Airflow on Kubernetes in only four months.
Technologies: Amazon Web Services (AWS), Looker, Python, Snowflake, Apache Airflow

Senior BI Architect | Developer

2017 - 2020
Nor1
  • Designed and developed a scalable, cost-efficient ETL framework using Python, Apache Airflow, and Amazon Redshift.
  • Defined the software delivery process for the data area.
  • Supervised the ETL development by a remote team based in India.
Technologies: Amazon Web Services (AWS), Spark, SAS, Redshift, Apache Airflow

Data Engineer

2018 - 2019
Yoyo Wallet
  • Implemented, within only a few weeks, a highly efficient and innovative ETL framework using Python, Apache Airflow, and Amazon Redshift.
  • Implemented an innovative real-time customer segmentation engine based on Spark Structured Streaming.
Technologies: Amazon Web Services (AWS), Python, Apache Kafka, Spark, Redshift, Apache Airflow
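Several of the roles above center on Airflow-based ETL frameworks. At its core, such a framework executes tasks in dependency order over a directed acyclic graph: extract before transform, transform before load. A minimal stdlib-only sketch of that scheduling idea, with purely hypothetical task names (not taken from any of the projects described here):

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical ETL task graph: each task maps to the set of tasks
# it depends on, mirroring how an Airflow DAG wires operators together.
dag = {
    "extract_orders": set(),
    "extract_users": set(),
    "transform_sales": {"extract_orders", "extract_users"},
    "load_redshift": {"transform_sales"},
}

def run_order(graph):
    """Return one valid execution order for the task graph."""
    return list(TopologicalSorter(graph).static_order())

if __name__ == "__main__":
    print(run_order(dag))
```

A real scheduler adds retries, backfills, and parallel execution of independent branches, but the dependency-ordering contract is the same: no task runs before everything it depends on has finished.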

BI Data Engineer

2017 - 2018
Kindred Group
  • Built a high-performance, real-time bet-event processing engine in Spark.
  • Rearchitected several ETL processes for an Oracle-based data warehouse.
Technologies: Apache Kafka, Python, Oracle, Hadoop, Spark

Assistant Director | Product Consultant

2016 - 2017
Moody's Analytics
  • Designed an R language integration for the company’s new flagship credit assessment product.
  • Developed a new SLA monitoring module for the company’s current credit risk management product.
  • Developed various R data migration and processing tools.
  • Provided R training and guidance to other team members.
Technologies: R

MIS Development Team Manager

2014 - 2016
PKO Ubezpieczenia
  • Replaced an Oracle Warehouse Builder ETL with Oracle Data Integrator 12.1.3 and migrated 200 ETL processes within six months with no business disruption; overcame software bugs by escalating them to the highest global support levels.
  • Oversaw projects of 800 to 1,000 man-days per year.
  • Replaced an ETL at PKO Ubezpieczenia (requiring 800 man-days) and implemented the initial phase of the Nordea Life Poland data warehouse (requiring 1,000 man-days).
  • Led data governance at PKO Ubezpieczenia (600 man-days), implementing IBM InfoSphere Information Server to meet regulatory data quality requirements.
  • Managed nearly 40 data warehouse releases, standardizing processes to deliver new business functionality every six weeks.
  • Implemented an IBM Cognos BI platform for a newly established general insurance company.
Technologies: R, Java, Oracle Data Integrator (ODI), Oracle

MIS Development Team Manager

2012 - 2014
Nordea Life & Pensions Poland
  • Improved software delivery and enabled forward planning in line with business expectations by creating corporate data warehouse development strategies for both companies.
  • Expanded a life insurance data warehouse to cover all business areas, creating new data structures and ETL processes to support more users and greater data availability.
  • Reduced Nordea Life Poland’s end-of-month processing time of commission calculations by 95%, from 80 hours to just two through Oracle database optimization—achieving what a major Polish software house had failed to do in 15 months.
Technologies: Java, SAS, Oracle

IT Analyst | Sales Analyst

2010 - 2012
Nordea Life Poland
  • Led an IT workflow and a life insurance data warehouse implementation (ETL, data repository, business intelligence, and hardware).
  • Defined a complex accounting methodology for calculating insurance sales and implemented it in advanced SQL.
  • Acted as a business analyst for a funds and finance workflow of a custom life insurance core system implementation.
Technologies: SAS, Oracle

Analyst

2008 - 2010
Accenture
  • Contributed, as the lead SAS developer, to the effective implementation of a new data warehouse at Poland’s second-largest bank.
  • Developed key components of ETL processes and a planning-and-controlling engine using SAS Data Integration Studio.
  • Assisted three colleagues in gaining SAS 4GL proficiency through detailed guidance and coaching.
Technologies: SAS

Intern

2007 - 2008
Accenture
  • Worked as a business analyst in the design of after-sales processes for a greenfield consumer finance company launched by a major European financial group.
  • Remotely oversaw a Romania-based team of four Oracle developers building the core system interfaces, reviewing and correcting their work as required.
Technologies: Oracle

Projects

RGoogleFit

https://cran.r-project.org/web/packages/RGoogleFit/index.html
A CRAN package providing an R interface to Google Fit.

SAS Enterprise Guide: PROC LIFEREG

https://github.com/ms32035/RGoogleFit
A SAS Enterprise Guide extension for PROC LIFEREG, plus an entire framework for writing EG extensions.

Airflow DAG dependencies

https://github.com/ms32035/airflow-dag-dependencies
One of the most popular Apache Airflow plugins.

Education

2012 - 2013

Post-master Program in Statistical Methods in Business with SAS

University of Warsaw - Warsaw, Poland

2008 - 2008

Participated in a Non-degree Exchange Program in Business Administration

UCLA Anderson School of Management - Los Angeles, CA, USA

2003 - 2008

Master of Arts Degree in Quantitative Methods in Economics and Information Systems

SGH Warsaw School of Economics - Warsaw, Poland

Tools

Apache Airflow, Looker, Terraform

Languages

SQL, SAS, Python, Java, Snowflake, R

Paradigms

ETL

Platforms

Oracle, Oracle Data Integrator (ODI), Azure, Kubernetes, Docker, Linux, Amazon Web Services (AWS), Apache Kafka

Storage

Oracle PL/SQL, Redshift, PostgreSQL, Elasticsearch

Frameworks

Apache Spark, Hadoop

Other

Data Engineering, Statistics
