Tomasz Zielański, Developer in Katowice, Poland

Tomasz Zielański

Verified Expert in Engineering

Data Engineer and Database Developer

Location
Katowice, Poland
Toptal Member Since
January 28, 2022

Tomasz is a data engineer with five years of industry experience. He specializes in building data pipelines and has significant experience with reporting and applying machine learning to analytics in banking and fintech. Tomasz excels at using data to solve business challenges, and he often finds that data engineering is the most significant part of a data solution.

Portfolio

Stermedia
Python 3, Scikit-learn, Pandas, Python, Machine Learning, SQL, Statistics...
Bluesoft
Python 3, Amazon Web Services (AWS), Apache Airflow, Apache Spark...
Vodeno
Python 3, BigQuery, BigTable, Google Cloud, Java 8, Kotlin, Cloud Dataflow...

Experience

Availability

Part-time

Preferred Environment

Python 3, Google Cloud, Amazon Web Services (AWS), Scikit-learn, Apache Airflow, Apache Spark

The most amazing...

...project I've co-developed was a data lake for the very first entirely Google Cloud-based fintech in Europe.

Work Experience

Data Scientist

2021 - PRESENT
Stermedia
  • Developed a machine learning model for startup classification and integrated it with a web application.
  • Conducted technical consultations for prospective clients.
  • Prepared work estimations for machine learning projects for prospective clients.
Technologies: Python 3, Scikit-learn, Pandas, Python, Machine Learning, SQL, Statistics, Data Cleansing, Data Transformation, Jupyter, GitHub
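The startup-classification work above is described only at a high level. A minimal sketch of a comparable binary classifier, using the scikit-learn and pandas stack listed for this role, might look like the following; all features, data, and labels here are synthetic placeholders, not details of the actual project.

```python
# Illustrative sketch: a binary "startup classification" model using the
# scikit-learn/pandas stack listed above. All data and feature names are
# invented placeholders for the example.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 500
df = pd.DataFrame({
    "funding_musd": rng.exponential(5.0, n),   # hypothetical feature
    "team_size": rng.integers(1, 200, n),      # hypothetical feature
    "years_active": rng.integers(0, 15, n),    # hypothetical feature
})
# Synthetic label: larger, better-funded companies are labeled 1.
df["label"] = ((df["funding_musd"] > 4) & (df["team_size"] > 20)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    df.drop(columns="label"), df["label"], test_size=0.2, random_state=0
)

# Scaling plus a linear classifier in one pipeline, so the same object
# can be serialized and called from a web application.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)
accuracy = model.score(X_test, y_test)
```

Wrapping the scaler and classifier in a single `Pipeline` is what makes integration with a web app straightforward: one fitted object handles both preprocessing and prediction.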

Data Engineer

2021 - 2021
Bluesoft
  • Developed a cross-account access solution for a distributed AWS Data Lake architecture.
  • Optimized an Apache Spark pipeline to speed up the processing of 1GB+ datasets.
  • Added new functions to an on-demand data pipeline created on AWS EMR.
Technologies: Python 3, Amazon Web Services (AWS), Apache Airflow, Apache Spark, Amazon Elastic MapReduce (EMR), AWS Glue, Amazon Athena, Amazon S3 (AWS S3), Amazon EC2, Python, Data Lakes, Data Pipelines, Data Engineering, SQL, Data Transformation, REST APIs, APIs, Linux, ETL, GitHub
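Cross-account access in a distributed AWS data lake typically rests on resource policies that let principals in one account read data owned by another. The sketch below builds such an S3 bucket policy as a Python dict; the account ID, role, and bucket names are placeholders, not values from the actual engagement.

```python
# Illustrative sketch of a cross-account S3 access policy, a common
# building block of a distributed AWS data lake. Account IDs, role
# names, and bucket names are hypothetical placeholders.
import json

DATA_BUCKET = "example-datalake-raw"  # hypothetical bucket name
CONSUMER_ROLE_ARN = "arn:aws:iam::222222222222:role/analytics-reader"  # hypothetical role

bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            # Let the consumer account's role list the bucket...
            "Sid": "CrossAccountList",
            "Effect": "Allow",
            "Principal": {"AWS": CONSUMER_ROLE_ARN},
            "Action": "s3:ListBucket",
            "Resource": f"arn:aws:s3:::{DATA_BUCKET}",
        },
        {
            # ...and read the objects stored under it.
            "Sid": "CrossAccountRead",
            "Effect": "Allow",
            "Principal": {"AWS": CONSUMER_ROLE_ARN},
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{DATA_BUCKET}/*",
        },
    ],
}

# boto3's put_bucket_policy expects the policy serialized as JSON.
policy_json = json.dumps(bucket_policy)
```

Note the two-resource split: `s3:ListBucket` applies to the bucket ARN itself, while `s3:GetObject` applies to the `/*` object path, and the consumer role still needs matching allow statements in its own IAM policy.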

Data Engineer

2018 - 2021
Vodeno
  • Took part in designing a GCP data lake from scratch, including table modeling, choosing storage technology, and defining modeling guidelines.
  • Implemented a streaming data pipeline solution in Java 8, Kotlin, and Google Cloud Dataflow (Apache Beam).
  • Designed pipelines for BI reporting, including modeling data structures, implementing data pipelines, and creating dashboards in Google Data Studio.
  • Developed a propensity-to-buy machine learning model.
Technologies: Python 3, BigQuery, BigTable, Google Cloud, Java 8, Kotlin, Cloud Dataflow, Google Data Studio, Scikit-learn, Python, Java, Machine Learning, Google Cloud Platform (GCP), Data Lakes, Data Pipelines, Apache Beam, BI Reporting, Dashboards, Data Structures, Data Engineering, SQL, Data Warehousing, Predictive Modeling, Dashboard Design, Statistics, Data Cleansing, Data Transformation, Jupyter, Linux, ETL, GitHub
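The BI reporting pipelines above combine data cleansing, transformation, and aggregation into dashboard-ready tables. A minimal batch-style sketch of that pattern, shown here with pandas rather than the Java/Dataflow stack actually used, might look like this; the column names and records are invented for illustration.

```python
# Illustrative sketch of a batch transformation feeding a BI dashboard:
# cleanse raw transaction records, then aggregate them into a daily
# reporting table. Columns and data are invented for the example.
import pandas as pd

raw = pd.DataFrame({
    "tx_time": ["2021-03-01 09:15", "2021-03-01 17:40",
                "2021-03-02 11:05", None],
    "account_id": ["A1", "A2", "A1", "A3"],
    "amount": [120.0, -35.5, 80.0, 10.0],
})

# Data cleansing: drop records missing a timestamp, then parse dates.
clean = raw.dropna(subset=["tx_time"]).copy()
clean["tx_date"] = pd.to_datetime(clean["tx_time"]).dt.date

# Data transformation: roll up to one row per date for the dashboard.
daily = (
    clean.groupby("tx_date", as_index=False)
         .agg(tx_count=("amount", "size"),
              total_amount=("amount", "sum"))
)
```

In a streaming setting the same cleanse-then-aggregate shape maps onto Beam/Dataflow transforms with windowed aggregation, with the resulting table landing in BigQuery for Data Studio dashboards.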

ETL Developer

2016 - 2018
ING Bank Śląski
  • Built multiple data pipelines in IBM Infosphere DataStage.
  • Designed snowflake schema structures for a data warehouse on Oracle.
  • Maintained an Oracle data warehouse and solved bugs in collaboration with business teams.
Technologies: Oracle, IBM InfoSphere (DataStage), erwin Data Modeler, ETL, Data Pipelines, Snowflake, Data Warehousing, SQL, Data Structures, Data Engineering, Data Cleansing, Data Transformation, Jupyter, Linux, GitHub

Vodeno Cloud Platform for Banking

https://www.vodeno.com/#platform
A bank-in-the-box platform built on Google Cloud, powering the first bank in Europe hosted solely on GCP. I took part in creating the data lake services from scratch until they were fully operational in production. My focus areas included architecture design, the choice of storage technology, the implementation of the streaming ETL pipeline framework, reporting dashboard design, and predictive modeling.

Languages

Python 3, SQL, Python, Java 8, Kotlin, Java, Snowflake

Tools

Jupyter, BigQuery, GitHub, Apache Airflow, Amazon Elastic MapReduce (EMR), AWS Glue, Amazon Athena, Cloud Dataflow, IBM InfoSphere (DataStage), Apache Beam

Paradigms

ETL

Other

Data Transformation, Data Cleansing, Software Development, Machine Learning, Google Data Studio, Artificial Intelligence (AI), Data Engineering, APIs, erwin Data Modeler, Deep Learning, BI Reporting, Dashboards, Data Structures, Data Warehousing, Architecture, Dashboard Design, Predictive Modeling, Statistics

Libraries/APIs

Scikit-learn, Pandas, REST APIs

Platforms

Amazon Web Services (AWS), Linux, Amazon EC2, Oracle, Google Cloud Platform (GCP)

Storage

BigTable, Google Cloud, Amazon S3 (AWS S3), Data Lakes, Data Pipelines

Frameworks

Apache Spark

Education

2011 - 2016

Master's Degree in Automatics and Robotics

Silesian University of Technology - Gliwice, Poland

Certifications

NOVEMBER 2021 - PRESENT

Deep Learning

DeepLearning.ai

APRIL 2021 - PRESENT

AWS Cloud Practitioner Essentials

Amazon Web Services

MARCH 2020 - PRESENT

Machine Learning

Stanford University

SEPTEMBER 2016 - PRESENT

Certificate of Proficiency in English

University of Cambridge
