
Kirill Chulkov

Verified Expert in Engineering

Data Engineer and Developer

Warsaw, Poland

Toptal member since May 11, 2022

Bio

Kirill is a data engineer and ETL developer with six years of experience developing big data pipelines using Apache Spark, Teradata, Apache Hive, Apache Airflow, and various SAS products. He is proficient in gathering requirements from stakeholders, analyzing data sources, and building complex processes from scratch. Kirill is attentive to detail, customer-oriented, and focused on results.

Portfolio

Sberbank
SQL, Teradata, Apache Hive, Apache Airflow, Oozie, Greenplum, Scala
GlowByte Consulting
SQL, Teradata, SAS, Hadoop, Apache Hive, Lua
Entum ERP
SAP Controlling (CO), ABAP

Experience

  • Teradata - 6 years
  • SQL - 6 years
  • SAS Data Integration (DI) Studio - 4 years
  • Hadoop - 4 years
  • Apache Hive - 4 years
  • Apache Spark - 3 years
  • Agile - 2 years
  • Python - 2 years

Availability

Part-time

Preferred Environment

Windows, IntelliJ IDEA, PyCharm, Teradata SQL Assistant

The most amazing...

...things I've developed are pipelines for the mass personalization of the Sberbank app, which serves 70 million unique monthly users, using Teradata, Spark, and Airflow.

Work Experience

Senior Data Engineer

2020 - 2022
Sberbank
  • Designed, developed, implemented, and supported new and existing data integration jobs using Teradata, Spark, Greenplum, and Airflow for the mass personalization of the Sber online app (see the sketch below).
  • Participated in the migration from Teradata to Hadoop using Hive, Spark, Java, and Scala.
  • Produced ad hoc reports to answer business questions.
  • Troubleshot, diagnosed, and resolved data quality and performance issues.
  • Gathered requirements from stakeholders.
Technologies: SQL, Teradata, Apache Hive, Apache Airflow, Oozie, Greenplum, Scala
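
As a rough illustration of how one such job could be orchestrated, here is a minimal Airflow DAG sketch that submits a daily Spark job. The DAG ID, application path, and connection ID are hypothetical assumptions, not the project's actual code.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

    # Hypothetical daily pipeline: run a Spark feature-building job for the
    # personalization use case. All names below are illustrative assumptions.
    with DAG(
        dag_id="personalization_features_daily",
        start_date=datetime(2021, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        build_features = SparkSubmitOperator(
            task_id="build_features",
            application="/jobs/build_features.py",  # assumed job location
            conn_id="spark_default",
        )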

Senior ETL Developer

2016 - 2020
GlowByte Consulting
  • Performed various ETL tasks using SAS DI Studio as the main tool, building databases with Teradata, Oracle, and Impala and programming in SAS Base, SAS macros, and SQL.
  • Developed macros for validating RWA models, using SAS Decision Manager as the main tool, and programmed in SAS Base, SAS macros, and SQL.
  • Designed and developed infrastructure for RWA calculation and IFRS 9 reports, using SAS ECL as the main tool and Teradata, Hive, and Impala as databases.
  • Programmed in SAS Base, SAS macros, SQL, and Lua and transferred data between Teradata and HDFS using Sqoop (illustrated below).
  • Conducted interviews, supervised junior developers, and wrote the project documentation.
Technologies: SQL, Teradata, SAS, Hadoop, Apache Hive, Lua
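
Sqoop itself is a command-line tool; purely to illustrate the same Teradata-to-HDFS transfer pattern in code, here is a hedged PySpark JDBC sketch. The host, credentials, table, and target path are placeholder assumptions, and this is not the project's actual tooling.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("teradata_to_hdfs").getOrCreate()

    # Read a Teradata table over JDBC. Connection details are placeholders;
    # the actual project used Sqoop for these transfers.
    df = (
        spark.read.format("jdbc")
        .option("url", "jdbc:teradata://td-host/DATABASE=risk")
        .option("driver", "com.teradata.jdbc.TeraDriver")
        .option("dbtable", "risk.exposures")
        .option("user", "etl_user")
        .option("password", "***")
        .load()
    )

    # Land the data on HDFS as Parquet.
    df.write.mode("overwrite").parquet("hdfs:///data/risk/exposures")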

Consultant SAP CO

2012 - 2016
Entum ERP
  • Supervised the full lifecycle of SAP CO implementation in the oil and gas industry.
  • Participated in integration with other modules, including FI, SD, MM, PS, and PM.
  • Maintained the settings of different components, including product cost controlling and profitability analysis.
  • Trained the key users and customer support groups.
  • Led the working group in developing technical specifications and wrote the project documentation.
Technologies: SAP Controlling (CO), ABAP

Projects

Customers' Payments and Money Transfers Prediction for Sberbank Online App

The front-end team provided customers with quick-access buttons for their most likely money transfers or payments. My team's task was to prepare the prediction data.

As a data engineer, I developed and maintained the pipelines, prepared data for the ML models, gathered requirements, analyzed data for new types of operations such as transfers abroad, troubleshot issues, and worked with billions of rows. The application had over 70 million unique customers per month.
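
A minimal PySpark sketch of the kind of feature preparation described above; the table and column names are assumptions for illustration, not the production schema.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("transfer_prediction_features").getOrCreate()

    # Hypothetical source table of customer transfer events.
    transfers = spark.table("ods.transfers")

    # For each customer, count recent transfers per payee so the model (and
    # the app's quick-access buttons) can rank the most likely next transfer.
    features = (
        transfers
        .where(F.col("event_dt") >= F.date_sub(F.current_date(), 90))
        .groupBy("customer_id", "payee_id")
        .agg(
            F.count("*").alias("transfer_cnt"),
            F.max("event_dt").alias("last_transfer_dt"),
        )
    )

    features.write.mode("overwrite").saveAsTable("dm.transfer_candidates")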

Risk-weighted Assets Calculation for VTB

The risk-weighted assets were used to determine the minimum amount of regulatory capital banks must hold to maintain their solvency.

As the lead data engineer on the project, I oversaw more than 20 ETL jobs and several ML model evaluations. Technologies I used on the project included Teradata, SAS DI, SAS ECL, Oracle Database, Hive, Impala, and Sqoop.
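
The core arithmetic behind RWA is straightforward: each exposure is multiplied by a regulatory risk weight, and minimum capital is a fixed ratio of the total. Here is a toy Python illustration with made-up numbers and weights; the production models were far more involved.

    # Illustrative exposures: (amount, regulatory risk weight). The weights
    # here are made up for the example, not actual regulatory assignments.
    exposures = [
        (1_000_000, 0.00),  # e.g., high-grade sovereign debt
        (2_000_000, 0.50),  # e.g., residential mortgages
        (3_000_000, 1.00),  # e.g., unrated corporate loans
    ]

    rwa = sum(amount * weight for amount, weight in exposures)
    minimum_capital = 0.08 * rwa  # Basel minimum total capital ratio of 8%

    print(f"RWA: {rwa:,.0f}")                          # RWA: 4,000,000
    print(f"Minimum capital: {minimum_capital:,.0f}")  # Minimum capital: 320,000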
Education

2005 - 2010

Specialist Diploma in Information Security

National Research Nuclear University MEPhI - Moscow, Russia

Certifications

JANUARY 2021 - PRESENT

Hadoop Ecosystem

SberUniversity

JANUARY 2021 - PRESENT

Apache Spark for Data Engineering Tasks

NewProLab

JULY 2019 - JULY 2021

CCA Spark and Hadoop Developer

Cloudera

OCTOBER 2018 - PRESENT

SAS Certified Base Programmer for SAS 9

SAS

Tools

Teradata SQL Assistant, SAS Data Integration (DI) Studio, IntelliJ IDEA, PyCharm, Oozie, Apache Airflow, Apache Impala, Apache Sqoop

Languages

SQL, SAS, Lua, Scala, Python

Storage

Teradata, Apache Hive, Databases, Greenplum, HBase

Frameworks

Apache Spark, Hadoop, Spark Structured Streaming

Paradigms

ETL, Agile, MapReduce

Platforms

Oracle Database

Other

Data Engineering, Base SAS, SAS ECL, Large Data Sets
