Ceyhun Kerti, Developer in Istanbul, Turkey

Ceyhun Kerti

Verified Expert in Engineering

Data Engineering Developer

Location
Istanbul, Turkey
Toptal Member Since
March 10, 2022

Ceyhun is a data engineer with experience across different industries and a wide range of technologies. His primary goal is to introduce the latest technologies and methodologies to his customers at minimum cost and with maximum stability. Ceyhun enjoys working in interdisciplinary fields using hybrid technology stacks.

Portfolio

Spotify - Main
Data Engineering, Scala, Java, Python, Cloud, Apache Beam, Dataflow Programming...
Screen Seventeen
Python, Apache Airflow, PostgreSQL, Amazon S3 (AWS S3), Presto...
BBVA Garanti Bank
SQL, PL/SQL, Oracle Data Integrator (ODI), Data Pipelines, REST APIs, Postman...

Experience

Availability

Full-time

Preferred Environment

Python, SQL, ETL, Data Engineering

The most amazing...

...project I've developed is a score calculation platform for sustainable development growth, a step toward a better future for humanity.

Work Experience

Data Engineer

2022 - 2023
Spotify - Main
  • Developed an ad tracking system that measures how much of an ad a unique user listens to and how many ads are played within a piece of content, and calculates related metrics and impressions consumed by downstream systems such as finance and reporting (see the sketch after this entry).
  • Fixed bugs in the system and handled routine small-scale sprint tasks.
  • Prepared RFCs and reference implementations to improve the stability and reliability of the pipelines.
Technologies: Data Engineering, Scala, Java, Python, Cloud, Apache Beam, Dataflow Programming, Google BigQuery, Luigi, Druid.io, Data Pipelines, REST APIs, Postman, Data Integration, API Integration, Apache Spark, Kubernetes
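
A minimal Apache Beam (Python SDK) sketch of the kind of per-user ad-listen aggregation described in this role; Beam appears in the technologies above, but the event fields, pipeline shape, and in-memory source here are illustrative assumptions rather than the actual production pipeline.

```python
# Hypothetical sketch: best ad-completion fraction per (user, ad) pair.
import apache_beam as beam


def to_completion(event):
    """Map a raw playback event to ((user_id, ad_id), fraction of the ad listened to)."""
    fraction = min(event["listened_ms"] / event["ad_length_ms"], 1.0)
    return (event["user_id"], event["ad_id"]), fraction


with beam.Pipeline() as pipeline:
    events = pipeline | "ReadEvents" >> beam.Create([
        {"user_id": "u1", "ad_id": "a1", "listened_ms": 12000, "ad_length_ms": 30000},
        {"user_id": "u1", "ad_id": "a1", "listened_ms": 30000, "ad_length_ms": 30000},
    ])
    (
        events
        | "ToCompletion" >> beam.Map(to_completion)
        | "MaxPerUserAd" >> beam.CombinePerKey(max)  # keep the highest completion per user/ad
        | "Print" >> beam.Map(print)
    )
```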

Senior Data Engineer Consultant

2021 - 2022
Screen Seventeen
  • Developed ETL pipelines from different data providers to AWS S3.
  • Created an SDK for the data science team so they can easily access and wrangle data from different sources.
  • Maintained ETL pipelines in Apache Airflow and helped the data science team to package their flows.
Technologies: Python, Apache Airflow, PostgreSQL, Amazon S3 (AWS S3), Presto, Amazon Web Services (AWS), Data Pipelines, Flask, REST APIs, Postman, Data Integration, Prefect, API Integration, Apache Spark, Kubernetes
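
A minimal Apache Airflow sketch of a provider-to-S3 pipeline like the ones described in this role; the DAG id, bucket, key, and payload are hypothetical placeholders, not the client's actual configuration.

```python
# Hypothetical sketch: daily extract from a data provider staged into S3.
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_upload():
    """Fetch a provider export and stage it in the raw zone (illustrative only)."""
    payload = b"id,value\n1,42\n"  # stand-in for the provider's API response
    boto3.client("s3").put_object(
        Bucket="example-raw-zone",
        Key="provider_x/2022-01-01.csv",
        Body=payload,
    )


with DAG(
    dag_id="provider_x_to_s3",
    start_date=datetime(2022, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_and_upload", python_callable=extract_and_upload)
```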

Senior Data Engineer Consultant

2020 - 2021
BBVA Garanti Bank
  • Created and implemented data pipelines for loan risk reports.
  • Implemented fast-track projects related to customer interactions.
  • Analyzed the underlying infrastructure and existing pipelines and created possible restructuring guidelines.
Technologies: SQL, PL/SQL, Oracle Data Integrator (ODI), Data Pipelines, REST APIs, Postman, Data Integration

Senior Data Engineer Consultant

2020 - 2021
Fiat
  • Created an ETL pipeline between IoT source systems and Azure Blob Storage.
  • Identified problems in the existing ETL pipelines and developed and maintained optimized data pipelines.
  • Processed unstructured data in Azure with Spark and created structured models.
  • Conducted training sessions for employees of the company.
Technologies: Python, Azure, Databricks, Delta Lake, Cassandra, PySpark, Azure Databricks, Data Pipelines, REST APIs, Postman, Data Integration, Apache Spark
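
A minimal PySpark sketch of turning raw IoT payloads from Azure Blob Storage into a structured Delta table, in the spirit of this engagement; the storage paths, column names, and nested schema are illustrative assumptions.

```python
# Hypothetical sketch: structure raw IoT JSON into a Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("iot-structuring").getOrCreate()

raw = spark.read.json("wasbs://raw@exampleaccount.blob.core.windows.net/iot/")

structured = (
    raw.select(
        F.col("deviceId").alias("device_id"),
        F.to_timestamp("eventTime").alias("event_time"),
        F.col("payload.temperature").alias("temperature"),
    )
    .where(F.col("device_id").isNotNull())
)

structured.write.format("delta").mode("append").save("/delta/iot_telemetry")
```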

Full-stack Engineer Consultant

2020 - 2021
ING Bank
  • Implemented a web UI for the internal lending grid optimization tool.
  • Maintained and packaged the application to be easily deployed to different platforms.
  • Demoed and showcased the application to company branches in other countries.
Technologies: JavaScript, Python, Flask, Pandas, Data Pipelines, REST APIs, Postman, Data Integration, API Integration, Apache Spark

Senior Data Engineer Consultant

2020 - 2021
Medical Park
  • Developed ETL pipelines for real-time reporting and BI infrastructure.
  • Implemented data transfer modules between SQL Server and PostgreSQL (see the sketch after this entry).
  • Implemented data ingestion and real-time reporting features in KSQL/Kafka.
Technologies: Python, ETL, PostgreSQL, Oracle, SQL, MySQL, Data Pipelines, REST APIs, Postman, Data Integration, API Integration
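
A minimal sketch of a SQL Server-to-PostgreSQL transfer module like the ones described in this role; the connection strings, table name, and staging schema are illustrative assumptions.

```python
# Hypothetical sketch: chunked copy from SQL Server into a PostgreSQL staging schema.
import pandas as pd
from sqlalchemy import create_engine

mssql = create_engine(
    "mssql+pyodbc://user:pass@source-db/hospital?driver=ODBC+Driver+17+for+SQL+Server"
)
postgres = create_engine("postgresql+psycopg2://user:pass@reporting-db/reporting")

# Copy in chunks so large tables never need to fit in memory.
for chunk in pd.read_sql("SELECT * FROM admissions", mssql, chunksize=10_000):
    chunk.to_sql("admissions", postgres, schema="staging", if_exists="append", index=False)
```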

Senior Data Engineer Consultant

2019 - 2021
ING Bank
  • Developed real-time ETL pipelines for the client.
  • Created dynamic AI model visualization infrastructure in Python with Plotly.
  • Helped to maintain and enhance existing ETL pipelines between the core system and Spark cluster.
Technologies: Python, SQL, ETL, Apache Airflow, Plotly, Data Pipelines, REST APIs, Postman, Data Integration, API Integration, Apache Spark
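
A minimal Plotly sketch of the kind of model-output visualization mentioned in this role; the metric, values, and chart choice are hypothetical placeholders.

```python
# Hypothetical sketch: visualize a model metric across thresholds with Plotly.
import pandas as pd
import plotly.express as px

scores = pd.DataFrame(
    {"threshold": [0.1, 0.3, 0.5, 0.7, 0.9], "precision": [0.42, 0.55, 0.68, 0.80, 0.91]}
)
fig = px.line(scores, x="threshold", y="precision", title="Model precision by threshold")
fig.show()
```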

Senior Data Engineer Consultant

2016 - 2021
CK Energy
  • Implemented high-performance ETL pipelines for the company and its subsidiaries.
  • Managed an outsourced team for the business intelligence and data warehouse tasks.
  • Created web-based management software for the ETL pipelines so that the company can manage all flows centrally for every subsidiary.
  • Optimized existing ETL pipelines and reduced the total ETL time.
Technologies: ETL, SQL, Oracle Data Integrator (ODI), Scala, JavaScript, MySQL, Data Pipelines, REST APIs, Postman, Data Integration

Senior Data Engineer Consultant

2016 - 2021
Aksigorta
  • Developed highly scalable ETL pipelines for the client in SQL, Python, and Oracle Data Integrator (ODI).
  • Migrated the old ETL pipelines to the new data integrator.
  • Preprocessed and masked customers' sensitive healthcare data.
  • Implemented a highly optimized data extraction utility.
  • Provided training sessions in SQL, ODI, and Python to various departments in the company.
  • Optimized the current ETL pipelines and reduced the total ETL time.
  • Implemented a near-real-time ETL pipeline so that the company and its agencies can view reports and insights with only a two-minute delay (see the sketch after this entry).
Technologies: SQL, ETL, Oracle Data Integrator (ODI), Apache Airflow, Python, Apache Sqoop, MapR, MySQL, Data Pipelines, Flask, REST APIs, Postman, Data Integration, API Integration, Apache Spark
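
A minimal sketch of the watermark-based incremental extraction pattern behind a near-real-time pipeline with roughly two-minute latency, as described in this role; the dialects, table, columns, and polling loop are illustrative assumptions, not the client's actual implementation.

```python
# Hypothetical sketch: poll the source for changed rows and push them to the warehouse.
import time
from datetime import datetime

from sqlalchemy import create_engine, text

source = create_engine("oracle+oracledb://user:pass@core-db/?service_name=CORE")
target = create_engine("postgresql+psycopg2://user:pass@dwh-db/dwh")

watermark = datetime(1970, 1, 1)

while True:
    with source.connect() as conn:
        rows = conn.execute(
            text("SELECT id, premium, updated_at FROM policies WHERE updated_at > :wm"),
            {"wm": watermark},
        ).fetchall()
    if rows:
        with target.begin() as conn:
            conn.execute(
                text(
                    "INSERT INTO stg_policies (id, premium, updated_at) "
                    "VALUES (:id, :premium, :updated_at)"
                ),
                [dict(r._mapping) for r in rows],
            )
        watermark = max(r.updated_at for r in rows)
    time.sleep(120)  # poll roughly every two minutes
```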

Senior Data Engineer Consultant

2016 - 2019
Dicle Energy
  • Developed ETL pipelines in SQL and Oracle Data Integrator (ODI).
  • Managed and trained a team of between 10 and 15 people.
  • Implemented a highly optimized data extraction utility.
Technologies: SQL, Python, Data Engineering, ETL, Oracle Data Integrator (ODI), MySQL, Data Pipelines, REST APIs, Postman, Data Integration

Data Engineer

2006 - 2016
Oracle
  • Developed highly scalable big data ETL pipelines for companies in the telecommunications industry, including Vodafone, Turkish Telecom, Avea, and Turkcell.
  • Created both internal and client-facing applications.
  • Mentored junior developers, helping them enjoy software development and produce high-quality applications.
Technologies: Python, SQL, Java, Scala, ETL, Business Intelligence (BI), Data Warehouse Design, Data Warehousing, MySQL, Data Pipelines, Flask, REST APIs, Postman, Data Integration, API Integration, Apache Spark, Data Analysis

Projects

Cross-Platform Plugin-Based Data Transfer Tool

https://github.com/bluecolor/tractor
An open-source data ingestion tool for transferring data to and from various systems. It is an entirely plugin-based tool with a concurrent design. It runs natively on the operating system without requiring an additional runtime, and it can be managed through a CLI, a web app, or a REST API.
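
A minimal Python sketch of the plugin-based design the description above refers to: connectors register themselves under a name, and the engine streams records from an input plugin to an output plugin. This illustrates the idea only; the actual tool at the link may be organized differently.

```python
# Hypothetical sketch: registry-based input/output plugins with a streaming transfer function.
import csv
from typing import Callable, Dict, Iterable, Iterator

INPUTS: Dict[str, Callable[..., Iterator[dict]]] = {}
OUTPUTS: Dict[str, Callable[[Iterable[dict]], None]] = {}


def input_plugin(name):
    def register(fn):
        INPUTS[name] = fn
        return fn
    return register


def output_plugin(name):
    def register(fn):
        OUTPUTS[name] = fn
        return fn
    return register


@input_plugin("csv")
def read_csv(path: str) -> Iterator[dict]:
    with open(path, newline="") as f:
        yield from csv.DictReader(f)


@output_plugin("stdout")
def write_stdout(records: Iterable[dict]) -> None:
    for record in records:
        print(record)


def transfer(source: str, sink: str, **source_args) -> None:
    """Stream records from the named input plugin to the named output plugin."""
    OUTPUTS[sink](INPUTS[source](**source_args))
```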

Online Data Redaction Platform

https://github.com/bluecolor/redact
A platform that discovers and masks or redacts data in different database management systems such as Oracle, MySQL, and SQL Server. It is designed to help companies comply with GDPR requirements and has a Python back end and a web UI for managing the application.
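
A minimal sketch of the discover-then-mask idea behind the platform described above: flag columns whose names look sensitive, then redact their values. The patterns, mask, and sample schema are illustrative assumptions rather than the project's actual rules.

```python
# Hypothetical sketch: name-based discovery of sensitive columns and value redaction.
import re

SENSITIVE_COLUMN_PATTERNS = [r"email", r"phone", r"ssn|national_id", r"birth"]


def discover_sensitive_columns(columns):
    """Return the column names that match a sensitive-data pattern."""
    return [
        col for col in columns
        if any(re.search(pattern, col, re.IGNORECASE) for pattern in SENSITIVE_COLUMN_PATTERNS)
    ]


def redact_row(row, sensitive_columns):
    """Replace sensitive values with a fixed mask."""
    return {k: "***REDACTED***" if k in sensitive_columns else v for k, v in row.items()}


columns = ["customer_id", "email", "phone_number", "balance"]
sensitive = discover_sensitive_columns(columns)  # ['email', 'phone_number']
print(redact_row({"customer_id": 1, "email": "a@b.c", "phone_number": "555", "balance": 10}, sensitive))
```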

Task Scheduler

https://bluecolor.github.io/octopus/
A concurrent task scheduler and workflow platform that is easy to install and manage. It is horizontally scalable, with a strong dependency management engine. Its design dates back to a time when Airflow was not yet on the scene and not as popular as it is today.
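
A minimal Python sketch of the core idea in a dependency-aware concurrent scheduler like the one described above: a task runs as soon as all of its upstream tasks have finished. The task names and thread-pool execution model are illustrative assumptions, not the project's actual implementation.

```python
# Hypothetical sketch: run tasks concurrently once their dependencies are satisfied.
from concurrent.futures import FIRST_COMPLETED, ThreadPoolExecutor, wait


def run_workflow(tasks, dependencies, max_workers=4):
    """tasks: {name: callable}; dependencies: {name: set of upstream task names}."""
    done, running = set(), {}
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        while len(done) < len(tasks):
            # Submit every task whose upstream tasks have all completed.
            for name, fn in tasks.items():
                if name not in done and name not in running and dependencies.get(name, set()) <= done:
                    running[name] = pool.submit(fn)
            finished, _ = wait(running.values(), return_when=FIRST_COMPLETED)
            for name in [n for n, future in running.items() if future in finished]:
                running.pop(name).result()  # re-raise failures
                done.add(name)


run_workflow(
    {
        "extract": lambda: print("extract"),
        "transform": lambda: print("transform"),
        "load": lambda: print("load"),
    },
    {"transform": {"extract"}, "load": {"transform"}},
)
```
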
Education

2001 - 2006

Bachelor's Degree in Computer Science Engineering

Yeditepe University - Istanbul, Turkey

Libraries/APIs

Pandas, REST APIs, Luigi, PySpark

Tools

Apache Airflow, Postman, Plotly, Apache Sqoop, AWS CloudFormation, Apache Beam

Frameworks

Flask, Apache Spark, Presto, Ruby on Rails (RoR)

Languages

Python, Go, JavaScript, SQL, Snowflake, Java, Scala, Bash Script

Paradigms

ETL, ETL Implementation & Design, Data Science, Dataflow Programming, Business Intelligence (BI)

Platforms

Oracle, Oracle Data Integrator (ODI), Amazon Web Services (AWS), Kubernetes, Databricks, MapR, Azure

Storage

Databases, PL/SQL, Druid.io, MySQL, Data Pipelines, Data Integration, PostgreSQL, Amazon S3 (AWS S3), Cassandra

Other

Data Engineering, Scripting, Google BigQuery, Data Build Tool (dbt), API Integration, Data Analysis, Prefect, Delta Lake, Cloud, Azure Databricks, Data Warehouse Design, Data Warehousing
