Pavlo M Kurochka, Developer in Walnut Creek, CA, United States

Verified Expert in Engineering

Data Migration Developer

Location
Walnut Creek, CA, United States
Toptal Member Since
June 18, 2020

Pavlo is an experienced professional who has designed, developed, and deployed data migrations and integrations for dozens of successful projects of various sizes. He knows the inner workings of many ERP, CRM, PLM, billing, and document management systems and applies that experience with the latest tools and technologies. Pavlo gets along great with colleagues; once people get to know him, they want him on their team.

Portfolio

Self-employed
Syniti Data Stewardship Platform (DSP), PL/SQL, SQLAlchemy, CSV File Processing...
BackOffice Associates
PL/SQL, CSV File Processing, SAP LSMW Data Migrations...

Experience

Availability

Full-time

Preferred Environment

PostgreSQL, Microsoft SQL Server, MySQL, Toad, Visual Studio Code (VS Code), Python, Oracle, Excel 365, DuckDB, DBeaver

The most amazing...

...thing I've done was a BOM interface for GE that distilled 1.5 million records down to 3–500 changes. The client used it for two years, far longer than planned.
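The delta logic behind an interface like this can be sketched in a few lines of Python: fingerprint each record, compare two snapshots, and emit only the adds, changes, and deletes. This is a minimal illustration under invented field names, not the actual GE implementation.

```python
import hashlib
import json

def row_hash(row):
    # Stable fingerprint of one record (keys sorted for determinism).
    return hashlib.sha256(json.dumps(row, sort_keys=True).encode()).hexdigest()

def bom_delta(previous, current, key="part_no"):
    """Compare two BOM snapshots and return only the differences.

    The key column and record shape are illustrative, not a real schema.
    """
    prev = {r[key]: row_hash(r) for r in previous}
    curr = {r[key]: row_hash(r) for r in current}
    adds = sorted(k for k in curr if k not in prev)
    deletes = sorted(k for k in prev if k not in curr)
    changes = sorted(k for k in curr if k in prev and curr[k] != prev[k])
    return adds, changes, deletes

old = [{"part_no": "P1", "qty": 2}, {"part_no": "P2", "qty": 1}]
new = [{"part_no": "P1", "qty": 3}, {"part_no": "P3", "qty": 5}]
print(bom_delta(old, new))  # (['P3'], ['P1'], ['P2'])
```

Hashing rows instead of comparing column by column keeps the diff cheap even at millions of records, which is what makes only a handful of changes surface from a 1.5-million-record snapshot.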

Work Experience

Data Engineer

2018 - PRESENT
Self-employed
  • Developed a set of Python programs automating the download of billing and pricing information from vendor APIs and transforming it into monthly Excel reports; maintained history and reference data in a MySQL database.
  • Converted data into Siemens Teamcenter PLM using Oracle database, PL/SQL, and Python scripts.
  • Converted data into SAP ERP using Syniti DSP and MS SQL.
Technologies: Syniti Data Stewardship Platform (DSP), PL/SQL, SQLAlchemy, CSV File Processing, Data Migration, Data-driven Testing, Notepad++, Jupyter Notebook, Data Migration Testing, QuickBooks API, Google Drive API, Oracle ERP, Data Integration, Microsoft Excel, Oracle PL/SQL, Teamcenter, Visual Studio Code (VS Code), Twilio API, SAP, PostgreSQL, MySQL, SQL Server 2014, Oracle, SQL, Pandas, Python 3, ETL, Star Schema, Data Engineering, Data Extraction, Microsoft Access, Oracle EBS, Microsoft SQL Server, ETL Implementation & Design, APIs, Amazon Web Services (AWS), Scripting Languages, ETL Tools, DB, Apache Airflow, NoSQL, RESTful Microservices, Data Analysis
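The transform step of a billing-report pipeline like the one above can be sketched with the standard library alone: roll raw API line items up into per-month totals before writing them out to Excel. The record shape here is illustrative, not a real vendor schema.

```python
from collections import defaultdict
from datetime import datetime

def monthly_totals(records):
    """Aggregate raw billing line items into per-month totals.

    `records` mimics what a vendor API might return; the field
    names are invented for illustration.
    """
    totals = defaultdict(float)
    for rec in records:
        month = datetime.fromisoformat(rec["invoice_date"]).strftime("%Y-%m")
        totals[month] += float(rec["amount"])
    return dict(totals)

sample = [
    {"invoice_date": "2024-01-15", "amount": "120.50"},
    {"invoice_date": "2024-01-20", "amount": "79.50"},
    {"invoice_date": "2024-02-03", "amount": "40.00"},
]
print(monthly_totals(sample))  # {'2024-01': 200.0, '2024-02': 40.0}
```

In practice the aggregated dictionary would feed a pandas DataFrame and `to_excel`, with the raw records archived to MySQL for history.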

Senior Data Migration Consultant

2007 - 2018
BackOffice Associates
  • Performed successful data migrations on multiple system implementation projects for BackOffice Associates clients, including St. Jude Medical (2007-2010), Avnet Technology Solutions (2010-2012), and Avnet Technology Solutions GmbH (2012-2013).
  • Delivered similar data migrations for GE (2014-2015), Sysco (2015), LafargeHolcim (2016), The Kraft Heinz Company (2016), the Municipality of Anchorage (2016-2017), Dimension Data (2017), Rockwell Collins (2017), and Ferring Pharmaceuticals (2017).
  • Worked with the following target systems across these projects: SAP (ECC and CRM), Oracle EBS, Siemens Teamcenter PLM, and Microsoft Dynamics NAV.
  • Facilitated business requirement discussions, data mapping sessions, and cross-functional and technical meetings.
  • Profiled legacy and target data to support the development of transformation specifications. Extracted data from legacy systems into staging databases.
  • Designed, developed, and executed: data transformation and remediation rules, pre-load and post-load data validation reports, cross-reference tables for value mappings, and data construction web pages for data augmentation and creation.
  • Identified and utilized various mass load methods in target business systems to automate data imports, such as SAP LSMW, BDC, Oracle EBS Open Interfaces, and PL/SQL APIs.
  • Orchestrated and implemented data cleansing and de-duplication strategies. Developed system cutover plans.
  • Planned and executed multiple test data load cycles per project, usually 4 to 7, depending on project size and complexity; analyzed and resolved resulting issues and defects.
  • Established, implemented, and supported post-go-live data governance frameworks and processes. Managed onshore/offshore resources.
Technologies: PL/SQL, CSV File Processing, SAP LSMW Data Migrations, Syniti Data Stewardship Platform (DSP), Notepad++, SAP S/4HANA Cloud, SAP Materials Management (MM), SAP Sales and Distribution (SAP SD), SAP FICO, Data Migration Testing, BackOffice Cransoft, BackOffice DSP, Oracle ERP, Data Integration, Microsoft Excel, Oracle PL/SQL, SQL, MySQL, Teamcenter, Microsoft Dynamics NAV, Oracle Database, Salesforce, Siebel, IBM Cognos, Cognos 10, Oracle E-Business Suite (EBS), SAP, Microsoft SQL Server, ETL, Star Schema, Data Extraction, Oracle EBS, Data Engineering, CSV, ETL Implementation & Design, ETL Tools, DB, Data Analysis
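One of the bullet points above mentions cross-reference tables for value mappings. The idea can be sketched with an in-memory SQLite database; the real projects used Oracle and SQL Server, and all table and column names here are illustrative.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
# Legacy data plus a cross-reference ("xref") table that maps legacy
# values to target-system codes.
cur.execute("CREATE TABLE legacy_customers (id INTEGER, country TEXT)")
cur.execute("CREATE TABLE xref_country (legacy_value TEXT, target_value TEXT)")
cur.executemany("INSERT INTO legacy_customers VALUES (?, ?)",
                [(1, "USA"), (2, "Deutschland"), (3, "USA")])
cur.executemany("INSERT INTO xref_country VALUES (?, ?)",
                [("USA", "US"), ("Deutschland", "DE")])
# The staging query applies the mapping; unmapped values surface as
# NULL so a pre-load report can catch them before the load.
rows = cur.execute("""
    SELECT c.id, x.target_value
    FROM legacy_customers c
    LEFT JOIN xref_country x ON x.legacy_value = c.country
    ORDER BY c.id
""").fetchall()
print(rows)  # [(1, 'US'), (2, 'DE'), (3, 'US')]
```

The LEFT JOIN (rather than an inner join) is the important design choice: records with no mapping still reach the report instead of silently dropping out.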

Zoho CRM to HubSpot CRM Data Migration

https://www.useloom.com/share/25a3520e9ec946339d239daa67c43afb
This proof of concept project demonstrates the power and flexibility of the methodology and toolset that I use for data migrations.

Skills: ETL, Microsoft SQL Server, T-SQL, Python, HubSpot, Zoho CRM, Data Migration

Please follow the link for a demo video.
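At its core, the transform step of a CRM-to-CRM migration like this one reshapes source records into the target's payload format. A minimal sketch, with invented field names rather than the actual Zoho or HubSpot schemas:

```python
# Map of source (Zoho-style) field names to target (HubSpot-style)
# property names -- invented for illustration.
FIELD_MAP = {"First_Name": "firstname", "Last_Name": "lastname", "Email": "email"}

def to_hubspot(zoho_record):
    """Reshape one source record into a target-style properties payload."""
    props = {target: zoho_record.get(source, "") for source, target in FIELD_MAP.items()}
    return {"properties": props}

record = {"First_Name": "Ada", "Last_Name": "Lovelace", "Email": "ada@example.com"}
print(to_hubspot(record))
# {'properties': {'firstname': 'Ada', 'lastname': 'Lovelace', 'email': 'ada@example.com'}}
```

Keeping the mapping in a data structure rather than in code is what lets the same script adapt to new object types by editing a table instead of rewriting logic.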

Avnet SAP Data Migration Project

I developed and configured data conversion logic for 15 conversion objects into SAP CRM, SD, and FI.

I executed complex customer data conversion that involved integration with Satori Address Validation software and de-duplication in spreadsheets.

Final Project for the Data Engineering Zoomcamp by DataTalksClub

https://github.com/pavlokurochka/data-engineering-zoomcamp2024-project
I'm excited to share that I've completed the final project for the Data Engineering Zoomcamp by DataTalksClub! During this course, I had the opportunity to dive deep into various aspects of data engineering, including:

• Containerization and Infrastructure as Code: Learned about Docker, docker-compose, and setting up infrastructure on GCP with Terraform.

• Workflow Orchestration: Explored data lakes and workflow orchestration with Mage, which inspired me to research the field more extensively; along the way I discovered Kestra and used it for orchestration in my project.

• Data Warehouse: Gained insights into BigQuery, partitioning, clustering, and BigQuery best practices.

• Analytics Engineering: Delved into the basics of analytics engineering: dbt (data build tool) with both BigQuery and Postgres, dbt models, testing, documentation, deployment to the cloud and locally, and visualizing the data with Google Looker Studio and Metabase. I also researched similar emerging technologies on my own and started using DuckDB and SQLMesh.

• Batch Processing: Learned about batch processing and Spark DataFrames.

By applying these technologies, I was able to tackle my capstone project.

Lightweight Data Pipeline for Company of Heroes 3 Matches

https://github.com/pavlokurochka/data-engineering-zoomcamp2024-project2
I am thrilled to share my latest project: a streamlined approach to building data warehouses.
This open-source project, inspired by the Data Engineering Zoomcamp, showcases the power of a modern data engineering stack. All you need to install locally on your computer is Visual Studio Code. The rest, including cloud storage and computing resources, is free.
Simply create accounts on MotherDuck and GitHub and follow my detailed instructions. Within minutes, you'll have a robust and responsive data warehouse ready to handle your queries.
The example data pipeline transforms some online gaming data, but the described approach could be applied to any data source or subject matter.
Big thanks to Alexey Grigorev and the DataTalksClub team for putting together such an innovative and inspiring course.

Technologies used: Docker, Kestra, MotherDuck, SQLMesh, dlt, Python, SQL, Streamlit

Proof of Concept Framework for Data Migration into Salesforce.com

https://github.com/pavlokurochka/data_migration_sfdc
A single Python script that works as a command-line application (CLI). It achieves results comparable to what some commercial platforms offer while charging hundreds of thousands of dollars in software licenses just to enable the data migration process. The proof of concept could apply to any business system (CRM, ERP, PLM, etc.). It takes a publicly available dataset of US companies maintained by a US government agency, transforms it, and loads it as Accounts into Salesforce.com. I wanted to showcase my ability to get up to speed with the load processes and methods of business systems that are new to me.

It covers all the steps of a proper data migration project:
• Enabling source and target system discovery and establishing connectivity.
• Making source and target table snapshots locally.
• Data profiling and column mapping.
• Generating the code for data transformation rules that create a staging table.
• Producing pre-load reports.
• Loading data into the target system while marking loaded records and capturing any load errors.
• Producing post-load reports with column-by-column comparisons between staged and loaded values.
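The last step, a column-by-column comparison between staged and loaded values, can be sketched as follows. The key column and record shapes are illustrative, not Salesforce's actual field names.

```python
def compare_columns(staged, loaded, key="account_id"):
    """Report every cell where a loaded value differs from its staged value.

    Rows are dicts keyed by an external ID; all names are illustrative.
    Returns tuples of (key, column, staged_value, loaded_value).
    """
    loaded_by_key = {row[key]: row for row in loaded}
    diffs = []
    for row in staged:
        target = loaded_by_key.get(row[key])
        if target is None:
            # Staged record never made it into the target system.
            diffs.append((row[key], None, None, "not loaded"))
            continue
        for col, staged_val in row.items():
            if col != key and target.get(col) != staged_val:
                diffs.append((row[key], col, staged_val, target.get(col)))
    return diffs

staged = [{"account_id": "A1", "name": "Acme", "city": "Reno"}]
loaded = [{"account_id": "A1", "name": "Acme Inc", "city": "Reno"}]
diffs = compare_columns(staged, loaded)
print(diffs)  # [('A1', 'name', 'Acme', 'Acme Inc')]
```

A report like this turns post-load validation from a spot check into an exhaustive reconciliation, which is what makes sign-off after each load cycle defensible.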
Education

1988 - 1993

Master's Degree in Economics, Management

Kyiv National Commerce and Economics University - Kyiv, Ukraine

Certifications

APRIL 2024 - PRESENT

Data Engineering Zoomcamp

DataTalks Club

NOVEMBER 2023 - PRESENT

SQL (Advanced)

HackerRank

APRIL 2023 - PRESENT

Data Engineer

DataCamp

MAY 2020 - PRESENT

Google IT Automation with Python

Google

MARCH 2019 - PRESENT

6.00.1x: Introduction to Computer Science and Programming Using Python

MITx on edX

MARCH 2000 - PRESENT

Oracle Certified Applications Consultant, Procurement Release 11

Oracle

Libraries/APIs

Pandas, Twilio API, Google Drive API, QuickBooks API, SQLAlchemy

Tools

BackOffice Cransoft, Microsoft Excel, Oracle ERP, Oracle E-Business Suite (EBS), Microsoft Access, Notepad++, Apache Airflow, Toad, IBM Cognos, SAP Materials Management (MM), Microsoft Dynamics NAV, GitHub, Terraform, Mage, BigQuery

Languages

Python, SQL, T-SQL (Transact-SQL), Python 3

Paradigms

ETL, ETL Implementation & Design, Data-driven Testing, Management

Storage

Oracle PL/SQL, PostgreSQL, Microsoft SQL Server, Data Integration, DB, MySQL, SQLite, PL/SQL, SAP S/4HANA Cloud, SQL Server 2014, NoSQL, DBeaver

Platforms

Oracle Database, Visual Studio Code (VS Code), Jupyter Notebook, Amazon Web Services (AWS), Google Cloud Platform (GCP), Oracle, Salesforce, Docker, Kestra

Frameworks

SQLMesh, Streamlit

Other

BackOffice DSP, Syniti Data Stewardship Platform (DSP), CSV File Processing, SAP LSMW Data Migrations, Oracle EBS, Data Migration, Data Migration Testing, Star Schema, CSV, Excel 365, Data Engineering, Data Extraction, Scripting Languages, ETL Tools, Data Analysis, DuckDB, SAP FICO, SAP Sales and Distribution (SAP SD), APIs, RESTful Microservices, SAP, Teamcenter, Cognos 10, Siebel, Google BigQuery, Data Build Tool (dbt), Economics, MotherDuck, dltHub, Excel Expert
