
Yuriy Margulis

Verified Expert in Engineering

Big Data Developer

Location
Los Angeles, CA, United States
Toptal Member Since
June 18, 2020

Yuriy is a data specialist with over 15 years of experience in data warehousing, data engineering, feature engineering, big data, ETL/ELT, and business intelligence. As a big data architect and engineer, he specializes in AWS and Azure frameworks, Spark/PySpark, Databricks, Hive, Redshift, Snowflake, relational databases, tools like Fivetran, Airflow, dbt, and Presto/Athena, and data DevOps frameworks and toolsets.

Portfolio

Databricks
Databricks
Paramount
Spark, PySpark, Scala, Databricks, Snowflake, Apache Airflow, SQL...
Crowd Consulting
Amazon Web Services (AWS), Data Warehouse Design, Data Warehousing...

Experience

Availability

Part-time

Preferred Environment

Amazon Web Services (AWS), Snowflake, Spark

The most amazing...

...project I've grown was the PriceGrabber data warehouse, which I expanded from five to 17 subject areas through many platform changes, writing much of the SQL and other scripting code myself.

Work Experience

Specialist Solutions Architect

2023 - PRESENT
Databricks
  • Joined Field Engineering, supporting the communications and media and entertainment verticals.
  • Provided technical support for field engineers, architects, and customers.
  • Performed data warehousing, data engineering, migrations, and integrations.
Technologies: Databricks

Senior Manager Data Engineering

2020 - 2023
Paramount
  • Built a revenue data mart and added a server-side subject area to the data lake (see the sketch after this entry).
  • Managed a team and oversaw ETL monitoring, optimization, and performance tuning.
  • Represented the data engineering team in the company's architecture guild activities.
Technologies: Spark, PySpark, Scala, Databricks, Snowflake, Apache Airflow, SQL, Amazon Web Services (AWS), Amazon Athena, Data Build Tool (dbt), Google BigQuery
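
A minimal sketch of how a daily data mart load like this might be scheduled in Apache Airflow; the DAG, task, and load function names are hypothetical, not Paramount's actual code:

    # Hypothetical daily-load DAG; all names are illustrative.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def load_revenue_mart(**context):
        # Placeholder for the Spark/dbt-backed load of the revenue data mart.
        print(f"Loading revenue mart for {context['ds']}")

    with DAG(
        dag_id="revenue_mart_daily",
        start_date=datetime(2021, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(task_id="load_revenue_mart", python_callable=load_revenue_mart)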

Co-founder | CEO

2016 - 2023
Crowd Consulting
  • Worked on full data warehouse implementations for multiple clients.
  • Provided big data training and support for consulting partners.
  • Engineered and built an ETL pipeline for an AWS S3 data warehouse using AWS Kinesis, Lambda, Hive, Presto, and Spark; the pipeline was written in Python (see the sketch after this entry).
  • Delivered data warehouses, data lakes, data lakehouses, feature marts, BI systems, migrations, and integrations.
Technologies: Amazon Web Services (AWS), Data Warehouse Design, Data Warehousing, Amazon Athena, Tableau, Luigi, Scala, Python, Amazon S3 (AWS S3), Amazon DynamoDB, MySQL, PostgreSQL, Redshift, AWS Lambda, Apache Hive, Databricks, Spark, Hadoop, Amazon Elastic MapReduce (EMR)
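
A minimal sketch of the Kinesis-to-S3 landing step such a pipeline might start with, written as a Python Lambda handler; the bucket name, key layout, and event_date field are assumptions for illustration:

    # Hypothetical Lambda handler landing Kinesis records in an S3 data lake.
    import base64
    import json
    import uuid

    import boto3

    s3 = boto3.client("s3")
    BUCKET = "example-data-lake"  # illustrative bucket name

    def handler(event, context):
        # Kinesis delivers records base64-encoded inside the Lambda event.
        records = [
            json.loads(base64.b64decode(r["kinesis"]["data"]))
            for r in event["Records"]
        ]
        # Land the micro-batch as one JSON-lines object, partitioned by date
        # (assumes each record carries an event_date field).
        key = f"landing/dt={records[0]['event_date']}/{uuid.uuid4()}.json"
        body = "\n".join(json.dumps(r) for r in records)
        s3.put_object(Bucket=BUCKET, Key=key, Body=body.encode("utf-8"))
        return {"records_written": len(records)}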

Data Engineering Architect

2020 - 2020
CVS Health (via Toptal)
  • Performed ETL and feature engineering for a personalization engine.
Technologies: RAPIDS, Scala, Python, Spark, Databricks, Azure

Data Engineer

2020 - 2020
Maisonette
  • Built a data platform and data lake using Fivetran, dbt, and Databricks.
  • Participated in the development of a BI platform in Looker.
  • Performed CI/CD deployment and operational support.
Technologies: Amazon Web Services (AWS), Fivetran, Looker, Python, Apache Airflow, Snowflake, PostgreSQL

Data Engineer

2019 - 2020
Teespring (via Toptal)
  • Migrated a data warehouse ETL pipeline from Airflow/Redshift to Fivetran, Databricks, and Snowflake (see the sketch after this entry).
Technologies: Amazon Web Services (AWS), APIs, Redshift, Apache Airflow, Python, Spark, Snowflake, Databricks, Fivetran
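
A minimal sketch of the Databricks-to-Snowflake handoff in a migration like this, using the Snowflake Spark connector; the connection options and table names are illustrative, and real credentials would come from a secrets manager:

    # Hypothetical Databricks job writing a curated table to Snowflake.
    sf_options = {
        "sfURL": "example.snowflakecomputing.com",  # illustrative account URL
        "sfUser": "etl_user",
        "sfPassword": "...",  # in practice, pulled from a secret scope
        "sfDatabase": "ANALYTICS",
        "sfSchema": "PUBLIC",
        "sfWarehouse": "ETL_WH",
    }

    # `spark` is the session Databricks provides in a notebook or job.
    orders = spark.table("curated.orders")

    (orders.write
        .format("snowflake")
        .options(**sf_options)
        .option("dbtable", "ORDERS")
        .mode("overwrite")
        .save())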

Data Engineer

2018 - 2019
BCG GAMMA (via Toptal, three contracts)
  • Provided engineering support for data scientists.
  • Designed and built a feature engineering data mart and customer 360° data lake in AWS S3 (see the sketch after this entry).
  • Designed and developed a dynamic S3-to-S3 ETL system in Spark and Hive.
  • Completed various DevOps tasks, including an Airflow installation, development of Ansible playbooks, and history backloads.
  • Worked on a feature engineering project which involved Hortonworks, Spark, Python, Hive, and Airflow.
  • Built a one-to-one marketing feature engineering pipeline in PySpark on Microsoft Azure and Databricks, using ADF, ADL, Databricks Delta Lake, and ADW as sources.
Technologies: Ansible, Boto 3, Apache Airflow, PostgreSQL, Relational Database Services (RDS), AWS Glue, Amazon Athena, Presto, Apache Hive, Spark, Python
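
A minimal sketch of the kind of PySpark aggregation a feature engineering mart like this is built from; the paths, table, and column names are hypothetical:

    # Hypothetical PySpark feature-engineering step for a customer 360 mart.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("feature_mart").getOrCreate()

    # Illustrative source: curated order events in the S3 data lake.
    orders = spark.read.parquet("s3://example-lake/curated/orders/")

    # Derive simple per-customer behavioral features.
    features = orders.groupBy("customer_id").agg(
        F.countDistinct("order_id").alias("order_count"),
        F.sum("order_total").alias("lifetime_value"),
        F.max("order_date").alias("last_order_date"),
    )

    features.write.mode("overwrite").parquet("s3://example-lake/feature-mart/customer/")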

Vice President, Data

2017 - 2018
Enervee
  • Managed the data engineering, BI reporting, and data science teams.
  • Worked as a hands-on data engineer.
  • Built a data lake on AWS.
  • Developed a reporting system with Redash/Presto.
Technologies: Amazon Web Services (AWS), Redash, Apache Airflow, Python, Amazon S3 (AWS S3), Amazon Aurora, MySQL, PostgreSQL, Redshift, Apache Hive, Presto, Spark, Amazon Elastic MapReduce (EMR), Hadoop

Big Data Architect

2016 - 2017
ITG
  • Worked full-time as the data architect for a transaction cost analysis system.
  • Installed a four-node Apache Hadoop/Spark cluster on ITG's private cloud.
  • Conducted a platform POC embedding Apache Spark into ITG's data platform.
  • Supported the development of a platform POC for Kx Kdb+ and converted Sybase IQ queries to the Kdb+ Q language.
Technologies: Q, Kdb+, Informatica, Sybase, Python, Spark, Apache Hive, Hadoop

Data Engineer

2016 - 2017
American Taekwondo Association (via Toptal)
  • Converted data from a legacy Oracle database to a newly designed SQL Server database.
  • Wrote SQL scripts, stored procedures, and Kettle transformations.
  • Administered two databases.
  • Performed extensive data cleansing and validation.
Technologies: Pentaho, Oracle, SQL

Director, Data Warehouse

2015 - 2016
Connexity
  • Managed the data warehouses and BI teams for both PriceGrabber and Shopzilla. Connexity is also known as PriceGrabber, Shopzilla, and BizRate.
  • Handled operational support for the PriceGrabber data warehouse and recovered it after a data center migration.
  • Merged one data warehouse into the other and retired the redundant one. Designed the business and data integration architecture hands-on; developed data validation scripts and ETL integration code. Managed the migration of the BI reporting system from Cognos to OBIEE and Tableau.
  • Defined the technology platform change strategy for the combined data warehouse.
  • Created PL/SQL stored procedures, packages, and anonymous scripts for ETL and data validation.
  • Completed an Amazon Redshift project.
  • Worked on and completed a Cloudera Impala project.
Technologies: Amazon Web Services (AWS), Linux, Python, Perl, Tableau, Oracle Business Intelligence Enterprise Edition 11g (OBIEE), Cognos 10, Impala, Hadoop, Redshift, PL/SQL, Oracle

Director, Data Warehouse

2008 - 2015
PriceGrabber
  • Oversaw the company's data services and defined the overall and technical strategy for the data warehousing, business intelligence, and big data environments.
  • Hired and managed a mixed onshore (US)/offshore (India) engineering team.
  • Replatformed the data warehouse to an Oracle Exadata X3/Oracle ZFS combination and added big data and machine learning components to the data warehousing environment.
  • Supported 24x7x365 operations in compliance with the company's top-level production SLA.
  • Wrote thousands of lines of PL/SQL, PL/pgSQL, MySQL, and HiveQL code.
  • Wrote ETL scripts in Perl, Python, and JavaScript embedded in Kettle.
  • Worked with big data on multiple types of projects (Hadoop, Pig, Hive, and Mahout).
  • Developed tool-based ETL for a Pentaho (Kettle) CE redesign project.
  • Worked on machine learning for various types of projects (Python, SciPy, NumPy, and Pandas).
Technologies: Pentaho, Linux, Python, Perl, MySQL, PostgreSQL, Apache Hive, Apache Pig, Hadoop, Oracle

Director, Data Warehouse

2007 - 2008
Edmunds
  • Managed a data warehouse team and project pipeline; supported operations.
  • Created PL/SQL stored procedures, packages, and anonymous scripts for ETL and data validation.
  • Worked on tool-based ETL for multiple Informatica projects.
Technologies: Linux, Perl, Informatica, Oracle

Manager, Data Warehouse

2003 - 2007
Universal Music Group
  • Managed, developed, and operated a CRM data warehouse.
  • Wrote PL/SQL, MySQL, and Perl code.
  • Administered a Cognos reporting system.
  • Used C# on multiple projects supporting the OLAP reporting system.
  • Designed and developed an MSAS OLAP cube system.
Technologies: Linux, Perl, C#, Cognos 10, MySQL, Microsoft SQL Server, Oracle

Director, Decision Support and Financial Systems

2001 - 2003
MediaLive International
  • Managed a data warehouse, BI, and CRM systems.
  • Assumed responsibility for an Oracle EBS application team.
  • Developed PL/SQL code for data warehouse ETL and Oracle Applications integration.
  • Worked with SQL Server on multiple Transact-SQL and Analysis Services projects.
  • Worked on tool-based ETL for multiple Epiphany EPI*Channel projects.
Technologies: Unix, VB, Microsoft SQL Server, Oracle EBS, Oracle

Senior Principal Consultant (Professional Services, Essbase Practice)

1999 - 2001
Hyperion (now part of Oracle)
  • Led the firm's Essbase consulting practice, covering multiple clients.
  • Developed Essbase satellite systems: relational data warehouses and data marts, reporting systems, ETL systems, CRMs, and EPPs, including ETL into and out of Essbase and within Essbase itself.
  • Worked on multiple PL/SQL projects, providing full support for the team's Oracle project pipeline.
  • Worked with SQL Server on multiple Transact-SQL and Analysis Services projects.
  • Developed tool-based ETL for an Informatica project.
  • Worked on Hyperion Essbase, Enterprise, Pillar, Planning, Financial Analyzer, and VBA projects.
Technologies: Essbase, Hyperion, Informatica, Visual Basic for Applications (VBA), Microsoft SQL Server, Oracle

Languages

Python, SQL, PL/pgSQL, Snowflake, C#, Visual Basic for Applications (VBA), VB, Scala, Q, Perl

Frameworks

Apache Spark, Presto, Hadoop

Tools

Apache Airflow, Amazon Elastic MapReduce (EMR), Amazon Athena, Pentaho Data Integration (Kettle), AWS Glue, Impala, Oracle Business Intelligence Enterprise Edition 11g (OBIEE), Tableau, Hyperion, Redash, Boto 3, Ansible, Looker, Informatica PowerCenter

Paradigms

ETL, Business Intelligence (BI), Management, Database Design, Testing

Platforms

Oracle, Databricks, Azure, Linux, Apache Pig, Pentaho, Unix, AWS Lambda, Amazon Web Services (AWS)

Storage

PostgreSQL, Apache Hive, Databases, Oracle PL/SQL, Redshift, Microsoft SQL Server, MySQL, PL/SQL, Amazon DynamoDB, Amazon S3 (AWS S3), Sybase, Kdb+, Amazon Aurora, Cassandra, Essbase, Data Lakes, Data Pipelines

Other

Data Warehousing, Data Architecture, Leadership, Team Mentoring, Technology Strategy & Architecture, Big Data, Software Development, Fivetran, Data Warehouse Design, Snowpark, Informatica, Oracle EBS, Relational Database Services (RDS), APIs, perlpod, Unix Shell Scripting, MSAS, Cognos 10, Data Build Tool (dbt), Google BigQuery, Security, Deployment, Data Modeling

Libraries/APIs

Luigi, RAPIDS, PySpark

Education

2016 - 2016

Certificate of Completion in Data Science and Engineering with Apache Spark

UC BerkeleyX (Online Courses from Berkeley) - Berkeley, California (USA)

2012 - 2012

Certificate of Completion in Cloudera Developer Training for Apache Hadoop

Cloudera University - New York, New York (USA)

1995 - 1995

Certificate of Completion in Oracle Database Administration

UCI Extension - Irvine, California (USA)

1975 - 1980

Diploma (Master of Science equivalent) in Applied Mathematics

Odessa I.I. Mechnikov University - Odessa, Ukraine

Certifications

JUNE 2023 - PRESENT

Databricks Certified Data Engineer Professional

Databricks
