Satish Basetty

Database Developer in Los Angeles, CA, United States

Member since November 12, 2020
Satish is a senior data engineer with over 14 years of experience in database and data warehouse projects, both on-premises and in the cloud. He is an expert in designing and developing ETL pipelines using Python, SQL, and Bash on AWS Glue and Apache Airflow. Satish automated royalty and copyright data processing for Universal Music Group, and he has delivered solutions spanning reports and visualizations, real-time data processing, migrations, and performance tuning.

Portfolio

  • Kitchen United
    Data Marts, Data Lakes, AWS Athena, PostgreSQL, AWS Glue, Python 3
  • FabFitFun
    Data Marts, Qualtrics, Bash Scripting, PostgreSQL, AWS Spectrum, AWS EC2...
  • Machinima
    Redshift, Bash Scripting, PostgreSQL, Pentaho, Python 3

Experience

  • SQL 8 years
  • ETL 7 years
  • PostgreSQL 5 years
  • Query Optimization 4 years
  • ETL Pipeline 3 years

Location

Los Angeles, CA, United States

Availability

Part-time

Preferred Environment

Slack, PyCharm, Windows, Linux, macOS

The most amazing...

...solution I've provided delivered a highly scalable daily sales reporting process.

Employment

  • Senior Data Engineer

    2020 - PRESENT
    Kitchen United
    • Developed a daily reporting process that sends reports to members. The process ingests the data into the data lake, then a "send email" step delivers the reporting emails to all members.
    • Developed the ETL pipeline to ingest the purchase data into the data lake. Created the batch job using PySpark to load the third-party sales data into the data mart.
    • Designed and developed the data mart that provides insights and visualization.
    • Automated the process for onboarding and offboarding members.
    Technologies: Data Marts, Data Lakes, AWS Athena, PostgreSQL, AWS Glue, Python 3
  • Senior Data Engineer

    2018 - 2019
    FabFitFun
    • Designed a data mart to track sales, CPA, and churn across various sales channels, and provided a solution for automated A/B testing.
    • Developed the ETL pipeline to ingest data on add-on purchases and seasonal box deliveries for members across FabFitFun.
    • Developed the ETL pipeline for survey data ingestion.
    • Designed and developed the style data mart that provides visualizations across top-selling SKUs.
    Technologies: Data Marts, Qualtrics, Bash Scripting, PostgreSQL, AWS Spectrum, AWS EC2, Python 3, Apache Airflow
  • Senior Data Engineer

    2017 - 2018
    Machinima
    • Developed a process that provides video data insights.
    • Designed and developed the data mart that provides visualizations of the best-performing videos across channels.
    • Configured the Goofys file system used as a primary source/target for most of the ELT/ETL process.
    Technologies: Redshift, Bash Scripting, PostgreSQL, Pentaho, Python 3
  • Data Engineer

    2015 - 2017
    PennyMac
    • Gathered requirements and completed data analysis, design, and development of the ELT/ETL process using Pentaho and Python.
    • Designed a data lake on AWS for various processes, with data ingestion into the Redshift and Snowflake data warehouses. Worked with stakeholders to resolve issues and complete requirements.
    • Oversaw performance tuning of the queries and provided operations support.
    Technologies: Snowflake, Python 2, Pentaho, PostgreSQL
  • Senior Database Developer

    2014 - 2015
    BeachMint
    • Designed and developed ELT/ETL processes using Python.
    • Designed a sales data mart and wrote the complex queries behind it.
    • Oversaw performance tuning of queries.
    Technologies: Redshift, Bash, Python 2, PostgreSQL
  • Senior Developer

    2013 - 2014
    Bank of America
    • Designed and developed the ETL process. Collaborated with stakeholders to resolve issues and clarify requirements.
    • Designed the order data mart and loaded the data using Pentaho ETL and SQL.
    • Managed the performance tuning of the queries.
    Technologies: Oracle, PostgreSQL, Python 2
  • Database Developer

    2011 - 2013
    Universal Music Group
    • Developed ETL processes using Oracle PL/SQL to extract the legacy data and load it into the data mart.
    • Oversaw the performance tuning of complex queries. Gathered requirements from end-users and designed the data mart for royalties and copyrights.
    • Performed data analysis for royalties and copyrights. Created an automation process for processing the data.
    Technologies: Bash Scripting, Oracle PL/SQL
  • ETL Developer

    2007 - 2010
    Prokarma
    • Oversaw the data migration project from the legacy system to SAP.
    • Developed the ETL process to handle the vehicle data.
    • Collaborated with stakeholders on requirements gathering. Performed data analysis.
    Technologies: SAP FICO, Shell, Oracle PL/SQL
  • Senior Developer

    2006 - 2007
    RapidIgm Consulting
    • Developed an ETL process to integrate data from various sources. Performed analysis on the Rx and DDD data.
    • Designed the sales data mart and assisted with complex queries and performance tuning.
    • Collaborated with stakeholders to gather requirements and develop the data modeling.
    Technologies: SQL, Shell, Oracle
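
The daily sales rollup behind the Kitchen United reporting process above can be sketched in a few lines. This is an illustration only: the record layout (`location`, `order_date`, `amount`) is a hypothetical schema, not the actual pipeline's, and the real job ran at data-lake scale on AWS Glue rather than in memory.

```python
from collections import defaultdict
from datetime import date

def daily_sales_by_location(orders):
    """Aggregate raw order events into per-location daily totals.

    A minimal stand-in for the batch aggregation step; the real
    pipeline loaded this rollup into the data mart for reporting.
    """
    totals = defaultdict(float)
    for order in orders:
        key = (order["location"], order["order_date"])
        totals[key] += order["amount"]
    return dict(totals)

# Hypothetical sample events for illustration.
orders = [
    {"location": "Pasadena", "order_date": date(2020, 6, 1), "amount": 24.50},
    {"location": "Pasadena", "order_date": date(2020, 6, 1), "amount": 10.00},
    {"location": "Chicago",  "order_date": date(2020, 6, 1), "amount": 18.75},
]
print(daily_sales_by_location(orders))
```

In the actual process, the output of a rollup like this would feed the "send email" step that distributes the daily reports to members.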

Experience

  • Sales Data Ingestion (Development)

    I architected and developed a data pipeline to ingest sales data into the data mart for Kitchen United. The process tracks sales performance and the supply chain across several kitchen centers and provides members with near real-time Menumix data insights. The pipeline ingests purchase and menu events into the data lake and also tracks daily sales by location and member.

    I collaborated with the finance, marketing, data science, and BI teams and provided solutions accordingly. I helped build the data modeling that enabled the BI team to create reports and dashboards. I also created a reconciliation process to keep track of the orders, CloudWatch alerts, error reporting, and an outbound process to various third-party vendors.

  • Gaming/Video Data Ingestion for Machinima (Development)

    Assisted in the development of an end-to-end process for gaming and video data ingestion. The real-time process ingests the live gaming data into the data lake.

    I designed the data mart to track insights at the video-ID grain from various channels and collaborated with the finance, email marketing, and BI teams. I developed a process to ingest the sentiment data events into the data mart and configured the Goofys file system used as the primary source/target for most of the ELT/ETL process.

  • Sentiment Data Analysis (Development)

    Developed an ETL pipeline to ingest the survey data into the data lake and created a data mart for the sentiment data analysis. I collaborated with business users and designed the database views for the reporting.
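
The extract/transform/load shape of the survey pipeline above can be sketched as below. The column names and the 1-5 score scale are assumptions for illustration; the real pipeline ingested Qualtrics survey data into a data lake under Airflow orchestration.

```python
def extract(raw_rows):
    """Parse raw CSV-style survey rows into dicts (hypothetical layout)."""
    fields = ("respondent", "question", "score")
    return [dict(zip(fields, row.split(","))) for row in raw_rows]

def transform(rows):
    """Drop incomplete responses and normalize assumed 1-5 scores to [0, 1]."""
    clean = []
    for row in rows:
        if not row.get("score"):
            continue  # skip unanswered questions
        row["score"] = (int(row["score"]) - 1) / 4
        clean.append(row)
    return clean

def load(rows, mart):
    """Append transformed rows to an in-memory stand-in for the data mart."""
    mart.extend(rows)
    return len(rows)

mart = []
raw = ["r1,q1,5", "r1,q2,3", "r2,q1,"]  # last row is incomplete
load(transform(extract(raw)), mart)
print(mart)
```

Each stage maps to a task in the actual pipeline; normalized scores like these are what a sentiment data mart would expose to reporting views.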

Skills

  • Languages

    SQL, Python, Snowflake, Python 3, Python 2
  • Tools

    AWS Glue, Apache Airflow, AWS Athena, PyCharm, Slack
  • Paradigms

    ETL, Business Intelligence (BI)
  • Platforms

    Azure, AWS Lambda
  • Storage

    PostgreSQL, Redshift
  • Other

    Data Warehousing, Query Optimization, Data Warehouse Design, ETL Pipeline, Indexing
  • Libraries/APIs

    PySpark, Scikit-learn

Education

  • Master's degree in Computer Science
    2002 - 2005
    Texas A&M University - College Station, Texas, USA
