Senior Data Engineer | Kitchen United | 2020 - Present
Technologies: Data Marts, Data Lakes, AWS Athena, PostgreSQL, AWS Glue, Python 3, ETL Pipelines, Google Cloud, Docker, Apache Beam, Business Intelligence (BI)
- Created a daily reporting process for members: the job ingests data into the data lake, then an email step sends the reports to all members.
- Developed the ETL pipeline to ingest purchase data into the data lake, and built a batch job using PySpark and Apache Beam to load third-party sales data.
- Designed and developed the data mart that provides insights and visualizations.
- Automated the process for onboarding and offboarding members.
Senior Data Engineer | FabFitFun | 2018 - 2019
Technologies: Data Marts, Qualtrics, Bash, PostgreSQL, AWS Spectrum, Amazon EC2, Python 3, Apache Airflow, ETL Pipelines, Apache Beam, Business Intelligence (BI)
- Designed a data mart to track sales, CPA, and churn across various sales channels; provided a solution for automated A/B testing.
- Developed the ETL pipeline to ingest data on add-on purchases and seasonal box deliveries to members across FabFitFun.
- Developed the ETL pipeline for survey data ingestion.
- Designed and developed the style data mart that provides visualizations of top-selling SKUs.
Senior Data Engineer | Machinima | 2017 - 2018
Technologies: Redshift, Bash, PostgreSQL, Pentaho, Python 3, ETL Pipelines, Business Intelligence (BI)
- Developed a process that provides video data insights.
- Designed and developed the data mart that provides visualizations on the best-performing videos across channels.
- Configured the Goofys file system as the primary source/target for most ELT/ETL processes.
Data Engineer | PennyMac | 2015 - 2017
Technologies: Snowflake, Python 2, Pentaho, PostgreSQL
- Gathered requirements and completed data analysis, design, and development of the ELT/ETL process using Pentaho and Python.
- Designed a data lake on AWS for various processes, with data ingestion into the Redshift and Snowflake data warehouses. Worked with stakeholders to resolve issues and complete requirements.
- Oversaw performance tuning of the queries and provided operations support.
Senior Database Developer | BeachMint | 2014 - 2015
Technologies: Redshift, Bash, Python 2, PostgreSQL
- Designed and developed ELT/ETL processes using Python.
- Designed a sales data mart and developed the complex queries behind it.
- Oversaw performance tuning of queries.
Senior Developer | Bank of America | 2013 - 2014
Technologies: Oracle, PostgreSQL, Python 2
- Designed and developed the ETL process. Collaborated with stakeholders to resolve issues and clarify requirements.
- Designed the order data mart and loaded the data using Pentaho ETL and SQL.
- Managed the performance tuning of the queries.
Database Developer | Universal Music Group | 2011 - 2013
Technologies: Bash, Oracle PL/SQL
- Developed ETL processes using Oracle PL/SQL to extract legacy data and load it into the data mart.
- Oversaw the performance tuning of complex queries. Gathered requirements from end-users and designed the data mart for royalties and copyrights.
- Performed data analysis for royalties and copyrights. Created an automation process for processing the data.
ETL Developer | Prokarma | 2007 - 2010
Technologies: SAP FICO, Shell, Oracle PL/SQL
- Oversaw the data migration project from the legacy system to SAP.
- Developed the ETL process to handle vehicle data.
- Collaborated with stakeholders on requirements gathering. Performed data analysis.
Senior Developer | RapidIgm Consulting | 2006 - 2007
Technologies: SQL, Shell, Oracle
- Developed an ETL process to integrate data from various sources. Performed analysis on the Rx and DDD data.
- Designed the sales data mart and assisted with complex queries and performance tuning.
- Collaborated with stakeholders to gather requirements and develop the data models.