Harish Mittal, Data Engineer and Developer in Tampa, FL, United States
Member since February 24, 2022
Harish is an expert in data engineering transformation projects with 20+ years of experience architecting and implementing data-related enterprise systems in finance, telecommunications, and health insurance domains. He excels in Python, C++, Java, AWS, Oracle, Redshift, and Snowflake. With leadership and consulting experience at PennyMac, Citi, and Verizon, Harish is well-versed in managing all aspects of enterprise-level projects, including architecture, budgeting, planning, and delivery.
Portfolio

  • Freelance Client
    Amazon Web Services (AWS), Amazon Athena, AWS Lambda, AWS Glue, Redshift...
  • Buzzclan
    AWS Lambda, Amazon Athena, SQL, Pandas, Stock Market...
  • PennyMac
    Amazon Web Services (AWS), Amazon Athena, AWS Glue, AWS Lambda...

Experience

Location

Tampa, FL, United States

Availability

Full-time

Preferred Environment

Amazon Web Services (AWS), Linux, Data, Python 3, Data Engineering

The most amazing...

...work I've done was designing and developing an application that uses real-time stock market data to perform algorithmic trading on the Interactive Brokers platform.

Employment

  • Senior Data Engineering Consultant

    2022 - PRESENT
    Freelance Client
• Worked as the lead data migration consultant to migrate data from an on-premises Microsoft SQL Server data warehouse to AWS Redshift.
    • Led the initial discovery phase with customers and designed an end-to-end (E2E) approach to migrate the data warehouse and business intelligence stack from on-premises SQL Server to AWS Redshift using various AWS technologies.
    • Wrote Python code in AWS Lambda and Glue and created scheduled jobs using EventBridge.
    Technologies: Amazon Web Services (AWS), Amazon Athena, AWS Lambda, AWS Glue, Redshift, AWS RDS
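The Lambda-based load step described above could be sketched roughly as follows, assuming the SQL Server extracts already land in S3 and using the Redshift Data API via boto3. The bucket, table, cluster, and IAM role names are illustrative placeholders, not details from the engagement.

```python
def build_copy_statement(table: str, bucket: str, key: str, iam_role: str) -> str:
    """Assemble a Redshift COPY statement for a CSV extract staged in S3."""
    return (
        f"COPY {table} "
        f"FROM 's3://{bucket}/{key}' "
        f"IAM_ROLE '{iam_role}' "
        "FORMAT AS CSV IGNOREHEADER 1"
    )

def lambda_handler(event, context):
    """EventBridge-scheduled entry point: submit the COPY to Redshift.

    The event is assumed to carry the S3 object key of the day's extract;
    the cluster, database, and role identifiers below are placeholders.
    """
    import boto3  # imported lazily so build_copy_statement can be tested without the AWS SDK

    sql = build_copy_statement(
        table="staging.orders",
        bucket="dw-migration-landing",
        key=event["s3_key"],
        iam_role="arn:aws:iam::123456789012:role/redshift-copy",
    )
    client = boto3.client("redshift-data")
    resp = client.execute_statement(
        ClusterIdentifier="analytics-cluster",
        Database="dw",
        DbUser="etl_user",
        Sql=sql,
    )
    return {"statement_id": resp["Id"]}
```

An EventBridge rule with a cron schedule would then invoke this handler nightly, passing the extract's key in the event payload.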
  • Consultant

    2019 - PRESENT
    Buzzclan
• Designed and developed an application for stock market data analysis and developed strategies to find bullish and bearish trends in stocks and ETFs using TA-Lib indicators.
    • Built the ETL pipelines to collect tick data (per day, four hours, and five seconds) and fundamentals for over 3,000 stocks, ETFs, futures, and SPACs using many APIs and web scrapers.
• Transformed data and calculated technical indicators, such as EMA crossover, stochastic, RSI, CCI, AO, candles, and Fibonacci levels.
    • Integrated Cathie Wood's ARK ETF daily trade data and developed views to quickly see ARK holdings and action trends.
    • Made the entire dataset and research available through a website to help users make investment decisions.
• Developed a trading app to consume real-time tick data (OHLCV) and place trades via the IB API on the Interactive Brokers platform. Also developed and tested strategies using backtests.
    Technologies: AWS Lambda, Amazon Athena, SQL, Pandas, Stock Market, Algorithmic Trading Analysis, Python, Interactive Brokers API, Stock Trading, Futures & Options
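The bullish/bearish trend detection above can be illustrated with a minimal pandas sketch of one of the listed indicators, a fast/slow EMA crossover on closing prices. The spans and the synthetic price series are illustrative assumptions, not the strategies actually deployed.

```python
import pandas as pd

def ema_crossover_signals(close: pd.Series, fast: int = 12, slow: int = 26) -> pd.Series:
    """Return +1 on the bar where the fast EMA crosses above the slow EMA
    (bullish), -1 where it crosses below (bearish), and 0 otherwise."""
    ema_fast = close.ewm(span=fast, adjust=False).mean()
    ema_slow = close.ewm(span=slow, adjust=False).mean()
    above = ema_fast > ema_slow
    cross = above.astype(int).diff().fillna(0)
    return cross.astype(int)

# Synthetic price path: a downtrend followed by an uptrend
prices = pd.Series([100 - i for i in range(30)] + [70 + 2 * i for i in range(30)])
signals = ema_crossover_signals(prices)
print(signals[signals != 0])  # the bars where a crossover occurred
```

In a real pipeline this would run per symbol over the collected tick data, with the nonzero rows feeding the trend views on the website.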
  • Consultant

    2021 - 2022
    PennyMac
    • Led the implementation of a cloud data warehouse using AWS and Snowflake technologies.
    • Built ETL pipelines for mortgage loan campaigns, advertising, and marketing from Google, Facebook, Salesforce, Blend, Leadspedia, and more.
    • Fetched data using APIs and AppFlow to AWS S3, cleansed the data using Lambda, and pushed it to Snowflake using stored procedures and external tables.
    Technologies: Amazon Web Services (AWS), Amazon Athena, AWS Glue, AWS Lambda, Data Engineering, Linux, Snowflake, Python
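The Lambda cleansing step in the pipeline above might look roughly like this sketch, which normalizes marketing-lead records before they are staged for Snowflake. The field names are hypothetical, and the handler takes records from the event payload to keep the example self-contained, whereas the real pipeline read them from S3.

```python
import json

def cleanse_record(raw: dict) -> dict:
    """Normalize one record before loading to Snowflake:
    snake_case keys, trimmed strings, empty strings mapped to None."""
    cleaned = {}
    for key, value in raw.items():
        k = key.strip().lower().replace(" ", "_")
        if isinstance(value, str):
            value = value.strip() or None
        cleaned[k] = value
    return cleaned

def lambda_handler(event, context):
    """Cleanse a batch of records delivered in the event payload.

    In the real pipeline the output would be written back to S3 and picked
    up by Snowflake via external tables and stored procedures.
    """
    records = [cleanse_record(r) for r in event["records"]]
    return {"body": json.dumps(records)}
```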
  • Senior Vice President

    2011 - 2020
    Citi
    • Managed ETL and BI teams (25 members) to re-engineer a legacy HR data warehouse and business intelligence platform.
    • Delivered a new in-house human capital management data warehouse platform that consolidated all HR verticals under one system with centralized data security, fast time to market, and advanced analytics using MicroStrategy BI.
• Developed the data migration strategy and measurable acceptance criteria, and implemented frameworks and tools to track progress toward go-live.
    • Architected the data validation framework, which helped reduce the testing cycle from two weeks to one day.
• Designed the data reconciliation framework used to compare more than 400 batch feeds, which resulted in $1.2 million in savings over two years.
    Technologies: Data Warehousing, Data Analytics, Data Transformation, Reporting, Business Intelligence (BI), IT Management
  • Tech Lead

    2000 - 2011
    Verizon
    • Designed, developed, and managed the operations support system (OSS) to support Verizon's high-speed internet services.
    • Wrote C++ code on Unix to process orders and provision services.
    • Developed the Java code to take orders and manage inventory of telecom equipment.
• Built custom data migration (ETL) modules using Java, JDBC, SQL, and Oracle and migrated data from the many companies Verizon acquired.
    Technologies: Oracle, Unix, C++, Shell Scripting, Perl, Workflow, Java 8

Experience

  • Algorithmic Trading, OHLCV Data Processing, Strategy Development, and Backtesting
    https://mystocks.mittals.info/

I developed a Python application to collect real-time streaming tick data (OHLCV), perform TA calculations, and place automated trades using the IB API on the Interactive Brokers platform. I also developed multiple strategies and ran multiple backtests using 1-minute bar data, which resulted in $400,000+ profit when traded with a single ES futures contract.

    KEY ACTIVITIES
• Developed a batch application to source daily tick data for 3,000+ stocks and detect bullish and bearish trends using TA-Lib indicators.
    • Processed data to calculate technical indicators, such as EMA crossover, stochastic, RSI, CCI, AO, candle types, Fibonacci levels, and 52-week high or low, monthly, and weekly performance for the batch application.
    • Visualized data using Jupyter Notebook with Matplotlib to develop strategies and perform backtesting.
    • Integrated Cathie Wood's ARK ETF daily trading data.
    • Made the entire data set and research available via a website to help users make investment decisions.
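The backtesting step above can be sketched as a toy long/flat EMA-crossover backtest over 1-minute closes, trading one contract with the ES $50-per-point multiplier. The spans and entry/exit rules are illustrative assumptions, not the strategies that produced the results described.

```python
def backtest_crossover(closes, fast=9, slow=21, multiplier=50.0):
    """Toy long/flat backtest over 1-minute closes: go long one contract
    when the fast EMA crosses above the slow EMA, exit on the cross below.
    Returns total P&L in dollars (ES-style $50-per-point multiplier)."""
    def ema_update(prev, price, span):
        alpha = 2.0 / (span + 1)
        return price if prev is None else alpha * price + (1 - alpha) * prev

    ema_f = ema_s = None
    entry = None
    prev_above = None
    pnl = 0.0
    for price in closes:
        ema_f = ema_update(ema_f, price, fast)
        ema_s = ema_update(ema_s, price, slow)
        above = ema_f > ema_s
        if prev_above is not None:
            if above and not prev_above and entry is None:
                entry = price                              # bullish cross: enter long
            elif not above and prev_above and entry is not None:
                pnl += (price - entry) * multiplier        # bearish cross: exit
                entry = None
        prev_above = above
    if entry is not None:                                  # mark any open position to the last close
        pnl += (closes[-1] - entry) * multiplier
    return pnl
```

A production backtest would also model commissions, slippage, and short entries; this sketch only captures the signal-to-P&L loop.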

Skills

  • Languages

Python 3, SQL, C++, Java, Snowflake, Perl
  • Tools

    Amazon Athena, AWS Glue, Git
  • Platforms

    AWS Lambda, Linux, Oracle, Unix, Amazon Web Services (AWS)
  • Storage

    Amazon S3 (AWS S3), Data Pipelines, MySQL, Redshift
  • Other

    IT Management, Data Engineering, Computer Science, Data Warehousing, Data Analytics, Data Transformation, Reporting, Stock Trading, Stock Market, Stock Price Analysis, Quantitative Analysis, Algorithmic Trading Analysis, Shell Scripting, Workflow, Amazon Route 53, API Gateways, Cloud, AWS RDS, Futures & Options
  • Libraries/APIs

    Pandas, Interactive Brokers API
  • Paradigms

    Business Intelligence (BI), ETL

Education

  • Bachelor's Degree in Computer Science
    1990 - 1994
    Nagpur University - Nagpur, Maharashtra, India

Certifications

  • AWS Certified Data Analytics Specialty
    NOVEMBER 2021 - NOVEMBER 2024
    AWS
  • AWS Certified Solutions Architect Associate
    MAY 2020 - MAY 2023
    AWS
