Megha Mittal, Developer in Bengaluru, Karnataka, India

Megha Mittal

Verified Expert in Engineering

Bio

Megha has more than five years of development experience with various Toptal clients and firms like Goldman Sachs and Amazon. She specializes in back-end platforms and large-scale real-time systems. She also has experience leading and managing teams. Megha is well-versed in relational and non-relational databases, application servers, microservice architecture, REST APIs, Android, and AWS frameworks. She is thorough with her work and passionate about developing maintainable systems.

Experience

  • Java - 4 years
  • Spring - 4 years
  • AWS Lambda - 3 years
  • Amazon DynamoDB - 3 years
  • REST - 2 years
  • Elasticsearch - 1 year
  • MySQL - 1 year
  • Kotlin - 1 year

Availability

Part-time

Preferred Environment

Java, MySQL, Spark, Kotlin, Amazon Web Services (AWS), Spring Boot, Android, ETL, Python

The most amazing...

...thing I've developed is a rule-execution engine that streamlined microservice creation and management, cutting developer hours by almost 50 percent.

Work Experience

Engineering Lead

2021 - 2022
Agricarbon UK Ltd
  • Acted as the software engineering lead and managed the design and architecture of the entire front-end and back-end stack.
  • Hired and mentored junior developers for various projects, including an Android app, a website, ETL workflows, and a back-end server.
  • Built a back-end server using Python and Flask; hosted it on AWS Lambda and API Gateway; and integrated its APIs into a front-end website, an Android application, and other app scripts.
  • Created ETL pipelines in AWS Glue to transfer data from S3 onto Amazon Aurora RDS. Developed comprehensive post-processing frameworks in SQL to clean the data and make it ready for analysis.
  • Developed an Android application to help field operators collect soil cores from specified locations. The app featured dynamic project assignment, GPS and NFC integration, and upload/download from a cloud database.
  • Created a web front end in React and TypeScript from scratch.
  • Created and maintained the firm's entire infrastructure on AWS, including services like Amazon RDS, DynamoDB, Lambda, API Gateway, EC2, AWS DMS, SQS, SNS, S3, Glue, and VPC.
  • Set up continuous DB replication (CDC) from an onsite database to Amazon Aurora RDS serverless inside a VPC.
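
The CDC replication in the last point was configured with AWS DMS. Purely as an illustration, here is a boto3 sketch of the kind of replication task involved; the ARNs, identifiers, and table mappings are placeholders, and the real setup also covered endpoints, the replication instance, and VPC networking.

    # Illustrative AWS DMS task definition for ongoing replication (CDC) from an
    # on-site database into Aurora. All ARNs and identifiers are placeholders.
    import json

    import boto3

    dms = boto3.client("dms")

    table_mappings = {
        "rules": [
            {
                "rule-type": "selection",
                "rule-id": "1",
                "rule-name": "include-all",
                "object-locator": {"schema-name": "%", "table-name": "%"},
                "rule-action": "include",
            }
        ]
    }

    response = dms.create_replication_task(
        ReplicationTaskIdentifier="onsite-to-aurora-cdc",
        SourceEndpointArn="arn:aws:dms:...:endpoint/source-placeholder",
        TargetEndpointArn="arn:aws:dms:...:endpoint/target-placeholder",
        ReplicationInstanceArn="arn:aws:dms:...:rep/instance-placeholder",
        MigrationType="full-load-and-cdc",   # initial load, then ongoing changes
        TableMappings=json.dumps(table_mappings),
    )
    print(response["ReplicationTask"]["Status"])
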
Technologies: Amazon S3 (AWS S3), Amazon DynamoDB, Amazon Web Services (AWS), AWS Amplify, AWS AppSync, AWS Lambda, JavaScript, Next.js, TypeScript, Amazon Aurora, Java, Android, Amazon API Gateway, Amazon EC2, Python 3, Flask, JSON Web Tokens (JWT), CDC, Amazon Virtual Private Cloud (VPC), AWS Glue, ETL, MySQL, Software Design, Leadership, KNIME, Data Analysis, Data Analytics, Technical Architecture, React, Microservices Architecture, Serverless, Python, Back-end Architecture, Cloud, Mobile, Architecture, Database Design, Cloud Architecture, Amazon RDS, Serverless Framework, Near-field Communication (NFC), Mobile Development, Docker, Scalability, AWS Elastic Beanstalk, Solution Architecture, DevOps, System Architecture, Web Development, Data Engineering, Project Estimation, Project Management

Software Development Engineer 2

2019 - 2020
Amazon India
  • Worked as the team lead for the India Rewards team, contributing to project planning, system design, and mentoring of junior developers in addition to back-end development.
  • Designed and implemented various components of a rule-execution platform, including an Ion data configuration store and a permission control system to allow for safe data sharing between clients.
  • Designed and implemented a low-latency data storage and retrieval system using DynamoDB and Elasticsearch, which involved designing the Elasticsearch query criteria to minimize runtime complexity.
  • Reduced p90 latency of the rule-execution platform by 40% by examining the system for bottlenecks using load testing and implementing an on-box Guava cache.
  • Wrote multiple tech papers detailing our project designs and gave Samurai talks and brown bag sessions about the usage of native AWS technologies.
  • Performed code and design reviews for multiple teams.
Technologies: Java, Kotlin, Guice, Amazon Ion, Mockito, Amazon DynamoDB, Elasticsearch, Google Guava, Amazon Web Services (AWS), APIs, Databases, Integration, Debugging, Spring, Distributed Systems, NoSQL, Optimization, Spring Boot, REST APIs, AWS DevOps, Back-end, Architecture, Kibana, Full-stack, CI/CD Pipelines, Third-party APIs, Design Patterns, Data Structures, Computer Science, Amazon S3 (AWS S3), AWS Lambda, Team Mentoring, Software Design, Leadership, Back-end Architecture, Cloud, Database Design, Cloud Architecture, Amazon RDS, Serverless Framework, Scalability, DevOps, Spring Microservice, System Architecture, Web Development, Data Engineering, Technical Architecture, Project Estimation

Software Development Engineer

2018 - 2019
Amazon India
  • Worked on the improvement and maintenance of the gift card side of Amazon Pay, which handled thousands of transactions per second (TPS) for customers using Amazon Pay balance across India.
  • Developed multiple features to manage scale and provide high availability to clients, such as separating the stack for read-intensive operations and adding dynamic client throttling.
  • Used native AWS technologies like Lambda, CloudAuth, and API Gateway to implement a client-facing REST API that enabled any external-to-Amazon client to claim gift cards directly into customer accounts.
  • Designed a non-obtrusive solution to extend the data warehousing pipeline to multiple regions using Amazon LPT, driven by compliance requirements pertaining to payment data.
  • Performed code reviews and mentored interns and junior developers.
Technologies: Java, JavaScript, SQL, Spring, Mockito, Amazon DynamoDB, Oracle SQL, AWS Cloud Computing Services, Amazon Virtual Private Cloud (VPC), AWS Data Pipeline Service, AWS Lambda, Amazon EC2, AWS CloudFormation, Perl, Redshift, REST, Git, Microservices, Amazon Web Services (AWS), NoSQL, Test-driven Development (TDD), APIs, Databases, Integration, Debugging, Distributed Systems, Spring Boot, Amazon API, REST APIs, OAuth 2, AWS DevOps, Back-end, Architecture, Data Pipelines, Full-stack, CI/CD Pipelines, Third-party APIs, Design Patterns, Data Structures, Computer Science, Amazon S3 (AWS S3), Team Mentoring, Back-end Architecture, Cloud, Serverless Framework, Spring Microservice

Software Development Engineer

2017 - 2018
Amazon India
  • Redesigned a data warehousing solution, cutting the team's operational burden by 25%. This involved talking to multiple stakeholders and understanding the technical trade-offs between possible solutions.
  • Designed and implemented an Amazon retail feature allowing customers to schedule an e-GiftCard for future delivery. This involved collaborating with multiple teams and making changes in multiple microservices.
  • Worked on operational maintenance and increasing the system's robustness by creating automated test pipelines, remodeling client error flows, refactoring the code for better logging, and creating monitors and dashboards.
  • Refactored a network of six microservices to facilitate a data model change to accommodate the updated data representation from a newer UI component.
Technologies: Java, JavaScript, SQL, Spring, Mockito, Perl, Redshift, Git, Microservices, APIs, Databases, Distributed Systems, Amazon Web Services (AWS), NoSQL, AWS DevOps, Back-end, Full-stack, CI/CD Pipelines, Design Patterns, Data Structures, Computer Science, Amazon S3 (AWS S3), AWS Lambda, Back-end Architecture, Cloud, Serverless Framework, Spring Microservice

Analyst

2016 - 2017
Goldman Sachs
  • Developed a user-requested data pruning feature for a distributed data lake resulting in a 25% reduction in virtual warehouse storage requirements.
  • Developed a Spark refiner to export data from Kafka Streams into Sybase IQ virtual warehouses.
  • Migrated an operations risk assessment platform to newer technologies to improve usability and maintainability, cutting the operations load by almost 60%.
  • Maintained and improved ETL jobs written to report daily financial data by understanding business requirements and identifying bottlenecks.
Technologies: Java, Sybase, Spark, Subversion (SVN), Slang, JSI, Spring, Swagger, REST, Hadoop, APIs, Databases, Integration, SQL, Debugging, PostgreSQL, Apache Kafka, Jenkins, REST APIs, Back-end, Kafka Streams, Apache Spark, Big Data, Data Structures, Computer Science, Fintech, Back-end Architecture

Projects

Back-end Web Server

I built a back-end server from scratch using Python and Flask, hosted on AWS Lambda and API Gateway. Its APIs were integrated into a front-end website, an Android application, and other app scripts. I added JWT authentication and role-based access control to all the APIs.
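
Below is a minimal sketch of the JWT-plus-RBAC pattern described above, using Flask and PyJWT. The route, role names, and secret handling are illustrative assumptions rather than the production code.

    # Minimal sketch of JWT auth with role-based access control in Flask.
    # Route names, roles, and the secret are illustrative assumptions.
    from functools import wraps

    import jwt  # PyJWT
    from flask import Flask, jsonify, request

    app = Flask(__name__)
    SECRET = "replace-with-a-real-secret"  # in practice, loaded from configuration

    def require_role(*allowed_roles):
        """Reject the request unless it carries a valid JWT with an allowed role."""
        def decorator(view):
            @wraps(view)
            def wrapper(*args, **kwargs):
                auth = request.headers.get("Authorization", "")
                if not auth.startswith("Bearer "):
                    return jsonify(error="missing token"), 401
                try:
                    claims = jwt.decode(auth[len("Bearer "):], SECRET, algorithms=["HS256"])
                except jwt.InvalidTokenError:
                    return jsonify(error="invalid token"), 401
                if claims.get("role") not in allowed_roles:
                    return jsonify(error="forbidden"), 403
                return view(*args, **kwargs)
            return wrapper
        return decorator

    @app.route("/projects", methods=["GET"])
    @require_role("admin", "operator")
    def list_projects():
        # A protected endpoint; the real API talked to the database layer.
        return jsonify(projects=[])

    if __name__ == "__main__":
        app.run()

On Lambda, a WSGI adapter or framework (for example, a serverless wrapper around the Flask app) would sit between API Gateway and these routes; the decorator itself is unchanged.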

Rule Execution Platform for Amazon

An Ion-based rule-execution platform that streamlined microservice creation within Amazon. I worked as a full-stack developer, designing and implementing:
* An Ion data storage and retrieval solution using DynamoDB and Elasticsearch (illustrated in the sketch after this list)
* A UI portal backed by REST APIs to allow clients to perform CRUD operations on all system resources
* An on-box caching solution
* A permission control system to allow for safe data sharing between clients
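
The production platform was built in Java and Kotlin; purely as an illustration of the storage-and-retrieval pattern in the first bullet, here is a rough Python sketch using boto3 and elasticsearch-py 8.x, with hypothetical table, index, and field names.

    # Illustrative dual-write: persist the full rule document in DynamoDB,
    # index a searchable projection in Elasticsearch. Names are hypothetical.
    import boto3
    from elasticsearch import Elasticsearch

    dynamodb = boto3.resource("dynamodb")
    rules_table = dynamodb.Table("rules")          # assumed table name
    es = Elasticsearch("http://localhost:9200")    # assumed cluster endpoint

    def save_rule(rule_id: str, rule_document: dict) -> None:
        # DynamoDB is the source of truth for the full rule document.
        rules_table.put_item(Item={"ruleId": rule_id, "document": rule_document})
        # Elasticsearch holds only the fields needed to evaluate search criteria,
        # keeping query-time complexity low.
        es.index(
            index="rules",
            id=rule_id,
            document={
                "owner": rule_document.get("owner"),
                "eventType": rule_document.get("eventType"),
                "status": rule_document.get("status"),
            },
        )

    def find_rule_ids(event_type: str) -> list[str]:
        # Retrieval path: narrow candidates in Elasticsearch, then fetch the
        # full rules from DynamoDB by ID.
        hits = es.search(index="rules", query={"term": {"eventType": event_type}})
        return [hit["_id"] for hit in hits["hits"]["hits"]]
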

Messaging App for Android

https://github.com/meghamittal92/OtterApp
An Android app for non-instant messaging built in native Android. I worked as a full-stack developer, designing the front end using Material Design components and using a Back4App-hosted Parse Server as the back end.

Swift Claim API for Amazon

Used native AWS technologies like Lambda, CloudAuth, and API Gateway to implement a client-facing REST API, enabling any external-to-Amazon client to claim Amazon e-Gift cards directly into customer accounts.

* Set up the AWS CloudFormation stack to connect API Gateway to Amazon's internal network of microservices.

* Implemented a client library in Java that multiple external clients used to integrate with the API.

* Reduced average latency by almost 50% by solving the Lambda cold-start issue using AWS CloudWatch Events (see the sketch below).
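
The warm-up approach in the last point can be sketched as follows: a scheduled CloudWatch Events (EventBridge) rule periodically invokes the function, and the handler short-circuits those pings so only real requests do work. The handler below is an illustrative Python stand-in, not the original implementation.

    # Illustrative Lambda handler that short-circuits scheduled warm-up pings.
    # A CloudWatch Events/EventBridge rule (e.g., rate(5 minutes)) targets this
    # function so a container stays initialized between real requests.
    import json

    def handler(event, context):
        # Scheduled events arrive with source "aws.events"; treat them as warm-up pings.
        if event.get("source") == "aws.events":
            return {"statusCode": 200, "body": "warm"}

        # Real API Gateway request: perform the actual claim logic (omitted here).
        body = json.loads(event.get("body") or "{}")
        result = {"claimed": True, "requestId": body.get("requestId")}
        return {"statusCode": 200, "body": json.dumps(result)}
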

Data Pruning Solution for GS Data Lake

A client-requested data pruning feature for Sybase IQ-based virtual warehouses that verified the presence of data in the underlying HDFS before running a pruning job on the virtual warehouse. I worked as a back-end developer and liaised with the senior developers who designed the HDFS export feature to understand the system and implement the feature in a backward compatible and maintainable manner.
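
The core safeguard was checking HDFS before touching the virtual warehouse. A simplified Python sketch of that gate is below; the paths, table layout, and prune step are hypothetical placeholders.

    # Simplified gate: prune a virtual-warehouse partition only if the
    # corresponding data set is already present in HDFS.
    import subprocess

    def exists_in_hdfs(path: str) -> bool:
        # `hdfs dfs -test -e <path>` exits 0 when the path exists.
        result = subprocess.run(
            ["hdfs", "dfs", "-test", "-e", path],
            stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
        )
        return result.returncode == 0

    def prune_partition(business_date: str) -> None:
        hdfs_path = f"/data/lake/trades/business_date={business_date}"  # assumed layout
        if not exists_in_hdfs(hdfs_path):
            raise RuntimeError(f"Refusing to prune: {hdfs_path} not found in HDFS")
        # With the export verified, it is safe to drop the partition from the
        # Sybase IQ virtual warehouse (actual SQL/tooling omitted).
        print(f"Pruning virtual-warehouse partition for {business_date}")
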

Automated Trading App

https://github.com/meghamittal92/IBridgePy_Mac_Python27_64
An automated trading app in Python. I designed and implemented it using the Interactive Brokers API to buy and sell securities, combining multiple technical indicators, such as DMI and ADX, to reach a trading decision.
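
A condensed, pandas-based sketch of the DMI/ADX signal logic is below; the period, the ADX threshold of 25, and the simple buy/sell/hold decision are illustrative assumptions, and the repository contains the actual strategy.

    # Condensed DMI/ADX computation with pandas; the threshold and the
    # simple decision rule are illustrative, not the exact strategy.
    import pandas as pd

    def dmi_adx(df: pd.DataFrame, period: int = 14) -> pd.DataFrame:
        """df needs 'high', 'low', 'close' columns; returns DI+, DI-, and ADX."""
        up = df["high"].diff()
        down = -df["low"].diff()
        plus_dm = up.where((up > down) & (up > 0), 0.0)
        minus_dm = down.where((down > up) & (down > 0), 0.0)

        # True range: the largest of the three candidate ranges per bar.
        tr = pd.concat(
            [
                df["high"] - df["low"],
                (df["high"] - df["close"].shift()).abs(),
                (df["low"] - df["close"].shift()).abs(),
            ],
            axis=1,
        ).max(axis=1)

        # Wilder smoothing approximated with an exponential moving average.
        atr = tr.ewm(alpha=1 / period, adjust=False).mean()
        plus_di = 100 * plus_dm.ewm(alpha=1 / period, adjust=False).mean() / atr
        minus_di = 100 * minus_dm.ewm(alpha=1 / period, adjust=False).mean() / atr
        dx = 100 * (plus_di - minus_di).abs() / (plus_di + minus_di)
        adx = dx.ewm(alpha=1 / period, adjust=False).mean()
        return pd.DataFrame({"plus_di": plus_di, "minus_di": minus_di, "adx": adx})

    def signal(df: pd.DataFrame) -> str:
        ind = dmi_adx(df).iloc[-1]
        # Trade only when the trend is strong enough (ADX above an assumed threshold).
        if ind["adx"] > 25 and ind["plus_di"] > ind["minus_di"]:
            return "BUY"
        if ind["adx"] > 25 and ind["minus_di"] > ind["plus_di"]:
            return "SELL"
        return "HOLD"
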

Data Pipeline for Amazon

Redesigned a data warehousing pipeline, decreasing the team's operational burden by 70%. This involved:

* Talking to multiple stakeholders and understanding good-to-have vs. need-to-have requirements for the streaming data. For example, a small amount of missing data was tolerable, but duplicate data was not.

* Investigating multiple possible solutions and the technical trade-offs between them.

* Designing and implementing a new serverless solution using Fast Data Pipelines. This involved making a backward-compatible change in all six related microservices, apart from designing and implementing a new service.

* Running daily jobs with SQL queries to reconcile the data between the old and new solutions, then switching over once the correctness of the new solution was established (a simplified version of this check is sketched below).
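
A simplified version of the reconciliation check from the last step might look like the following, comparing daily row counts between the old and new warehouse tables; the psycopg2/Redshift connection details and table names are assumptions.

    # Simplified daily reconciliation between the old and new warehouse tables.
    # Connection parameters and table/column names are assumptions.
    import psycopg2

    QUERY = """
        SELECT
            (SELECT COUNT(*) FROM legacy_gift_card_events WHERE event_date = %s) AS old_count,
            (SELECT COUNT(*) FROM fdp_gift_card_events    WHERE event_date = %s) AS new_count
    """

    def reconcile(event_date: str) -> None:
        conn = psycopg2.connect(
            host="redshift-cluster.example.com", port=5439,
            dbname="warehouse", user="reconciler", password="placeholder",
        )
        try:
            with conn.cursor() as cur:
                cur.execute(QUERY, (event_date, event_date))
                old_count, new_count = cur.fetchone()
            if old_count != new_count:
                raise RuntimeError(
                    f"{event_date}: mismatch (old={old_count}, new={new_count})"
                )
            print(f"{event_date}: reconciled {new_count} rows")
        finally:
            conn.close()
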

Education

2013 - 2016

Master's Degree in Mathematics and Computer Science

Indian Institute Of Technology - Roorkee, India

2010 - 2013

Bachelor's Degree in Computer Science

University Of Delhi - Delhi, India

Libraries/APIs

REST APIs, Vue, Amazon API, Interactive Brokers API, NumPy, Pandas, AWS Amplify, React

Tools

Amazon Virtual Private Cloud (VPC), AWS CloudFormation, Git, Subversion (SVN), Gradle, Jenkins, Kafka Streams, Kibana, AWS AppSync, AWS Glue

Languages

Java, Python, JavaScript, SQL, TypeScript, Kotlin, Slang, Perl, HTML, CSS, C++, Python 3, SAML

Frameworks

Spring, Spring Boot, Serverless Framework, Mockito, Google Guice, Spring Microservice, Spark, Google Guava, Swagger, Hadoop, Guice, Android SDK, OAuth 2, Apache Spark, Next.js, Flask, JSON Web Tokens (JWT)

Paradigms

Back-end Architecture, Mobile Development, REST, Microservices, Design Patterns, Database Design, DevOps, Test-driven Development (TDD), ETL, Microservices Architecture, Role-based Access Control (RBAC)

Platforms

AWS Lambda, Amazon Web Services (AWS), Mobile, Amazon EC2, AWS Cloud Computing Services, Android, Parse Server, Back4App, Apache Kafka, Docker, Kubernetes, Blockchain, KNIME, AWS Elastic Beanstalk

Storage

Amazon DynamoDB, Amazon S3 (AWS S3), Elasticsearch, NoSQL, Databases, MySQL, Sybase, Oracle SQL, AWS Data Pipeline Service, Redshift, PostgreSQL, Microsoft SQL Server, Data Pipelines, Amazon Aurora, MongoDB

Industry Expertise

Project Management

Other

Back-end, Architecture, Leadership, Technical Architecture, Serverless, Cloud, Cloud Architecture, Amazon RDS, System Architecture, APIs, Integration, Algorithms, Debugging, Distributed Systems, AWS DevOps, CI/CD Pipelines, Data Structures, Computer Science, Team Mentoring, Software Design, Near-field Communication (NFC), Scalability, Solution Architecture, Data Engineering, Project Estimation, JSI, Amazon Ion, Amazon API Gateway, Sybase IQ, Material Design, Optimization, Stock Trading, Full-stack, Third-party APIs, Big Data, Bitcoin, CDC, Data Analysis, Data Analytics, Trading, Fintech, Automated Trading Software, Web Development
