Harshad Saglani

Developer in Pune, Maharashtra, India

Verified Expert in Engineering

Bio

Harshad has 20+ years of experience as a developer, data engineer, and solution architect specializing in cloud-based (Azure and AWS) enterprise software and data products, including 5+ years as a data engineer. He is an expert Python, SQL, Apache Spark, and Databricks developer.

Portfolio

Everest Group (Reinsurance) via Synechron
Azure, Azure Synapse, Azure Databricks, Azure Data Factory (ADF)...
Antibes Shipservices
Data Engineering, Odoo, Python, PostgreSQL, Databases, CRM APIs
Majesco
Oracle, Oracle PL/SQL, Data Engineering, MongoDB, Apache Spark, Data Pipelines...

Experience

  • Oracle PL/SQL - 12 years
  • SQL - 12 years
  • Data Engineering - 4 years
  • Data Warehousing - 3 years
  • Python - 2 years
  • Databricks - 1 year
  • NoSQL - 1 year
  • Apache Spark - 1 year

Availability

Part-time

Preferred Environment

SQL, NoSQL, Apache Spark, Data Engineering, Data Warehousing, Python, ETL, Databricks, Azure, Data Migration

The most amazing...

...system I've developed checks the overall health of operational systems, uncovers system issues, and increases management's confidence in the reports.

Work Experience

Lead Data Engineer

2023 - 2024
Everest Group (Reinsurance) via Synechron
  • Created data pipelines that consume raw data and harmonize, curate, and transform it across different containers, using Azure Databricks for the transformations (see the sketch after this entry).
  • Managed the project's go-live activities and automated the reconciliations for data that moved through many systems.
  • Served as the owner of the client's accounting system, which calculates ceded premiums and losses for reinsurance. Performed deployments and disaster recovery testing, set up automated backups, and created access management documents.
Technologies: Azure, Azure Synapse, Azure Databricks, Azure Data Factory (ADF), Azure Data Lake, Apache Spark, SQL, Python, Data Warehouse Design, ETL, ELT, Databases, Dashboards, Databricks
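
For illustration, a minimal PySpark sketch of the raw-to-curated hop described in the first bullet; the storage account, container paths, column names, and harmonization rules are hypothetical assumptions, not the client's actual schema.

```python
# Minimal sketch of a raw -> harmonized -> curated hop between lake
# containers. All paths, column names, and rules below are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("reinsurance-pipeline-sketch").getOrCreate()

RAW = "abfss://raw@examplelake.dfs.core.windows.net/premiums/"          # hypothetical
CURATED = "abfss://curated@examplelake.dfs.core.windows.net/premiums/"  # hypothetical

raw_df = spark.read.parquet(RAW)

# Harmonize: standardize names and types so every consumer sees one schema.
harmonized = (
    raw_df
    .withColumnRenamed("prem_amt", "premium_amount")
    .withColumn("premium_amount", F.col("premium_amount").cast("decimal(18,2)"))
    .withColumn("effective_date", F.to_date("effective_date"))
)

# Curate: drop invalid rows and de-duplicate per policy and effective date.
curated = (
    harmonized
    .filter(F.col("premium_amount").isNotNull())
    .dropDuplicates(["policy_id", "effective_date"])
)

curated.write.mode("overwrite").partitionBy("effective_date").parquet(CURATED)
```

Keeping each hop in its own container leaves the raw zone immutable, so any curated output can be rebuilt from the original data.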

Developer | Odoo ERP (via Toptal)

2022 - 2022
Antibes Shipservices
  • Created a custom app for Odoo ERP using Python, PostgreSQL, and the Odoo ORM API. The client used this app to merge their products into product variants without losing original product IDs and historical data (see the sketch after this entry).
  • Built a smart Excel sheet (using macros) that let the client quickly decide which products to merge.
  • Developed the app to perform many validations before merging, automatically create the required product attributes, and update variant prices based on the price difference between the base product and the variant.
Technologies: Data Engineering, Odoo, Python, PostgreSQL, Databases, CRM APIs
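
A hypothetical sketch of the merge step via the Odoo ORM API: product.template and product.product are standard Odoo models, but the wizard model, method, and merge rules below are illustrative assumptions, not the actual app.

```python
# Hypothetical sketch of merging standalone products into variants of one
# template while keeping their original record IDs (and thus history).
from odoo import models


class ProductMergeWizard(models.TransientModel):
    _name = "product.merge.wizard"  # hypothetical model name
    _description = "Merge standalone products into variants of one template"

    def merge_into_variants(self, template_id, product_ids, value_ids_by_product):
        """Attach existing products to one template as its variants."""
        template = self.env["product.template"].browse(template_id)
        for product in self.env["product.product"].browse(product_ids):
            if product.product_tmpl_id == template:
                continue  # already a variant of the target template
            # Re-point the product at the target template instead of deleting
            # and recreating it, so the record ID and its history survive.
            product.write({
                "product_tmpl_id": template.id,
                # (6, 0, ids) replaces the m2m set with the given attribute values.
                "product_template_attribute_value_ids": [
                    (6, 0, value_ids_by_product.get(product.id, [])),
                ],
            })
        return True
```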

Principal Architect

2015 - 2021
Majesco
  • Developed a data product using MongoDB as a NoSQL database and Apache Spark to create operational reporting data, replacing Oracle-based reporting and saving customers more than 50% on software license costs (see the sketch after this entry).
  • Served as a product owner, delivering solutions, approaches, and significant enhancements to Majesco's flagship policy administration product to fulfill customer requirements. Provided consultations to the customer and implementation teams.
  • Improved the policy administration product, including an automated transaction processor, operational data store, actuarial reporting, a data migration module, and API specifications for the entire product and integration with AssureSign.
  • Built a data warehouse system that extracts data from the policy administration system and performs complex transformations for actuarial, statistical, and regulatory reporting.
  • Developed a set of utilities that checks the overall health of the operational systems, uncovers system issues, and helps management reconcile report data from various departments and gain more confidence in the summary reports.
  • Served as the product owner of a very large system, responsible for the quality and on-time delivery of product features each month. This included managing the product roadmap with the product manager and supporting the sales and implementation teams.
  • Managed the product QA and development teams, coordinated the product release (DevOps) and documentation activities to ensure smooth releases to all customers, and interacted with customers to understand their concerns and needs.
Technologies: Oracle, Oracle PL/SQL, Data Engineering, MongoDB, Apache Spark, Data Pipelines, Python, ETL, ELT, PySpark, Data Migration, Product Management, Leadership, Client Interaction, SQL, Agile, Python 3, NoSQL, Data Warehousing, Boto 3, Pandas, Spark, Amazon Web Services (AWS), Functional Requirements, System Requirements, Business Requirements, Management, Migration, APIs, Databases, Integration, Business Intelligence (BI), Dashboards, CRM APIs, AWS Glue, AWS Lambda, Amazon S3 (AWS S3)
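
A hedged sketch of the MongoDB-plus-Spark reporting approach from the first bullet; the connector version, connection URI, database, collection, and column names are all illustrative assumptions.

```python
# Sketch: build operational report data with Spark over MongoDB, in place
# of Oracle-based reporting. URI, database, collection, and columns are
# assumptions, not the product's actual schema.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("operational-reports-sketch")
    # MongoDB Spark connector package; the version pin is an assumption.
    .config("spark.jars.packages",
            "org.mongodb.spark:mongo-spark-connector_2.12:10.2.1")
    .config("spark.mongodb.read.connection.uri", "mongodb://localhost:27017")
    .getOrCreate()
)

transactions = (
    spark.read.format("mongodb")
    .option("database", "policy_admin")    # hypothetical database
    .option("collection", "transactions")  # hypothetical collection
    .load()
)

# One sample operational report: written premium per product per month.
report = (
    transactions
    .withColumn("month", F.date_trunc("month", F.col("transaction_date")))
    .groupBy("product_code", "month")
    .agg(F.sum("premium_amount").alias("written_premium"))
)

report.write.mode("overwrite").parquet("/tmp/reports/written_premium")
```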

Senior Architect

2013 - 2015
Cognizant
  • Led the team that developed an Android mobile application for a major online bank in the United States.
  • Developed the application's banking and credit card functionalities, including remote check deposits, statements, fund transfers, payments, promotions, and an ATM locator.
  • Managed the development of the banking and credit card application for Android tablets.
Technologies: Management, Leadership

Lead Consultant

2005 - 2012
Capgemini
  • Gathered customer requirements and conducted feasibility analysis workshops for customers while serving as a functional lead.
  • Managed the delivery of enhancements and maintenance releases for customers as a systems analyst for the customer service team.
  • Worked as a senior developer for a customer's enterprise billing system in Oracle PL/SQL.
Technologies: Oracle, Oracle PL/SQL, Client Interaction, Functional Requirements, System Requirements, SQL, Business Requirements, Management, Leadership

Software Engineer

2001 - 2005
STGIL (Acquired by Majesco)
  • Developed new product modules, such as document generation, printing, and surcharge ratings.
  • Led product implementations on-site for customers in the United States.
  • Performed research and development for new product features. Designed and implemented the product health check tool and dependency matrix tool.
  • Created the data migration and data generation modules to load transactions from external or legacy systems into the policy administration system.
Technologies: Oracle PL/SQL, Data Migration, Client Interaction, SQL, Oracle, Data Engineering, Data Warehousing, ETL, ELT, Leadership, Migration, Databases, Dashboards

Experience

Report Reconciliation Utility

Actuarial departments in insurance companies determine the rates customers pay and receive for their policies. Actuaries review summary reports of premiums, losses, cancellations, and conversions.

The data for these reports came from multiple systems, and summary reports didn't match exactly with the operational reports from those systems. The discrepancies were caused by differences such as using effective dates versus calendar dates, back-dated transactions, and system bugs.

We needed all the reports to match to be confident that our reporting and operational systems were in good health. To accomplish this, I wrote a set of Oracle PL/SQL programs that scanned transactions from the source systems and compared them with the reporting data, and vice versa.
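
The original utility was pure Oracle PL/SQL; the sketch below only approximates the comparison idea as a Python wrapper around an illustrative reconciliation query, with assumed table and column names.

```python
# Hedged sketch of the reconciliation idea: find policies whose summed
# operational transactions disagree with the reporting summary.
# Table and column names are assumptions.
import oracledb  # python-oracledb driver

RECON_SQL = """
SELECT COALESCE(t.policy_no, r.policy_no) AS policy_no,
       t.premium AS txn_premium,
       r.premium AS rpt_premium
FROM   (SELECT policy_no, SUM(amount) AS premium
        FROM   policy_transactions
        GROUP  BY policy_no) t
FULL OUTER JOIN reporting_summary r
       ON r.policy_no = t.policy_no
WHERE  t.premium IS NULL
   OR  r.premium IS NULL
   OR  t.premium <> r.premium
"""

def find_discrepancies(dsn, user, password):
    """Return policies missing from either side or with mismatched amounts."""
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(RECON_SQL)
            return cur.fetchall()
```

The FULL OUTER JOIN covers both directions of the comparison in one pass, flagging rows missing from either side as well as amount mismatches.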

This utility was a huge success: it saved at least one week every month for the one or two people who had previously verified report discrepancies by hand. It also produced a high level of confidence in the systems, continually uncovered system issues, and was very helpful in checking their overall health.

Data Engineering

https://github.com/harsag/hars-data-engg
In this project, I tested bulk processing and automated data pipelines with Amazon DynamoDB, AWS Lambda, Amazon S3, Boto3, and PostgreSQL. I also tested various approaches to data transformation using SQL, pandas, and Spark (partitioning with Parquet files) and compared the performance of each approach.
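
A condensed sketch in the spirit of that comparison, not the repository's actual code; the dataset path and column names are assumptions.

```python
# Same aggregation on two engines, with the Spark side reading a
# date-partitioned Parquet dataset. Path and columns are assumptions.
import time

import pandas as pd
from pyspark.sql import SparkSession, functions as F

PARQUET_DIR = "/tmp/events/"  # hypothetical dataset partitioned by event_date

# pandas: simple and fast for data that fits in one machine's memory.
t0 = time.time()
pdf = pd.read_parquet(PARQUET_DIR)
pandas_result = pdf.groupby("event_date")["amount"].sum()
print(f"pandas: {time.time() - t0:.2f}s")

# Spark: partition pruning means only the matching date folders are scanned.
spark = SparkSession.builder.appName("engine-comparison").getOrCreate()
t0 = time.time()
sdf = spark.read.parquet(PARQUET_DIR).filter(F.col("event_date") >= "2024-01-01")
spark_result = sdf.groupBy("event_date").agg(F.sum("amount")).collect()
print(f"spark: {time.time() - t0:.2f}s")
```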

Handwritten Digit Classifier

https://github.com/harsag/hars-mnist
A handwritten digit classifier based on a deep learning model trained on the Modified National Institute of Standards and Technology (MNIST) dataset. The application uses the fastai library, which uses PyTorch internally. The user can draw any digit in the application, and the application identifies it.

This project has a neural network (deep learning) model and a UI where users can draw a digit and see the result (prediction). I worked on this project to gain a working knowledge of developing and deploying a machine learning model and using it from a UI, data pipeline, or workflow.
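
A minimal sketch of the training and prediction path with fastai, using the library's bundled MNIST sample so it runs quickly; the backbone and single fine-tuning epoch are illustrative choices, and the drawing UI wiring is omitted.

```python
# Sketch of the fastai training/prediction path (PyTorch underneath).
# Uses the bundled MNIST sample (3s vs. 7s) for brevity; the real app
# trains on the full MNIST dataset.
from fastai.vision.all import *  # idiomatic fastai import

path = untar_data(URLs.MNIST_SAMPLE)  # downloads the small sample set
dls = ImageDataLoaders.from_folder(path, train="train", valid="valid")

# A small pretrained backbone is plenty for digits; one epoch is an
# illustrative schedule, not the app's actual training regime.
learn = vision_learner(dls, resnet18, metrics=accuracy)
learn.fine_tune(1)

# Predicting a user-drawn digit saved as an image would look like this:
# label, idx, probs = learn.predict(PILImage.create("drawn_digit.png"))
```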

Education

1995 - 1998

Bachelor's Degree in Computer Science and Engineering

Sant Gadge Baba Amravati University - Amravati, Maharashtra, India

1993 - 1995

Diploma in Electronics and Communication Engineering

Maharashtra State Board of Technical Education, Mumbai - Mumbai, Maharashtra, India

Certifications

JANUARY 2025 - PRESENT

Databricks Certified Associate Developer for Apache Spark 3.0 - Python

Databricks

DECEMBER 2024 - DECEMBER 2026

Databricks Certified Data Engineer Associate

Databricks

AUGUST 2020 - AUGUST 2021

Certified SAFe 5 Product Owner/Product Manager

Scaled Agile

Skills

Libraries/APIs

Pandas, PySpark

Tools

Boto 3, Odoo, Spark SQL, AWS Glue

Languages

SQL, Python 3, Python

Paradigms

Agile, Business Intelligence (BI), ETL, Management

Platforms

Oracle, Databricks, Amazon Web Services (AWS), Azure, Azure Synapse, AWS Lambda

Storage

Oracle PL/SQL, Databases, PostgreSQL, NoSQL, Data Pipelines, MongoDB, Microsoft SQL Server, Amazon S3 (AWS S3)

Frameworks

Apache Spark, Spark, Data Lakehouse, Adaptive Query Execution (AQE)

Other

Client Interaction, Computer Engineering, Computer Science, APIs, Integration, Data Engineering, Data Warehousing, ELT, Functional Requirements, Business Requirements, Dashboards, CRM APIs, Data Migration, System Requirements, Leadership, Product Management, Migration, Azure Databricks, Azure Data Factory (ADF), Azure Data Lake, Data Warehouse Design, Spark architecture
