Jagdish Wagh, Developer in Bayreuth, Bavaria, Germany

Jagdish Wagh

Verified Expert in Engineering

Bio

Jagdish is a data engineer who has worked across several domains, including retail, insurance, and manufacturing. He has extensive experience implementing ETL pipelines and creating data models using Data Vault and Kimball principles. Jagdish has worked on the AWS and Azure clouds and has implemented cloud data warehouses using AWS Redshift and Azure Synapse Analytics.

Portfolio

medi GmbH & Co. KG
Data Vaults, Azure Analysis Services, Azure Synapse, SQL Management Studio...
TietoEVRY
Data Analytics, Databases, Data Warehouse Design, Azure Synapse, Snowflake...
TietoEVRY
Informatica ETL, Teradata, Control-M, Amazon EC2, Matillion ETL for Redshift...

Experience

Availability

Part-time

Preferred Environment

Python, Microsoft Power BI, Data Modeling, Azure Data Factory, SQL, MSBI, Terraform, Databricks, ETL Tools, Data Warehouse Design

The most amazing...

...thing I've developed is a data warehouse and data mart application built on Kimball and Data Vault 2.0 principles in both on-prem and cloud environments.
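As a rough illustration of the Data Vault side of that work, hub and link keys are typically derived by hashing normalized business keys; the identifiers below are invented for the example:

```python
import hashlib

def hash_key(*business_keys: str) -> str:
    """Derive a deterministic Data Vault hash key from one or more business keys."""
    # Normalize first so "C-1001" and " c-1001 " hash identically
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# Hub row: one entry per unique business key
customer_hub = {"customer_hk": hash_key("C-1001"), "customer_id": "C-1001"}

# Link row: relates hubs via their combined business keys
order_link = {
    "link_hk": hash_key("C-1001", "O-5001"),
    "customer_hk": hash_key("C-1001"),
    "order_hk": hash_key("O-5001"),
}
```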

Work Experience

Data Engineer

2020 - PRESENT
medi GmbH & Co. KG
  • Migrated and developed data analytics and data warehouse workloads using on-prem and cloud technologies and implemented a hybrid architecture. Worked on different use cases, including stock controlling, production orders, and a sales management cockpit.
  • Created a job-auditing framework to monitor data pipeline execution times and error messages, enabling the team to identify and fix issues quickly and saving significant time.
  • Implemented an automated backup and recovery process for Azure Analysis Services cubes using PowerShell, making it possible to restore a previous cube state within minutes.
  • Created an HR management data mart to support workflows across the organization. Its dashboard gives management visibility into sickness percentages, headcounts, and paternity leave, supporting better decision-making.
Technologies: Data Vaults, Azure Analysis Services, Azure Synapse, SQL Management Studio, Terraform, Azure DevOps Services, Azure Data Lake, Azure Data Factory, Microsoft Power BI, Python, Databricks, Azure Automation, ARM, SQL, ETL, Data Governance, Visual Studio Code (VS Code), Data Warehouse Design, Database Architecture, Unix Shell Scripting, Microsoft SQL Server, Windows PowerShell, Jira, Azure Logic Apps, Cloud, Microsoft Report Builder, Cloud Architecture, PySpark, Delta Lake, Azure Data Lake Analytics, Dedicated SQL Pool (formerly SQL DW), Azure SQL Data Warehouse, Azure SQL Databases, Azure Functions, Azure Stream Analytics, Data Analytics, Data Warehousing, Microsoft, T-SQL (Transact-SQL), Azure DevOps, Database Migration, SSAS Tabular, MSBI, Data Engineering, SSRS Reports, Microsoft PowerPoint, Database Security, Database Design, HTTP REST, Pandas, SQL DDL, Spark SQL, REST APIs, Azure Administrator, SQL Server Administration, Visual Studio 2017, PyCharm, MySQL, Linux, SQL Server 2017, JSON, Data Architecture, Analytics, Microsoft Excel, Data Analysis, Data Lakes, ADF, Data Integration, Large Data Sets, Cloud Infrastructure, Microsoft Azure Cloud Server, Microsoft 365, Microsoft Power Apps, SQL Server Analysis Services (SSAS)
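A minimal sketch of such a job-auditing framework, using SQLite as a stand-in for the real audit table (the table name and columns are illustrative, not the actual schema):

```python
import sqlite3
import time

def run_audited(conn, job_name, job_fn):
    """Run one pipeline job and record its duration and any error message."""
    start = time.time()
    status, error = "SUCCESS", None
    try:
        job_fn()
    except Exception as exc:
        status, error = "FAILED", str(exc)
    conn.execute(
        "INSERT INTO job_audit (job_name, status, duration_s, error) VALUES (?, ?, ?, ?)",
        (job_name, status, time.time() - start, error),
    )
    conn.commit()
    return status

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE job_audit (job_name TEXT, status TEXT, duration_s REAL, error TEXT)")
run_audited(conn, "load_sales", lambda: None)   # completes normally
run_audited(conn, "load_stock", lambda: 1 / 0)  # fails; the error text is captured
```

Querying the audit table then shows each job's status and runtime, which is what makes slow or failing pipelines easy to spot.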

Senior Software Engineer

2019 - 2020
TietoEVRY
  • Created a data model and an enterprise data warehouse from scratch using the Kimball approach and handed it over to the team; built the ETL data pipeline using Databricks and Azure Data Factory.
  • Automated copying data files from the on-premises data center to Azure Storage with the AzCopy utility, using PowerShell and Unix shell scripts.
  • Contributed to the construction of a cloud data warehouse using the Matillion and Informatica Cloud ETL tools.
  • Worked on multiple POCs for a migration project from on-premises Teradata to a Snowflake database and Azure Synapse Analytics, using Informatica and Azure Data Factory.
  • Integrated new Azure network services into the existing platform using Terraform and ARM templates.
Technologies: Data Analytics, Databases, Data Warehouse Design, Azure Synapse, Snowflake, Informatica ETL, Informatica PowerCenter, Azure Data Factory, Teradata, Databricks, Azure DevOps Services, Matillion ETL for Redshift, SQL Server 2017, Azure Stream Analytics, SQL, Redshift, Python 3, Data Warehousing, Flask, REST APIs, Azure Administrator, SQL Server Administration, Visual Studio 2017, MySQL, Linux, JSON, Data Engineering, Apache Airflow, Data Architecture, Analytics, Microsoft Excel, Informatica, Data Analysis, Data Lakes, ADF, Data Integration, Large Data Sets, Cloud Infrastructure, Microsoft Azure Cloud Server, Microsoft 365, SQL Server Reporting Services (SSRS), SQL Stored Procedures, APIs
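The AzCopy step can be sketched as command construction around the AzCopy CLI; the storage account, container, and SAS token below are placeholders, not real values:

```python
def azcopy_copy_command(local_dir: str, account: str, container: str, sas_token: str) -> list:
    """Build an `azcopy copy` invocation for uploading a directory to Blob Storage."""
    dest = f"https://{account}.blob.core.windows.net/{container}?{sas_token}"
    return ["azcopy", "copy", local_dir, dest, "--recursive=true"]

# Placeholder values; a real script would read these from configuration
cmd = azcopy_copy_command("/data/exports", "examplestorage", "staging", "sv=...")
# subprocess.run(cmd, check=True) would then execute the transfer
```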

Software Engineer

2016 - 2019
TietoEVRY
  • Worked with clients to understand business needs and translate them into actionable reports in QlikView and Power BI, saving 17 hours of manual work each week.
  • Designed and implemented a real-time data pipeline that processed semi-structured and unstructured data, integrating 100 million raw records from eight data sources using Kafka events and PySpark and storing the processed data in Teradata.
  • Analyzed ETL pipeline runtimes and improved performance by creating database indexes and revising business logic.
  • Worked on a migration project from the AWS cloud to Azure using Azure Data Factory, database migration tooling, Databricks, Azure Storage, containers, and Event Hubs.
Technologies: Informatica ETL, Teradata, Control-M, Amazon EC2, Matillion ETL for Redshift, Azure Data Factory, Azure Synapse, Snowflake, ServiceNow, Databricks, Python, Visual Studio Code (VS Code), erwin Data Modeler, Data Warehouse Design, Azure Analysis Services, Data Governance, Unix Shell Scripting, Microsoft SQL Server, Windows PowerShell, Oracle, Redshift, SQL Server Integration Services (SSIS), Jira, Azure Logic Apps, Cloud, Informatica PowerCenter, Informatica Master Data Management (MDM), Kibana, Elasticsearch, Cloud Architecture, PySpark, Azure Data Lake Analytics, Dedicated SQL Pool (formerly SQL DW), Azure SQL Data Warehouse, Azure SQL Databases, Azure Functions, Azure Event Hubs, Azure Stream Analytics, Delta Lake, Data Analytics, Amazon Web Services (AWS), Data Warehousing, Microsoft, T-SQL (Transact-SQL), Azure DevOps, Database Migration, SSAS Tabular, SQL Management Studio, Microsoft Report Builder, MSBI, Data Engineering, SSRS Reports, Microsoft PowerPoint, Database Design, Azure IoT Hub, HTTP REST, Pandas, Toad, Workbench, Azure DevOps Services, Azure Automation, ARM, SQL DDL, Spark SQL, Flask, MySQL, Linux, SQL Server 2017, JSON, Apache Airflow, Data Architecture, Analytics, Microsoft Excel, Informatica, Data Analysis, Data Lakes, Data Integration, Large Data Sets, Cloud Infrastructure, Microsoft 365, SQL Server Reporting Services (SSRS), SQL Stored Procedures, APIs
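The flattening step of such a pipeline can be illustrated in plain Python (the real implementation used PySpark; the event shape below is invented for the example):

```python
import json

def flatten_event(raw: str) -> dict:
    """Flatten one semi-structured JSON event into a flat row for warehouse loading."""
    event = json.loads(raw)
    row = {"event_id": event.get("id"), "source": event.get("source")}
    # Promote nested payload fields to top-level columns with a prefix
    for key, value in event.get("payload", {}).items():
        row[f"payload_{key}"] = value
    return row

raw = '{"id": 1, "source": "pos", "payload": {"sku": "A-17", "qty": 3}}'
row = flatten_event(raw)
# row == {"event_id": 1, "source": "pos", "payload_sku": "A-17", "payload_qty": 3}
```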

Software Engineer

2014 - 2016
Trinus
  • Migrated SSIS ETL code to Informatica PowerCenter using shell scripts.
  • Created a job monitoring process to check all daily job statuses, fix failed runs, and deliver the data to end users.
  • Used Informatica PowerCenter to extract, transform, and load data from heterogeneous source systems into the target database. Extracted data from web services using Informatica's web service transformation (Personator).
Technologies: Informatica ETL, Informatica Data Quality, Unix Shell Scripting, Amazon EC2, Redshift, SQL Server Integration Services (SSIS), SSAS Tabular, SQL, erwin Data Modeler, Data Warehouse Design, Data Governance, Control-M, Microsoft SQL Server, Windows PowerShell, Jira, Informatica PowerCenter, Informatica Master Data Management (MDM), Data Analytics, Amazon Web Services (AWS), Data Warehousing, Microsoft, T-SQL (Transact-SQL), SQL Management Studio, Cloud, MSBI, Data Engineering, Microsoft PowerPoint, Database Design, Pandas, Toad, Workbench, SQL DDL, Linux, Data Architecture, Analytics, Microsoft Excel, Informatica, Data Analysis, Data Integration, APIs

Software Engineer

2013 - 2014
The Digital Group
  • Worked in the legal and finance domains on a migration project from legacy applications, moving from Oracle Pro*C and SQL*Loader to a data warehouse built with the Informatica ETL tool and Unix shell scripting.
  • Converted application Oracle Pro*C code into pipelines using Informatica ETL and Unix shell scripting. Created automated data quality profiling rules on the staging layer to cleanse the data before capturing it in the warehouse.
  • Created table-based reports with fast data delivery. Also captured customers' historical information so the client could see how customer records changed over time.
Technologies: Informatica ETL, SQL, Databases, ETL, Unix Shell Scripting, Control-M, Data Warehouse Design, Data Governance, Microsoft SQL Server, Jira, Informatica PowerCenter, Informatica Master Data Management (MDM), Data Analytics, Data Warehousing, Microsoft, T-SQL (Transact-SQL), SQL Management Studio, MSBI, Microsoft PowerPoint, SQL DDL, Informatica, Data Integration
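Capturing how customer records change over time is the classic Slowly Changing Dimension Type 2 pattern; below is a minimal sketch, with a single tracked attribute (`city`) invented for the example:

```python
from datetime import date

def apply_scd2(dimension: list, incoming: dict, today: date) -> None:
    """SCD Type 2: close the current row and append a new version on change."""
    current = next((r for r in dimension
                    if r["customer_id"] == incoming["customer_id"]
                    and r["end_date"] is None), None)
    if current and current["city"] == incoming["city"]:
        return  # no change, nothing to record
    if current:
        current["end_date"] = today  # close out the old version
    dimension.append({**incoming, "start_date": today, "end_date": None})

dim = []
apply_scd2(dim, {"customer_id": "C-1", "city": "Mumbai"}, date(2013, 1, 1))
apply_scd2(dim, {"customer_id": "C-1", "city": "Pune"}, date(2013, 6, 1))
# dim now holds two rows: the closed Mumbai version and the open Pune version
```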

eCommerce and Manufacturing Data Warehouse

Implemented an end-to-end data warehouse project using the Azure cloud stack. The visualizations were created in Power BI.

Outcomes:
• Extracted data from different source systems, including ERP, relational databases, real-time events, and files, and loaded it into the data lake and database staging layer.
• Implemented various business rules based on the customer's requirements and delivered the data to the customer through a reporting layer.
• Created the end-to-end data flow up to the reporting layer.
• Built a Delta warehouse in Databricks.

Warehouse layers:
Data lake, staging, ODS, and EDW.
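The layer-to-layer flow can be sketched as a chain of small transforms; the field names and the net-amount business rule below are illustrative only:

```python
def to_staging(raw: dict) -> dict:
    """Land source data unchanged, tagged with its origin."""
    return {**raw, "_source": "erp"}

def to_ods(staged: dict) -> dict:
    """Apply light cleansing before the operational data store."""
    return {k: v.strip() if isinstance(v, str) else v for k, v in staged.items()}

def to_edw(ods_row: dict) -> dict:
    """Apply business rules for the reporting layer (here: a made-up net-amount rule)."""
    return {**ods_row, "net_amount": round(ods_row["amount"] * 0.81, 2)}

row = to_edw(to_ods(to_staging({"order_id": "O-1", "amount": 100.0, "region": " EU "})))
```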

Data Migration to Public Cloud

I've been working on a migration project from an on-prem data center to the Azure cloud, using Azure services including database migration tooling, Data Factory, Data Lake, SQL Server databases, Azure DevOps, Automation, Logic Apps, and Power BI. Created an end-to-end migration roadmap and the cloud migration architecture.
2008 - 2012

Bachelor's Degree in Information Technology

Mumbai University - Mumbai, India

2007 - 2008

High School Diploma in Science

Amravati University - Amravati, India

APRIL 2021 - PRESENT

Hands On Essentials | Data Warehouse

Snowflake

JULY 2020 - PRESENT

Microsoft Certified | Azure Fundamentals

Microsoft

Libraries/APIs

PySpark, Pandas, REST APIs

Tools

Jira, SQL Management Studio, Informatica PowerCenter, Microsoft Excel, Terraform, Apache Airflow, Microsoft Power BI, Matillion ETL for Redshift, Informatica ETL, Control-M, Azure DevOps Services, Azure Logic Apps, Azure Automation, Microsoft Report Builder, Informatica Master Data Management (MDM), Kibana, Microsoft PowerPoint, Azure IoT Hub, Toad, Spark SQL, PyCharm, Microsoft Power Apps

Languages

SQL, T-SQL (Transact-SQL), SQL DDL, Python, Snowflake, Python 3

Frameworks

ADF, Windows PowerShell, Flask

Paradigms

Business Intelligence (BI), Database Design, Azure DevOps, ETL, DevOps

Platforms

Azure, Microsoft, Databricks, Unix, Azure Synapse, Oracle, Amazon EC2, Azure Event Hubs, Azure Functions, Azure SQL Data Warehouse, Visual Studio 2017, Linux, Dedicated SQL Pool (formerly SQL DW), Amazon Web Services (AWS), Visual Studio Code (VS Code)

Storage

Azure Cloud Services, Microsoft SQL Server, SQL Server Integration Services (SSIS), SSAS Tabular, Teradata, Data Integration, SQL Server Reporting Services (SSRS), SQL Stored Procedures, Database Architecture, Database Migration, Redshift, Azure SQL Databases, Database Security, SQL Server 2017, MySQL, JSON, Data Lakes, SQL Server Analysis Services (SSAS), Databases, Elasticsearch

Other

Data Warehousing, Data Analytics, ETL Tools, Data Modeling, Azure Data Lake, Azure Data Factory, Data Architecture, Data Vaults, Data Warehouse Design, Azure Analysis Services, Analytics, Informatica, Data Analysis, Large Data Sets, Microsoft Azure Cloud Server, APIs, Data Governance, Unix Shell Scripting, ServiceNow, ARM, MSBI, Cloud, Cloud Architecture, Delta Lake, Azure Stream Analytics, Azure Data Lake Analytics, Data Engineering, SSRS Reports, HTTP REST, Workbench, Azure Administrator, SQL Server Administration, Cloud Infrastructure, Microsoft 365, erwin Data Modeler, Informatica Data Quality
