Krishna Inapurapu, Developer in Clarington, ON, Canada

Krishna Inapurapu

Verified Expert in Engineering

Data Engineer and Developer

Clarington, ON, Canada

Toptal member since November 15, 2024

Bio

With 15+ years of experience in the banking and automobile industries, Krishna specializes in designing and delivering enterprise-grade solutions using Java, Spring, and Azure data engineering. His expertise lies in architecting scalable, secure, and high-performance data infrastructures that drive innovation and operational efficiency in complex, regulated environments. Krishna excels in leveraging technology to solve challenging problems and enhance business outcomes.

Portfolio

General Motors
Oracle, Microservices, Spring Boot, Azure Databricks, FastAPI, Azure Functions...

Experience

  • PySpark - 10 years
  • Python - 10 years
  • Java EE 8 - 10 years
  • SQL - 10 years
  • Azure Data Lake - 8 years
  • Azure Databricks - 8 years
  • Azure Synapse - 6 years
  • Azure Event Hubs - 6 years

Availability

Full-time

Preferred Environment

Data Engineering, JBoss, IBM WebSphere, Application Servers, DataOps, Azure, Jira, Confluence, Jenkins, CI/CD Pipelines, DevOps, Azure DevOps, JavaBeans, Enterprise Java Beans (EJB) 3, Enterprise Java Beans (EJB), EJB 3, Spring, Git, SourceTree, Java, Python, IBM Rational ClearCase, IBM Rational ClearQuest, Cloud, Spark, Hadoop, Azure Cosmos DB, IBM Db2, Azure Event Hubs, Azure Synapse Analytics, Oozie, Data Architecture, Cosmos, MySQL, Azure Data Factory (ADF), SQL Server BI, Programming, Microservices, Oracle, Spring Boot, Azure Databricks, FastAPI, PySpark, Azure Functions, Databricks, Autosys, Apache Hive

The most amazing...

...work I've done involves designing and delivering enterprise-grade solutions using Azure Cloud, data engineering, Databricks, Java, and Spring.

Work Experience

Senior BI and Data Architect

2014 - 2024
General Motors
  • Designed efficient, cost-effective data solutions with a focus on performance optimization for fast data processing. Led data governance and security work, ensuring that data management and processing complied with regulatory standards.
  • Performed data transformations in Azure Databricks using aggregations and window functions (a minimal sketch follows this role's technology list). Built streaming pipelines for real-time analytics, applying SQL, Java, and Python for back-end integration and custom processing.
  • Delivered cloud solutions and migrations on Azure for efficient development and deployment, using Azure Functions, Data Factory, Databricks, Synapse Analytics, Event Hubs, Cosmos DB, Snowflake, Kubernetes Service, Docker, and Protocol Buffers.
Technologies: Oracle, Microservices, Spring Boot, Azure Databricks, FastAPI, Azure Functions, Databricks, Autosys, Apache Hive, Data Engineering, JBoss, IBM WebSphere, Application Servers, DataOps, Azure, Jira, Confluence, Jenkins, CI/CD Pipelines, DevOps, Azure DevOps, JavaBeans, Enterprise Java Beans (EJB) 3, Enterprise Java Beans (EJB), EJB 3, Spring, Git, SourceTree, Java, IBM Rational ClearCase, IBM Rational ClearQuest, Cloud, Spark, Hadoop, Azure Cosmos DB, IBM Db2, Azure Event Hubs, Azure Synapse Analytics, Oozie, Data Architecture, Cosmos, MySQL, Azure Data Factory (ADF), SQL Server BI, Programming, Java EE 8, PySpark, Python, SQL, Unix, Business Intelligence (BI), Snowflake, Azure Kubernetes Service (AKS), Docker
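
A minimal PySpark sketch of the aggregation and window-function transformations referenced in the second bullet above. The table and column names (raw.events, entity_id, event_time, amount) are hypothetical stand-ins, not the actual GM schema.

```python
# Hypothetical sketch: aggregations and window functions in PySpark on Databricks.
# Table and column names are illustrative assumptions, not a real schema.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # Databricks notebooks provide this session automatically

events = spark.table("raw.events")  # assumed source table

# Window functions: running total and ordering of events per entity.
per_entity = Window.partitionBy("entity_id").orderBy("event_time")
enriched = (
    events
    .withColumn("running_total", F.sum("amount").over(per_entity))
    .withColumn("event_rank", F.row_number().over(per_entity))
)

# Aggregations: daily totals and counts per entity.
daily_summary = (
    events
    .groupBy("entity_id", F.to_date("event_time").alias("event_date"))
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("event_count"))
)

enriched.write.mode("overwrite").saveAsTable("curated.events_enriched")
daily_summary.write.mode("overwrite").saveAsTable("curated.daily_summary")
```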

Experience

Subscription App with FastAPI, Azure Event Hubs, & Azure Databricks

A subscription management app in which a FastAPI RESTful service handles user interactions and publishes each request to Azure Event Hubs.
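
A minimal sketch of that request-intake path, assuming the azure-eventhub Python SDK; the hub name, connection-string variable, and payload fields are illustrative, not the app's real configuration.

```python
# Hypothetical sketch: FastAPI endpoint publishing subscription requests to Azure Event Hubs.
# The connection-string variable, hub name, and payload fields are illustrative assumptions.
import json
import os

from azure.eventhub import EventData, EventHubProducerClient
from fastapi import FastAPI
from pydantic import BaseModel  # assumes Pydantic v2

app = FastAPI()

producer = EventHubProducerClient.from_connection_string(
    conn_str=os.environ["EVENTHUB_CONNECTION_STRING"],
    eventhub_name="subscriptions",
)


class SubscriptionRequest(BaseModel):
    customer_id: str
    plan: str
    action: str  # e.g., "subscribe" or "cancel"


@app.post("/subscriptions")
def create_subscription(request: SubscriptionRequest) -> dict:
    # Each API call becomes one event on the hub; downstream Azure Functions
    # and Databricks jobs pick it up for real-time processing.
    batch = producer.create_batch()
    batch.add(EventData(json.dumps(request.model_dump())))
    producer.send_batch(batch)
    return {"status": "accepted"}
```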

Each request triggers an Azure Function, which orchestrates the data pipeline for real-time processing. All data transformations occur in Azure Databricks notebooks, where PySpark prepares and transforms the data according to business logic.
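
A hedged sketch of the Databricks side of that pipeline, assuming the azure-eventhubs-spark connector is attached to the cluster and the code runs inside a Databricks notebook (where spark, sc, and dbutils are predefined); the schema, secret scope, and table names are illustrative.

```python
# Hypothetical sketch: streaming subscription events from Event Hubs into Delta
# with PySpark in a Databricks notebook. Assumes the azure-eventhubs-spark
# connector on the cluster; spark, sc, and dbutils are Databricks-provided globals.
from pyspark.sql import functions as F
from pyspark.sql.types import StringType, StructField, StructType

connection_string = dbutils.secrets.get("subscriptions", "eventhub-connection")  # assumed secret scope
eh_conf = {
    # Encrypting the connection string is the pattern documented for this connector.
    "eventhubs.connectionString":
        sc._jvm.org.apache.spark.eventhubs.EventHubsUtils.encrypt(connection_string)
}

event_schema = StructType([
    StructField("customer_id", StringType()),
    StructField("plan", StringType()),
    StructField("action", StringType()),
])

events = (
    spark.readStream.format("eventhubs").options(**eh_conf).load()
    .select(
        F.from_json(F.col("body").cast("string"), event_schema).alias("event"),
        F.col("enqueuedTime").alias("event_time"),
    )
    .select("event.*", "event_time")
    .withColumn("event_date", F.to_date("event_time"))
)

(
    events.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/subscriptions")  # assumed mount path
    .toTable("curated.subscription_events")
)
```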

This architecture ensures scalability, efficient data handling, and a responsive user experience, enabling seamless and secure subscription management with robust back-end processing.

Enterprise Solutions with the Azure Ecosystem

A project involving secure, scalable data solutions for a banking enterprise using the Azure ecosystem, Java, and Python.

Azure Data Factory manages ETL workflows, integrating data from various sources. Data is transformed in Azure Databricks using PySpark, while Azure Synapse Analytics serves as a scalable data warehouse for analytics. Unix scripting automates processes, and Java and Python enable custom activities and transformations, enhancing performance and flexibility.
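
One illustrative step from such a pipeline, assuming the Azure Databricks-to-Synapse connector (com.databricks.spark.sqldw); the JDBC URL, staging path, and table names are hypothetical, not the bank's real configuration.

```python
# Hypothetical sketch: aggregating transactions in PySpark on Databricks and loading
# the result into an Azure Synapse Analytics table via the built-in Synapse connector.
# The JDBC URL, staging path, and table names are illustrative assumptions.
from pyspark.sql import functions as F

transactions = spark.table("raw.transactions")  # assumed source table; spark is the Databricks session

daily_summary = (
    transactions
    .groupBy("account_id", F.to_date("posted_at").alias("posting_date"))
    .agg(F.sum("amount").alias("daily_total"), F.count("*").alias("txn_count"))
)

(
    daily_summary.write
    .format("com.databricks.spark.sqldw")
    .option("url", "jdbc:sqlserver://example-synapse.sql.azuresynapse.net:1433;database=analytics")  # assumed
    .option("tempDir", "abfss://staging@exampledatalake.dfs.core.windows.net/synapse")               # assumed
    .option("forwardSparkAzureStorageCredentials", "true")
    .option("dbTable", "reporting.daily_account_summary")
    .mode("overwrite")
    .save()
)
```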

This solution supports secure, efficient data processing and reporting, meeting high compliance standards in the financial sector.

Education

2000 - 2004

Bachelor's Degree in Computer Science

Jawaharlal Nehru Technological University Hyderabad - Hyderabad, Telangana, India

Certifications

AUGUST 2021 - PRESENT

Microsoft Certified: Azure Data Engineer

Microsoft

Skills

Libraries/APIs

PySpark

Tools

Application Servers, Jira, Confluence, Jenkins, Git, SourceTree, IBM Rational ClearCase, IBM Rational ClearQuest, SQL Server BI, Oozie, Autosys, Azure Kubernetes Service (AKS)

Languages

Java EE 8, SQL, Python, Java, Snowflake

Frameworks

JavaBeans, Spring, Spring Boot, Spark, Hadoop

Paradigms

DevOps, Azure DevOps, Microservices, Business Intelligence (BI)

Platforms

Azure Synapse, Azure Event Hubs, Unix, JBoss, IBM WebSphere, Azure, Oracle, Azure Functions, Azure Synapse Analytics, Databricks, Docker

Storage

Azure Cosmos DB, MySQL, IBM Db2, Apache Hive

Other

Data Engineering, Azure Data Lake, Azure Databricks, Azure Data Factory (ADF), DataOps, CI/CD Pipelines, Enterprise Java Beans (EJB) 3, Enterprise Java Beans (EJB), EJB 3, Cosmos, Programming, Cloud, Data Architecture, FastAPI
