Ranjit Kumar Lakshmana Gowda

Verified Expert in Engineering

Data Engineer and Developer

Location
Melbourne, Australia
Toptal Member Since
November 9, 2022

Ranjit is a results-oriented professional with a demonstrated history of solving complex business problems. He has over a decade of experience solving data problems across multiple domains, working with top companies in fields that include telecom, flight operations, retail, and agriculture, to name a few. His technical solutions have delivered significant time and cost savings and made a lasting impact in the data field.

Portfolio

Fujitsu Consulting
PySpark, Spark SQL, Python 3, Azure SQL, Azure Synapse, Azure Databricks...
Mindtree
Azure Cloud Services, Databricks, Oracle, SQL Server 2015, Azure Blobs...
Tecnotree
Pro*C, Oracle PL/SQL, C++, Unix Shell Scripting, Perl

Experience

Availability

Part-time

Preferred Environment

Databricks, PySpark, Oracle, Azure Synapse, Azure SQL, Oracle PL/SQL, Analytics, ELT, Spark Structured Streaming, SQL Server 2015

The most amazing...

...project I've done in the last year is with The Yield, an AI tech company, delivering solutions for continuous streaming ingestion using Databricks and Azure.

Work Experience

Senior Advanced Analytics and Data Platforms Consultant

2022 - PRESENT
Fujitsu Consulting
  • Tuned Oracle packages, procedures, and complex queries to improve performance. Executed the migration from on-premises SAS, SQL Server, and Oracle Database to Microsoft Azure using Azure Synapse Analytics, and worked on Delta Live Tables pipelines.
  • Worked in a fast-paced Agile environment, using Jira and Azure DevOps to track issues and sprints.
  • Set up the cloud environment, configured resources to communicate with each other and with external systems where required, developed and automated pipelines, and promoted them from development to user acceptance testing (UAT) and production.
  • Completed the entire migration from on-premises infrastructure to Microsoft Azure.
  • Replaced expensive services like Azure Cosmos DB and Kubernetes with Delta Live Tables pipelines, enabling the project to go live; a minimal sketch follows the technology list below.
  • Performed cost and performance optimization for customers across Australia and New Zealand and helped build batch and streaming pipelines using Apache Kafka and Azure Event Hubs.
  • Automated reporting of data pipeline stats, statuses, and events using Microsoft Power Automate and Azure Logic Apps.
Technologies: PySpark, Spark SQL, Python 3, Azure SQL, Azure Synapse, Azure Databricks, Azure Blob Storage API, Delta Lake, Delta Live Tables, Azure Blobs, Azure Logic Apps, Microsoft Power Automate, SQL Server Management Studio (SSMS), SQL Server 2016, Azure SQL Databases, Dedicated SQL Pool (formerly SQL DW), Azure SQL Data Warehouse, Azure Data Lake, Azure Data Factory, Azure Key Vault, Oracle, PL/SQL Tuning
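
Below is a minimal sketch of the kind of Delta Live Tables pipeline described above, written with the DLT Python API. The landing path, column names, and expectation rule are illustrative assumptions, not details from the actual project.

```python
import dlt
from pyspark.sql import functions as F

# Hypothetical landing zone; the real project's paths and schemas differ.
LANDING_PATH = "abfss://landing@example.dfs.core.windows.net/events/"

@dlt.table(comment="Bronze: raw events ingested continuously with Auto Loader.")
def events_bronze():
    return (
        spark.readStream.format("cloudFiles")   # Auto Loader incremental ingestion
        .option("cloudFiles.format", "json")
        .load(LANDING_PATH)
    )

@dlt.table(comment="Silver: validated, deduplicated events.")
@dlt.expect_or_drop("valid_event_id", "event_id IS NOT NULL")  # assumed quality rule
def events_silver():
    return (
        dlt.read_stream("events_bronze")
        .withColumn("ingested_at", F.current_timestamp())
        .dropDuplicates(["event_id"])           # assumed business key
    )
```

Run as a continuous DLT pipeline, this gives managed streaming ingestion without operating Kubernetes or Cosmos DB directly, which is the cost trade-off the bullet above refers to.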

Principal Full-stack Data Engineer

2017 - 2022
Mindtree
  • Saved $12,000 for Procter & Gamble in 2021 by re-architecting the pipelines to use Delta Live Tables instead of Azure SQL Database.
  • Brought the pipeline execution time down from 50 hours to under two hours by optimizing PySpark code and processes; see the sketch after the technology list.
  • Reduced the response time for end users of the SITA flight operations product from six minutes to under 20 seconds by archiving historical data into a data warehouse on Oracle Database.
  • Implemented an Apache NiFi data pipeline to process large datasets and configure data lookups. Used the Spark architecture, including Spark Core, PySpark, Spark SQL, Spark Streaming, and Spark RDDs, along with code and query optimization.
  • Created highly efficient Delta files for data transformation activities using Delta Lake with Azure Databricks.
  • Performed data ingestion into one or more Azure services and processed the data with Azure Databricks.
  • Created pipelines in Azure Data Factory (ADF) using linked services, datasets, and pipelines to extract, transform, and load data from different sources.
  • Received the 2021 Team Player Award for implementing Delta Lake for ADF pipelines, replacing Azure SQL Database and stored procedures with Spark SQL, Delta Live Tables, and Databricks notebooks, saving over 30 hours of resource usage.
  • Delivered a customized report to compare two Oracle databases with different versions, generating a list of components and mismatched data between the databases, which was extensively used during the Oracle upgrade.
Technologies: Azure Cloud Services, Databricks, Oracle, SQL Server 2015, Azure Blobs, Azure Key Vault, Azure Administrator, Azure SQL Databases, Azure SQL, PySpark, Spark SQL, Spark Core, Azure Data Factory, Azure Data Lake, Azure Logic Apps, Microsoft Power Automate, Azure DevOps
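
A hedged sketch of the kind of PySpark tuning behind the 50-hours-to-under-two improvement. The table paths, column names, and join shape are assumptions for illustration; the real pipelines differ.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("pipeline-tuning-sketch").getOrCreate()

sales = spark.read.format("delta").load("/mnt/bronze/sales")    # large fact table (assumed)
stores = spark.read.format("delta").load("/mnt/bronze/stores")  # small dimension (assumed)

# Broadcast the small dimension to replace a shuffle-heavy sort-merge join.
enriched = sales.join(broadcast(stores), "store_id")

# Filter and project early so wide operations touch less data.
recent = (
    enriched
    .where(F.col("sale_date") >= "2021-01-01")
    .select("store_id", "sku", "sale_date", "amount")
)

# Partitioned Delta output keeps downstream reads pruned.
(recent.write.format("delta")
    .mode("overwrite")
    .partitionBy("sale_date")
    .save("/mnt/silver/sales"))
```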

Senior Software Engineer

2011 - 2016
Tecnotree
  • Customized, tested, and deployed the telecom billing product @billity for customers such as MTN Swaziland, MTL Malawi, and MTN Uganda. Helped test the postpaid billing product end to end and move it to production.
  • Gathered requirements, documented the software requirements specification (SRS), and produced high-level and low-level design (HLD and LLD) documents before making customization code changes, adhering strictly to best practices.
  • Worked at customer locations in various countries, earning their confidence and trust through good communication and expertise.
  • Performed post-pay pre-bill checks before each bill run and ran bills with the revenue assurance team.
  • Managed Crystal Reports using PL/SQL procedures.
  • Dispatched confirmed invoices to customers.
Technologies: Pro*C, Oracle PL/SQL, C++, Unix Shell Scripting, Perl

Farming Solutions for an Australian Agricultural Company

I worked for an Australian company specializing in farming solutions using machine learning and artificial intelligence. As the data engineer on the project, I delivered the following results:

• Replaced a costly streaming architecture involving Kubernetes, Event Hubs, and Cosmos DB.
• Implemented Delta Live Tables with a medallion architecture to stream data and capture data changes with stability and reliability.
• Minimized the use of multiple Event Hubs with a simple, smart design.
• Stored data on Azure Blob Storage in Delta format instead of expensive Cosmos DB, saving the customer thousands of dollars every month.
• Helped with system design and architecture.
• Compressed the data using Brotli, Snappy, and zlib to store historical information, as sketched after this list.
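
A minimal sketch of comparing compression codecs when archiving historical data as Parquet. Paths are hypothetical; note that zlib corresponds to Parquet's gzip codec, and Brotli support depends on the codec being available on the cluster.

```python
# Assumed source table; replace with the real historical dataset.
history = spark.read.format("delta").load("/mnt/silver/telemetry")

# Write one archive per codec to compare file sizes and read performance.
# "gzip" is the Parquet codec backed by zlib/DEFLATE.
for codec in ["snappy", "gzip", "brotli"]:
    (history.write.mode("overwrite")
        .option("compression", codec)
        .parquet(f"/mnt/archive/telemetry_{codec}"))
```

Snappy typically favors speed while Brotli and gzip favor size, which is why size-oriented codecs suit rarely-read historical archives.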

Migration Project for an Insurance Department of the Australian Government

I worked on a migration project for an Australian government department. Working on this project, I extracted, integrated, and loaded data from over 15 sources using Azure Synapse Analytics and Azure Data Factory and transformed data into a medallion architecture. Furthermore, I helped generate meaningful reports from the data with PySpark's powerful transformations, set up Databricks SQL endpoints, and helped with seamless integration to Tableau and Microsoft Power BI.
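
As a brief illustration of the reporting side, here is a sketch of a gold-layer aggregate that a Databricks SQL endpoint could expose to Tableau or Power BI. The table, schema, and column names (claims, lodged_date, paid_amount) are purely hypothetical, not details of the government project.

```python
from pyspark.sql import functions as F

# Hypothetical silver table produced by the medallion pipeline.
claims = spark.read.format("delta").load("/mnt/silver/claims")

monthly = (
    claims
    .groupBy(F.date_trunc("month", F.col("lodged_date")).alias("month"), "claim_type")
    .agg(F.count("*").alias("claim_count"), F.sum("paid_amount").alias("total_paid"))
)

# Registering the result as a table makes it queryable from BI tools
# through a Databricks SQL endpoint (assumes a "gold" schema exists).
monthly.write.format("delta").mode("overwrite").saveAsTable("gold.monthly_claims")
```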

Procter & Gamble Data Engineering Project with Mindtree

I worked on Microsoft Azure cloud services and Databricks services to create data pipelines and provide insights into products, sales, and store information for data collected from over 30 countries and multiple sources.

As a lead data engineer, I worked on requirements gathering, documented HLDs and LLDs, and focused on performance improvements and cost optimizations. I used the Delta Lakehouse platform to replace Azure SQL DB to save the customer over $7,000 monthly and bring the overall pipeline time to under 1.5 hours from over 30 hours.
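
A hedged sketch of the Delta Lake upsert pattern that commonly replaces stored-procedure merges when moving off Azure SQL DB. The paths and merge keys are assumptions for illustration.

```python
from delta.tables import DeltaTable

# Assumed target and daily increment; the real tables and keys differ.
target = DeltaTable.forPath(spark, "/mnt/gold/product_sales")
updates = spark.read.format("delta").load("/mnt/silver/product_sales_daily")

(target.alias("t")
    .merge(
        updates.alias("s"),
        "t.product_id = s.product_id AND t.sale_date = s.sale_date",
    )
    .whenMatchedUpdateAll()     # update rows already present
    .whenNotMatchedInsertAll()  # insert new rows
    .execute())
```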

SITA Flight Operations Project with Mindtree

I worked on packages and procedures for data migration from the production environment to the test environment. By automating fixes for recurring daily bugs using DBMS jobs and procedures, I minimized manual intervention and set up the report generation process to run from a server and spool the necessary information for other processes.

Telecom Rating and Billing Product with Tecnotree

I performed post-pay pre-bill checks before each bill run, ran bills with the revenue assurance team, optimized the billing process, handled billing changes, and assisted with resolving issues. I was in charge of Crystal Reports using PL/SQL procedures and created new packages and offers for customers.

MTN Telecom Rating and Billing Project with Tecnotree

Based on the requirements provided and customer needs, I led the design and development of procedures, functions, and application programming interfaces (APIs) using PL/SQL types, interacting with the Ericsson team on interface modules.

Languages

Python 3, C++, Pro*C, Perl

Frameworks

Spark Structured Streaming, Spark, Data Lakehouse

Libraries/APIs

PySpark, Azure Blob Storage API

Platforms

Azure, Databricks, Oracle, Azure Synapse, Microsoft Power Automate, Azure SQL Data Warehouse, Spark Core, Dedicated SQL Pool (formerly SQL DW)

Storage

Azure SQL, Oracle PL/SQL, Azure Cloud Services, Database Management Systems (DBMS), Azure Blobs, SQL Server Management Studio (SSMS), SQL Server 2016, Azure SQL Databases, Data Lakes, PL/SQL, PL/SQL Developer

Other

Azure Data Factory, MetaStore, Data Engineering, Unity Catalog, Software Development, Analytics, Azure Databricks, Delta Lake, Delta Live Tables, Azure Data Lake, PL/SQL Tuning, SQL Server 2015, Unix Shell Scripting, Visualization, Data, Azure Administrator, ELT

Tools

Spark SQL, Azure Logic Apps, Azure Key Vault

Paradigms

Azure DevOps

2004 - 2008

Bachelor's Degree in Information Technology

Visvesvaraya Technological University - Belagavi, Karnataka, India

SEPTEMBER 2022 - SEPTEMBER 2024

Databricks Certified Data Engineer Associate

Databricks

AUGUST 2022 - AUGUST 2023

Databricks Accredited Lakehouse Platform Fundamentals

Databricks

AUGUST 2022 - AUGUST 2024

Databricks Certified Data Analyst Associate

Databricks

JULY 2022 - JULY 2024

Databricks Certified Associate Developer for Apache Spark 3.0

Databricks

DECEMBER 2021 - DECEMBER 2023

Microsoft Azure Data Engineer Associate

Microsoft
