Singala Mallikarjuna, Developer in Hyderabad, Telangana, India

Singala Mallikarjuna

Verified Expert in Engineering

Data Engineer and Software Developer

Location
Hyderabad, Telangana, India
Toptal Member Since
December 26, 2022

Singala has 8+ years of experience in the data industry, working as a data engineer and data analyst. He has expertise in BigQuery, Data Studio, Python, SQL, Airflow, GCP, and Oracle. Singala has worked in different domains, including healthcare, insurance, media, and retail, gaining experience in various data warehouses, databases, and orchestration and reporting tools. He loves writing complex queries and can adapt to most SQL and cloud platforms.

Portfolio

Intersoft Data Labs Pvt Ltd
Google BigQuery, Apache Airflow, Google Cloud Storage, Python, Jira, MySQL, SQL...
Indium Software
SQL, Apache Hive, Python, Data Pipelines, HDFS, Presto, Tableau, Jira, Pandas...
Tech Mahindra
BigQuery, ETL, Google Data Studio, Google Sheets, SQL, Google BigQuery...

Experience

Availability

Full-time

Preferred Environment

Google Cloud Platform (GCP), BigQuery, Apache Airflow, Git, Google Data Studio, Python, SQL, Cloud Dataflow, GCS

The most amazing...

...thing I've built is a data platform for the top leadership of Google to track performance metrics with a multitude of data sources and huge datasets.

Work Experience

Senior Data Engineer

2022 - 2022
Intersoft Data Labs Pvt Ltd
  • Built and maintained 25+ pipelines to migrate data from various data sources like GCS, Zendesk, Teradata, and SharePoint to BigQuery to be used for reporting purposes.
  • Created Airflow operators for data quality, data movement, and utilities.
  • Migrated a data warehouse from Teradata to BigQuery, porting complex queries and stored procedures and building the supporting pipelines.
Technologies: Google BigQuery, Apache Airflow, Google Cloud Storage, Python, Jira, MySQL, SQL, Data Pipelines, BigQuery, ETL, Data Engineering, Retail, Analytical Dashboards, Data Visualization, Google Cloud Platform (GCP), Reporting, Data Warehousing, ETL Tools, Microsoft Excel, Jupyter Notebook, Scripting Languages, Snowflake, Dashboard Design, Data Build Tool (dbt), Architecture, Data Manipulation, Data Reporting, Databases, Query Optimization
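The data-quality operators mentioned above can be illustrated with a plain-Python sketch of the kind of check a custom Airflow operator would wrap; the function names and tolerance logic here are illustrative assumptions, not the original code:

```python
# Minimal sketch of data-quality checks of the kind a custom Airflow
# operator might wrap when loading data from a source (e.g., GCS or
# Teradata) into BigQuery. All names here are illustrative assumptions.

def check_row_counts(source_count: int, target_count: int,
                     tolerance: float = 0.0) -> bool:
    """Return True if the target row count is within `tolerance`
    (as a fraction) of the source row count after a load."""
    if source_count == 0:
        return target_count == 0
    drift = abs(source_count - target_count) / source_count
    return drift <= tolerance


def check_not_null(rows: list, required: list) -> list:
    """Return indices of rows missing any required column value."""
    bad = []
    for i, row in enumerate(rows):
        if any(row.get(col) is None for col in required):
            bad.append(i)
    return bad


if __name__ == "__main__":
    assert check_row_counts(1000, 1000)
    assert check_row_counts(1000, 995, tolerance=0.01)
    assert not check_row_counts(1000, 900, tolerance=0.01)
    rows = [{"id": 1, "name": "a"}, {"id": 2, "name": None}]
    print(check_not_null(rows, ["id", "name"]))  # [1]
```

In a real DAG, checks like these would typically run inside a subclass of Airflow's `BaseOperator` whose `execute` method raises on failure, failing the task and triggering an alert.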

Senior Data Engineer and Analyst

2021 - 2022
Indium Software
  • Built 15+ data pipelines to migrate data from storage systems and databases to the data warehouse, alerting stakeholders to data breaches and outliers.
  • Developed 5+ Tableau dashboards for quality analytics to be used by business managers.
  • Improved query performance, reducing execution and slot time from three hours to four minutes.
Technologies: SQL, Apache Hive, Python, Data Pipelines, HDFS, Presto, Tableau, Jira, Pandas, Google BigQuery, BigQuery, ETL, Data Engineering, Healthcare & Insurance, Data Analysis, Analytical Dashboards, Data Visualization, Data Analytics, Data Warehousing, Apache Airflow, Spark, Hadoop, ETL Tools, Microsoft Excel, Jupyter Notebook, Scripting Languages, Looker, Dashboards, Data Build Tool (dbt), Healthcare, Architecture, Data Manipulation, Data Reporting, Databases, Query Optimization
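The outlier alerting behind these pipelines can be sketched with a simple z-score rule using only the standard library; the threshold and the daily-count metric are assumptions for illustration:

```python
# Sketch of outlier detection of the kind an alerting pipeline might
# run on a monitored metric (threshold and data are assumptions).
from statistics import mean, stdev

def find_outliers(values, z_threshold=3.0):
    """Return indices of values more than z_threshold sample standard
    deviations away from the mean."""
    if len(values) < 2:
        return []
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [i for i, v in enumerate(values)
            if abs(v - mu) / sigma > z_threshold]

if __name__ == "__main__":
    daily_counts = [100, 102, 98, 101, 99, 500]  # last value is anomalous
    print(find_outliers(daily_counts, z_threshold=2.0))  # [5]
```

In production, the flagged indices would feed a notification step (e.g., email or Slack) rather than a print.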

Data Analyst and Engineer

2019 - 2021
Tech Mahindra
  • Developed 6+ analytical metric data platforms for various KPI metrics, reducing manual tracking by 80%. Created the functional documents for each platform.
  • Designed and developed platforms for cost management, SLA tracking, and problem and incident management.
  • Streamlined the data from a multitude of data sources, cleaned the data, and performed transformations.
  • Performed ETL and ELT operations with BigQuery and Python.
  • Created traceability and feasibility metrics. Worked with teams for process improvements.
Technologies: BigQuery, ETL, Google Data Studio, Google Sheets, SQL, Google BigQuery, Data Pipelines, Data Engineering, Data Analysis, Analytical Dashboards, Data Visualization, Google Cloud Platform (GCP), Data Analytics, Reporting, Data Warehousing, ETL Tools, Microsoft Excel, Scripting Languages, Looker, Dashboards, Data Modeling, Architecture, Relational Databases, Data Manipulation, Data Reporting, Databases, Query Optimization
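The streamline-clean-transform step described above can be sketched as a field-mapping pass that normalizes records from differently shaped sources into one warehouse schema; the source names and field maps are assumptions for illustration:

```python
# Sketch of normalizing records from multiple sources into a common
# schema before loading to the warehouse (names are assumptions).
from datetime import date

FIELD_MAP = {
    "source_a": {"ticket": "incident_id", "opened": "opened_on"},
    "source_b": {"id": "incident_id", "open_date": "opened_on"},
}

def normalize(record, source):
    """Rename source-specific fields to the common schema and parse
    the opened date into a `date` object."""
    mapping = FIELD_MAP[source]
    out = {common: record[src] for src, common in mapping.items()}
    out["opened_on"] = date.fromisoformat(out["opened_on"])
    out["source"] = source
    return out

if __name__ == "__main__":
    a = normalize({"ticket": "INC-1", "opened": "2021-03-01"}, "source_a")
    b = normalize({"id": "INC-2", "open_date": "2021-03-02"}, "source_b")
    print(a["incident_id"], b["incident_id"])  # INC-1 INC-2
```

The same pattern scales to an ELT setup: land raw records first, then run the mapping as a SQL or Python transform inside the warehouse.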

Data Analyst and Engineer

2017 - 2019
Accenture
  • Developed 15+ analytical metric data platforms, reducing manual tracking by 30–50% and defining various metrics. Created the functional documents for each platform.
  • Created 15+ workflows to streamline data from various data sources into the data warehouse on defined schedules.
  • Achieved a 20% improvement in data processing and data analysis using various Google-owned tools.
  • Worked with teams for policy enforcement and built models for handling issues proactively.
  • Developed alert reports for stakeholders to realize the potential cost estimations and make decisions.
  • Dealt with huge datasets and a multitude of data sources. Heavily used descriptive statistics for analysis and insights, enabling stakeholders to make data-driven decisions.
Technologies: Google Data Studio, BigQuery, Python, SQL, Google BigQuery, Data Pipelines, ETL, Data Engineering, Healthcare & Insurance, Data Analysis, Analytical Dashboards, Data Visualization, Healthcare IT, Google Cloud Platform (GCP), Data Analytics, Reporting, Data Warehousing, Apache Airflow, ETL Tools, Microsoft Excel, Scripting Languages, Dashboard Design, Looker, Dashboards, Data Modeling, Google Forms, Architecture, Relational Databases, Data Manipulation, Data Reporting, Databases, Query Optimization, RDBMS

Oracle PL/SQL Developer

2015 - 2017
Accenture
  • Designed, developed, and implemented database objects as per client needs with over 50 change requests.
  • Followed best practices and an effective documentation process for design and testing, eliminating 60% of rework. Automated some of the processes, saving four person-days.
  • Tuned and optimized the database (SQL, PL/SQL, and ETL programs) and automated some of the processes, saving 30+ hours a week and improving overall system efficiency and latency.
  • Provided corrective measures to improve the process and implemented tools that improved about 5% of metrics. Transitioned existing work and improvement suggestions to new teams during wrap-up.
Technologies: Oracle, PL/SQL, SQL Performance, Performance Tuning, SQL Loader, Data Pipelines, Microsoft Excel, Oracle PL/SQL, Scripting Languages, Healthcare, Relational Databases, Data Manipulation, Databases, Query Optimization, RDBMS

Data Engineering at Google

As a data analyst and engineer, I developed data platforms that track performance metrics across many data sources spanning different time zones, countries, and tools. Integrating, streamlining, and standardizing the data points had been a pain point for the team.

I worked with many stakeholders, guiding, training, and building tools that helped achieve standardization. I completed the project in four months using various ETL and ELT tools proprietary to Google, BigQuery, and Looker Data Studio. Top leadership used this to make decisions or understand costs, resource management, people targets, productivity, and quality.

Data Pipelines for a Retail Client

While working as a data engineer, I built data pipelines that alert stakeholders and validate and normalize data from a multitude of high-volume data sources. I also built various custom Apache Airflow operators and documented and formatted the data engineering team's code sources.

Data Analysis and Engineering for Uber

Served on a data analytics team supporting the safety and insurance group. I worked as a data engineer and visualizer, building data pipelines and dashboards, and used various open-source and proprietary tools for orchestration and visualization. I worked in various verticals, including Uber Eats and Rides.

Web API and Analysis Dashboard for a Small Broadband Company

The company provides broadband services to 6,000+ customers in a small town. However, operations, sales, and bill collections were not streamlined, which led to a growing number of unidentified defaulters and uncontrolled expenditure.

1. Built a web app with an issue tracker, third-party payment integration, a streamlined data system, invoicing, and GST calculations.
2. Created a dashboard for analysis, employee performance, inventory management, payment cycles, bill collections, due amounts, and new and lost connections.
3. Reduced monthly costs by 30% and recovered around 9 lakh INR from previously unidentified defaulters.
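The GST calculation in the invoicing step can be sketched as follows; Indian GST on intrastate services is split into CGST and SGST, and the 18% rate and half-up rounding here are assumptions for illustration:

```python
# Sketch of the invoicing/GST step in the billing web app. The 18%
# rate (9% CGST + 9% SGST) and rounding rule are assumptions.
from decimal import Decimal, ROUND_HALF_UP

GST_RATE = Decimal("0.18")

def invoice_total(base_amount: str) -> dict:
    """Split GST evenly into CGST and SGST, rounding each half to
    the paisa, and return the invoice breakdown."""
    base = Decimal(base_amount)
    half = (base * GST_RATE / 2).quantize(Decimal("0.01"), ROUND_HALF_UP)
    return {"base": base, "cgst": half, "sgst": half,
            "total": base + 2 * half}

if __name__ == "__main__":
    bill = invoice_total("500.00")
    print(bill["total"])  # 590.00
```

Using `Decimal` rather than floats avoids binary rounding errors, which matters when thousands of invoices must reconcile against collections.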

Languages

Python, SQL, Snowflake, HTML, CSS

Libraries/APIs

Pandas

Tools

Apache Airflow, Tableau, BigQuery, Looker, Google Forms, Cloud Dataflow, Microsoft Excel, Slack, Jira, GitHub, Google Sheets, Git

Paradigms

ETL, Business Intelligence (BI)

Platforms

Google Cloud Platform (GCP), Oracle, macOS, Windows, Linux, Jupyter Notebook

Storage

Google Cloud Storage, Data Pipelines, PL/SQL, Relational Databases, Databases, RDBMS, Oracle PL/SQL, MySQL, Apache Hive, HDFS, SQL Performance, SQL Loader, SQLite

Industry Expertise

Healthcare

Other

Google BigQuery, Google Data Studio, Data Engineering, Data Build Tool (dbt), Data Analysis, Analytical Dashboards, Visualization, Data Visualization, Data Analytics, Reporting, Data Warehousing, ETL Tools, Data Migration, Scripting Languages, Dashboard Design, Dashboards, Data Aggregation, Architecture, Data Manipulation, Data Reporting, Query Optimization, Healthcare & Insurance, Retail, CDC, Performance Tuning, Data Modeling, Google Meet, Electronics, Signal Analysis, Harvest, Healthcare IT, GCS

Frameworks

Presto, Flask, Spark, Hadoop

2010 - 2014

Bachelor's Degree in Electronics and Communications Engineering

Rajiv Gandhi University of Knowledge Technologies Basar - Basar, India
