
Teja Goud Kandula

Verified Expert in Engineering

Data Engineer and Developer

Hyderabad, Telangana, India

Toptal member since December 8, 2022

Bio

Teja is a senior data engineer with seven years of experience building robust and scalable data pipelines for clients like Ladbrokes Coral and Mohegan Sun. He specializes in setting up data pipelines from scratch, migrating pipelines between ETL tools, and moving on-premises data warehouses to the cloud. Teja is a dedicated professional with the drive and skill set to embrace new challenges and provide clients with the best solutions.

Portfolio

Timestone
SQL, Snowflake, Data Build Tool (dbt), Data Visualization, BI Reporting...
Timestone
Big Data, Data Warehousing, Apache Airflow, Dataform, GitHub, Python...
Timestone
SQL, Snowflake, Data Build Tool (dbt), Looker, Amazon S3 (AWS S3)...

Experience

  • SQL - 6 years
  • Data Engineering - 6 years
  • ETL - 6 years
  • Data Warehousing - 5 years
  • Microsoft SQL Server - 3 years
  • Snowflake - 2 years
  • Data Build Tool (dbt) - 2 years
  • Metabase - 1 year

Availability

Part-time

Preferred Environment

Snowflake, Data Build Tool (dbt), Slack, Visual Studio Code (VS Code), Windows, SQL

The most amazing...

...project I've worked on required me to master the basics of Informatica PowerCenter within three days and finish the migration in three months.

Work Experience

Senior Business Analyst

2023 - PRESENT
Timestone
  • Built the data vault layer on top of the raw data.
  • Developed the data models required for reporting using dbt (see the sketch below).
  • Implemented reporting in the Sigma Computing BI tool.
Technologies: SQL, Snowflake, Data Build Tool (dbt), Data Visualization, BI Reporting, Data Vaults, Data Analytics, Dimensional Modeling, Data Analysis, GitHub, Data Cleaning, Data Cleansing, Excel 365
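
For illustration only, a data vault hub model built with dbt might look like the following sketch; the model, source, and column names are hypothetical, not taken from the engagement.

    -- models/vault/hub_customer.sql (hypothetical dbt model)
    -- A data vault hub holds one row per distinct business key, plus a
    -- hash key, a load timestamp, and the record source.
    select distinct
        md5(cast(customer_id as varchar)) as hub_customer_hk,  -- hash key
        customer_id                       as customer_bk,      -- business key
        current_timestamp()               as load_dts,
        'raw.crm'                         as record_source
    from {{ source('raw', 'customers') }}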

Senior Data Engineer

2023 - 2023
Timestone
  • Implemented data orchestration using Apache Airflow.
  • Built the data pipelines using Dataform, a dbt-like transformation tool (see the sketch below).
  • Participated in architecture review calls to stay aligned with the project's evolving guidelines.
Technologies: Big Data, Data Warehousing, Apache Airflow, Dataform, GitHub, Python, Data Analysis, Dimensional Modeling, Data Analytics
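
As a rough illustration of what a Dataform model looks like (not code from the engagement; the file, schema, and table names are hypothetical):

    -- definitions/orders_daily.sqlx (hypothetical Dataform model)
    config {
      type: "table",
      schema: "analytics"
    }

    -- Aggregate raw orders to one row per day. ${ref(...)} declares the
    -- dependency so Dataform can build the DAG, much like dbt's ref().
    select
      date(order_ts) as order_date,
      count(*)       as order_count
    from ${ref("raw_orders")}
    group by 1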

Senior Data Engineer

2023 - 2023
Timestone
  • Optimized existing Snowflake tables larger than one terabyte for better performance.
  • Rebuilt existing dbt data pipelines to make them more efficient.
  • Conducted data ingestion to Snowflake from Amazon S3 using Snowflake stages and pipes.
  • Set up Snowpipe to auto-ingest CSV and Parquet files from Amazon S3 into Snowflake (see the sketch below).
Technologies: SQL, Snowflake, Data Build Tool (dbt), Looker, Amazon S3 (AWS S3), Data Modeling, ETL, Databases, Database Modeling, Database Structure, Database Development, ELT, Data Pipelines, Big Data, Data Transformation, CSV File Processing, Parquet, Warehouses, Amazon Web Services (AWS), Data Analysis, Dimensional Modeling, Data Analytics, GitHub, Data Cleaning, Data Cleansing, Data Classification
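
A minimal sketch of the stage-plus-pipe pattern in Snowflake SQL, assuming hypothetical bucket, table, and integration names rather than the definitions used on the engagement:

    -- All object names are hypothetical.
    create stage raw.events_stage
      url = 's3://example-bucket/events/'
      storage_integration = s3_int          -- assumes an existing integration
      file_format = (type = 'CSV' skip_header = 1);

    -- With auto_ingest = true, the pipe loads new files as S3 event
    -- notifications arrive, instead of waiting for manual COPY commands.
    create pipe raw.events_pipe auto_ingest = true as
      copy into raw.events
      from @raw.events_stage;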

Senior Data Engineer

2022 - 2022
5x
  • Built a consolidated data warehouse to track the company's financials.
  • Created dashboards to track the bills incurred by consulting companies based on developer and infrastructure usage.
  • Developed a dashboard to track the payments received from consulting companies.
Technologies: Snowflake, SQL, Metabase, Data Build Tool (dbt), Dashboards, Data Analysis, Dimensional Modeling, Data Analytics, GitHub

Senior Data Engineer

2022 - 2022
5x
  • Built end-to-end data pipelines from scratch using dbt.
  • Built the business logic (metrics) layer in Snowflake using dbt to serve as the single source of truth for business metrics (see the sketch below).
  • Set up data ingestion from Salesforce, QuickBooks, MySQL RDS, Google Ads, Facebook Ads, and LinkedIn Ads to Snowflake using Fivetran.
  • Built reports from scratch using the open-source BI tool Metabase.
  • Developed a dashboard to track team performance.
  • Set up Snowpipe to auto-ingest CSV files from Amazon S3 into Snowflake.
Technologies: SQL, Snowflake, Fivetran, Data Build Tool (dbt), Metabase, Data Modeling, DataWare, Data Warehouse Design, BI Reporting, Data Analysis, Database Modeling, Database Structure, MySQL, Amazon RDS, Salesforce, Google Ads, Facebook Ads, Database Development, ELT, Data Pipelines, Business Intelligence (BI), Dashboards, Data Transformation, CSV File Processing, Amazon Web Services (AWS), Dimensional Modeling, Data Analytics, GitHub
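
For illustration, a model in such a metrics layer might look like the sketch below; the model and column names are hypothetical:

    -- models/marts/fct_monthly_revenue.sql (hypothetical dbt model)
    -- One row per month. Downstream BI tools read this model instead of
    -- recomputing the metric, which keeps a single source of truth.
    select
        date_trunc('month', invoice_date) as revenue_month,
        sum(amount)                       as total_revenue
    from {{ ref('stg_invoices') }}
    group by 1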

Senior Data Engineer

2021 - 2022
5x
  • Built a data model to examine user analytics from the data generated by Twilio Segment (see the example query below).
  • Set up a business reporting process using Google Data Studio.
  • Tested the Twilio Segment configuration on the website to ensure the data team received all the information needed to answer business questions.
Technologies: Google BigQuery, Google Data Studio, SQL, Metabase, Fivetran, Snowflake, Data Integration, Data Visualization, BI Reporting, Data Analysis, Database Development, Stored Procedure, ELT, Data Pipelines, BigQuery, Segment, Business Intelligence (BI), Dashboards, Warehouses, Dimensional Modeling, Data Analytics, GitHub
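
As a sketch of the kind of user-analytics query this enables: Segment's warehouse schema exposes a tracks table with event, user, and timestamp columns; the project and dataset names below are hypothetical.

    -- Daily active users from Segment event data (BigQuery SQL;
    -- project and dataset names are hypothetical).
    select
      date(timestamp)         as activity_date,
      count(distinct user_id) as daily_active_users
    from `my_project.segment_prod.tracks`
    group by activity_date
    order by activity_date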

ETL Consultant

2021 - 2021
BizAcuity
  • Migrated Informatica PowerCenter mappings and workflows to Informatica Cloud.
  • Developed mappings, task flows, and mapping tasks to manage the data migration from SQL Server and Db2 to Teradata.
  • Validated the objects migrated from PowerCenter to Informatica Cloud.
  • Scheduled Informatica Intelligent Cloud Services (IICS) task flows using the Skybot scheduling tool.
  • Deployed the Informatica objects from the development environment to the production environment.
Technologies: Informatica Cloud, Teradata, IBM Db2, Microsoft SQL Server, Informatica PowerCenter, WinSCP, SQL, Data Engineering, Data Warehousing, Database Development, Stored Procedure, Data Pipelines, Informatica Intelligent Cloud Services (IICS), Warehouses, Python

Production Support Engineer

2021 - 2021
BizAcuity
  • Provided production support for a month after project delivery.
  • Created project handover documents, including the data flow and schema design.
  • Documented how to backload data when task flows fail.
Technologies: Informatica Cloud, Teradata, Batch Scripting, SQL, Monitoring, Production, ETL, Data Engineering, Data Warehousing, Databases, Database Transactions, Database Design, Database Development, ELT, Stored Procedure, Data Pipelines, PL/SQL, Informatica Intelligent Cloud Services (IICS)

ETL Developer

2021 - 2021
BizAcuity
  • Built an audit framework using BTEQ scripts and batch files to log Informatica Cloud session details to the Teradata database.
  • Developed mappings, task flows, and mapping tasks to manage the data migration from Microsoft SQL Server to Teradata.
  • Deployed the Informatica objects from the development environment to the production environment.
  • Documented the audit framework ETL pipeline data flow and deployment steps.
  • Prepared data mapping documents and performed data validation tasks.
  • Implemented incremental data loading into Teradata using Microsoft SQL Server change data capture (CDC) and IICS (see the sketch below).
Technologies: SQL, Informatica Cloud, Batch Scripting, Microsoft SQL Server, Teradata, Windows Server 2016, Data Engineering, Data Warehousing, ETL, Orchestration, Data Integration, Databases, Data Modeling, Database Structure, Database Transactions, Database Design, Database Development, Stored Procedure, ELT, Data Pipelines, PL/SQL, Informatica Intelligent Cloud Services (IICS), CSV File Processing
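
A minimal sketch of the SQL Server side of change data capture, with hypothetical schema and table names; the IICS task flows then consume these change rows:

    -- Enable CDC on the database and on a source table (hypothetical names).
    EXEC sys.sp_cdc_enable_db;
    EXEC sys.sp_cdc_enable_table
        @source_schema = N'dbo',
        @source_name   = N'orders',
        @role_name     = NULL;

    -- Read every change captured for the table in the current LSN window.
    DECLARE @from_lsn binary(10) = sys.fn_cdc_get_min_lsn('dbo_orders');
    DECLARE @to_lsn   binary(10) = sys.fn_cdc_get_max_lsn();
    SELECT *
    FROM cdc.fn_cdc_get_all_changes_dbo_orders(@from_lsn, @to_lsn, N'all');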

Production Support Engineer

2017 - 2020
BizAcuity
  • Managed daily ETL jobs in the production environment, ensuring they completed and the business team received its daily reporting data.
  • Ensured the daily completion of the ETL jobs in the test environment and the release of new code changes to the production environment.
  • Fixed production failures in the ETL flow and ensured the ETL process finished within the SLA.
Technologies: Oracle Data Integrator (ODI), Microsoft SQL Server, SQL, Jira, ETL, Oracle, PostgreSQL, Data Engineering, Data Warehousing, Production, Monitoring, Orchestration, Databases, Database Design, Database Structure, Database Transactions, Database Development, ELT, Stored Procedure, Data Pipelines, PL/SQL, Microsoft Power BI, Excel 365

Projects

Mohegan Sun

Migrated the ETL pipeline from on-premises Informatica PowerCenter to Informatica Cloud with a team of three members. I developed the mappings and task flows required for the migration in under two months, validated the data, and scheduled the Informatica Cloud task flows using the Skybot scheduling tool.

Education

2012 - 2016

Bachelor's Degree in Computer Science

Bapatla Engineering College - Bapatla, Andhra Pradesh, India

Certifications

APRIL 2023 - PRESENT

Hands On Essentials – Data Engineering

Snowflake

JANUARY 2023 - PRESENT

Hands On Essentials – Data Applications

Snowflake

JANUARY 2023 - PRESENT

Hands On Essentials – Data Sharing

Snowflake

JANUARY 2023 - PRESENT

Hands On Essentials – Data Lake

Snowflake

DECEMBER 2022 - PRESENT

Hands On Essentials – Data Warehouse

Snowflake

DECEMBER 2021 - DECEMBER 2023

dbt Fundamentals

dbt Labs

Libraries/APIs

Sigma.js

Tools

BigQuery, GitHub, Slack, Informatica PowerCenter, WinSCP, Jira, Looker, cURL Command Line Tool, SnowSQL, Microsoft Power BI, Apache Airflow

Languages

Snowflake, SQL, Python, Stored Procedure, Python 3

Paradigms

ETL, Database Design, Database Development, Dimensional Modeling, Business Intelligence (BI)

Storage

Microsoft SQL Server, Teradata, Data Integration, Databases, Database Structure, Database Transactions, Data Pipelines, PostgreSQL, PL/SQL, IBM Db2, DataWare, Amazon S3 (AWS S3), Database Modeling, MySQL, MERGE, JSON

Platforms

Oracle, Windows, Windows Server 2016, Oracle Data Integrator (ODI), Visual Studio Code (VS Code), Salesforce, Google Ads, Apache Kafka, Amazon Web Services (AWS)

Frameworks

Streamlit

Other

Data Engineering, Data Warehousing, Monitoring, Production, Data Modeling, Data Analysis, ELT, Data Transformation, Data Analytics, Data Build Tool (dbt), Metabase, Fivetran, Data Visualization, Informatica Intelligent Cloud Services (IICS), Big Data, Informatica Cloud, Batch Scripting, Google BigQuery, Google Data Studio, Data Warehouse Design, Orchestration, BI Reporting, Amazon RDS, Facebook Ads, Segment, Dashboards, CSV File Processing, Parquet, Warehouses, Change Data Capture, Snowpipe, GeoJSON, Views, APIs, Web Services, Rivery, Data Vaults, Dataform, Data Cleaning, Data Cleansing, Excel 365, Data Classification
