
Kireeti Basa

Verified Expert in Engineering

Data Engineer and Developer

Sydney, New South Wales, Australia

Toptal member since December 23, 2021

Bio

Kireeti is a data engineer with over 12 years of experience developing data warehouses and ELT pipelines on cloud services including AWS, GCP, and Azure. He has worked on data integration and data migration projects, delivering complex solutions to high-profile clients. Kireeti is an expert in data modeling, data lake and data vault architectures, and the modernization of existing data models, and he excels at writing complex SQL queries and Python code.

Portfolio

ING Group
Informatica ETL, SQL Server 2000, Tableau, Python, Google Cloud Platform (GCP)...
Qantas
Teradata, PL/SQL, Data Modeling, Control-M, Informatica, SQL, PostgreSQL...
Colruyt Group
Informatica, SAP, Control-M, Oracle, Tableau, SQL Server BI, Data Architecture...

Experience

Availability

Part-time

Preferred Environment

Visual Studio Code (VS Code), PyCharm, Informatica, Google Cloud Platform (GCP), Azure

The most amazing...

...projects I've worked on involved building data warehouses, integrating CRM data, and implementing a data warehouse on GCP using Apache Airflow.

Work Experience

Senior Data Analyst

2019 - 2021
ING Group
  • Integrated a new data source into the existing data model and built end-to-end data pipelines.
  • Created Apache Airflow jobs to load data from a data warehouse into a data lake (see the sketch after this entry).
  • Composed Tableau reports and created a data quality framework.
  • Built Looker reports in a GCP environment, including complex visualizations.
Technologies: Informatica ETL, SQL Server 2000, Tableau, Python, Google Cloud Platform (GCP), Apache Airflow, BigQuery, Git, Pandas, SQL, Azure, Azure SQL, Looker, Agile Sprints, ETL, Bitbucket, ASP.NET, Control-M, PySpark, Apache Spark, Data Engineering, Data Warehousing, SSAS, SSAS Tabular
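
One of those Airflow jobs could be shaped along these lines; a minimal sketch assuming Airflow 2.x, with illustrative table and bucket names rather than the actual ING configuration:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def export_table_to_lake(table: str, bucket: str) -> None:
        # Placeholder for the real extract step, e.g., a warehouse query
        # followed by a Parquet upload to cloud storage.
        print(f"Exporting {table} to gs://{bucket}/{table}/")


    with DAG(
        dag_id="warehouse_to_lake",
        start_date=datetime(2021, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="export_transactions",
            python_callable=export_table_to_lake,
            op_kwargs={"table": "transactions", "bucket": "example-data-lake"},
        )

Giving each table its own task lets a single failure be retried without re-running the whole load.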

Senior ETL Developer and Data Analyst

2018 - 2019
Qantas
  • Built the data migration plan and successfully migrated the data from the mainframe system to a new system (see the sketch after this entry).
  • Managed the implementation of the enterprise BI architecture and provided technical guidance to design the BI solution's roadmap.
  • Improved the performance and quality of the system after the implementation.
Technologies: Teradata, PL/SQL, Data Modeling, Control-M, Informatica, SQL, PostgreSQL, PySpark, SQL Server Integration Services (SSIS), MSBI, Microsoft Power BI, MySQL, NoSQL, ETL, ELT, Bitbucket, Agile Sprints, Looker
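
A migration like this is typically validated by reconciling source and target; below is a minimal PySpark sketch of that check, with hypothetical paths, connection details, and keys (the actual stack here was Teradata and Informatica):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("migration-reconciliation").getOrCreate()

    # Mainframe extract staged as Parquet vs. the migrated target table
    # (the JDBC read requires the driver jar on the classpath).
    source = spark.read.parquet("/staging/mainframe_extract/bookings")
    target = (
        spark.read.format("jdbc")
        .option("url", "jdbc:postgresql://target-host:5432/warehouse")
        .option("dbtable", "bookings")
        .load()
    )

    # Row counts should match, and no source key should be missing downstream.
    print(f"source={source.count()} target={target.count()}")
    missing = source.join(target, on="booking_id", how="left_anti")
    print(f"rows missing from target: {missing.count()}")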

Lead Consultant and Team Lead

2017 - 2018
Colruyt Group
  • Built a data integration plan and data pipelines using Informatica.
  • Implemented an error-handling framework for the Colruyt project (see the sketch after this entry).
  • Collaborated in designing and building a data warehouse solution to handle large data volume and addressed complex business data requirements using Informatica and Oracle.
Technologies: Informatica, SAP, Control-M, Oracle, Tableau, SQL Server BI, Data Architecture, Data Integration, SQL, SQL Server Integration Services (SSIS), ETL, Agile Sprints, Bitbucket, Apache Airflow, SSAS
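
The error-handling framework itself lived in Informatica; the Python decorator below only illustrates the retry-and-log pattern such a framework encodes, with hypothetical names throughout:

    import functools
    import logging
    import time

    logging.basicConfig(level=logging.INFO)
    logger = logging.getLogger("pipeline")


    def with_error_handling(retries: int = 3, delay_seconds: float = 30.0):
        """Retry a pipeline step, logging each failure before re-raising."""
        def decorator(func):
            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                for attempt in range(1, retries + 1):
                    try:
                        return func(*args, **kwargs)
                    except Exception:
                        logger.exception("step %s failed (attempt %d/%d)",
                                         func.__name__, attempt, retries)
                        if attempt == retries:
                            raise
                        time.sleep(delay_seconds)
            return wrapper
        return decorator


    @with_error_handling(retries=2, delay_seconds=5)
    def load_sales_feed():
        ...  # one feed's extract-and-load logic would go here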

Informatica Developer and Tech Business Analyst

2015 - 2017
GlaxoSmithKline
  • Built the data model and designed the framework that helped other teams.
  • Implemented an error-handling mechanism.
  • Designed and built Informatica solutions and pushdown optimization (PDO) where required.
  • Tuned the performance of the data warehouse operations using Teradata.
  • Developed mappings and reusable transformations in Informatica to facilitate the timely loading of star schema data.
Technologies: Informatica, Teradata, Veeva, Salesforce, Qlik Sense, SQL, SSAS

Teradata Developer

2013 - 2015
JPMorgan Chase
  • Implemented ETL processes for the extraction of data from Oracle systems.
  • Prepared and maintained Teradata Parallel Transporter (TPT) scripts (see the sketch after this entry).
  • Participated in gathering and analysis of data requirements.
  • Formulated processes for maintenance and tuning of the application performance.
  • Supported data migration tasks for Teradata and DB2 systems.
  • Evaluated and documented technical, business, and design requirements.
Technologies: Teradata, Control-M, Oracle, Stored Procedure, MSSQLCE, Python, SQL
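
TPT jobs are usually launched with Teradata's tbuild utility; here is a sketch of wrapping such a launch in Python, with a hypothetical job script name and job variable:

    import subprocess

    # Run the TPT job script and pass a job variable on the command line.
    result = subprocess.run(
        ["tbuild", "-f", "load_accounts.tpt", "-u", "LoadDate='2015-01-31'"],
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        raise RuntimeError(f"TPT load failed:\n{result.stderr}")
    print(result.stdout)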

ETL Developer

2011 - 2013
AXA XL
  • Developed the designs and solutions to complex business scenarios using SQL and Informatica.
  • Implemented slowly changing dimension (SCD) logic and created mapping transformations such as expressions, lookups, and filters (see the sketch after this entry).
  • Gathered requirements for change requests, other project modules, and sub-modules.
  • Designed and developed several Informatica components such as mappings, sessions, and tasks.
Technologies: Informatica, Teradata, Oracle, PL/SQL, Microsoft SQL Server, SAP, SQL
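
The SCD logic was implemented in Informatica transformations; the pandas sketch below assumes Type 2 for illustration, expiring the current row and appending a new version when an attribute changes:

    from datetime import date

    import pandas as pd

    dim = pd.DataFrame({
        "customer_id": [1],
        "city": ["Paris"],
        "valid_from": [date(2011, 1, 1)],
        "valid_to": [date(9999, 12, 31)],
        "is_current": [True],
    })
    incoming = pd.DataFrame({"customer_id": [1], "city": ["Lyon"]})

    today = date.today()
    merged = dim.merge(incoming, on="customer_id", suffixes=("", "_new"))
    changed = merged[merged["city"] != merged["city_new"]]

    # Expire the current version of each changed customer...
    expire = dim["customer_id"].isin(changed["customer_id"]) & dim["is_current"]
    dim.loc[expire, ["valid_to", "is_current"]] = [today, False]

    # ...and append the new current version.
    new_rows = changed[["customer_id", "city_new"]].rename(columns={"city_new": "city"})
    new_rows = new_rows.assign(valid_from=today, valid_to=date(9999, 12, 31),
                               is_current=True)
    dim = pd.concat([dim, new_rows], ignore_index=True)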

Projects

Data Integration Collibris Project for Retail

The project aimed to integrate financial data from newly acquired sources into the existing data warehouse. It was a data integration project with multiple source systems.

I was responsible for implementing the enterprise BI architecture and providing technical guidance to design the solution roadmap. This included migrating data to the Workday system using Informatica and exploring, understanding, and evaluating project requirements to build enterprise data warehouse systems following Agile and Scrum methodologies.

I also designed and built the data warehouse solution to handle large data volumes and address complex business data requirements using Informatica and Teradata, and I documented the solution.

I analyzed the source data and gathered requirements from the business users. I then prepared the technical specifications to develop source system mappings that load data into various tables in line with the business rules, making extensive use of XML, web services, and message queues. Next, I designed and created Informatica solutions using Informatica's advanced tools. I also implemented the visualizations in Looker.
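
The XML and message-queue handling ran through Informatica's connectors; this Python sketch only illustrates the shape of such a feed, parsing an XML extract and publishing records to a queue. The file, queue, and element names are hypothetical, and pika/RabbitMQ stands in for the actual MQ product:

    import xml.etree.ElementTree as ET

    import pika  # RabbitMQ client, standing in for the actual MQ product

    tree = ET.parse("invoices.xml")
    connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="invoice_events")

    # Publish each invoice element as one message for downstream consumers.
    for invoice in tree.getroot().iter("invoice"):
        channel.basic_publish(
            exchange="",
            routing_key="invoice_events",
            body=ET.tostring(invoice),
        )
    connection.close()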

Data Warehouse for a Bank

In this project I:
• Created a data warehouse in Microsoft SQL Server.
• Built end-to-end data pipelines using the Informatica ETL tool.
• Implemented a snowflake schema and data marts to create Tableau reports (see the sketch after this list).
• Provided the data reporting functions to the business and technical community using SQL to enable the team to gain insights into various parts of the company.
• Worked extensively on KYC data remediation, was involved in data modeling activities, and suggested design changes.
• Cooperated with the solution architect and senior testers, exploring, understanding, and evaluating project requirements to build enterprise data warehouse systems following Agile and Scrum methodologies.
• Provided design and expertise in developing the company's visualization methodology and the enterprise BI architecture.
• Addressed complex business data requirements using Informatica and MSSQL. Analyzed the source data and gathered requirements from the business users.
• Prepared technical specifications to develop source system mappings to load data into various tables adhering to the business rules.
• Collaborated in preparing the plan and effort estimations required to execute the project.
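
A snowflake schema keeps dimension branches normalized, so reporting views re-join them for the BI layer. Here is a sketch of such a view, with illustrative table and column names rather than the bank's actual model:

    # A reporting view over a snowflake schema; it can be executed against
    # SQL Server with any DB-API driver, e.g. cursor.execute(REPORTING_VIEW)
    # via pyodbc.
    REPORTING_VIEW = """
    CREATE VIEW dm.v_sales_by_region AS
    SELECT r.region_name,
           p.product_name,
           SUM(f.amount) AS total_amount
    FROM fact_sales AS f
    JOIN dim_store   AS s ON f.store_key   = s.store_key
    JOIN dim_region  AS r ON s.region_key  = r.region_key  -- normalized branch
    JOIN dim_product AS p ON f.product_key = p.product_key
    GROUP BY r.region_name, p.product_name;
    """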

Data Analytics for GSK | HealthCare

The project aimed to integrate Salesforce CRM data into the existing data warehouse built on Teradata, with end reports generated in Qlik Sense for each sales representative's KPIs.

By understanding the existing source system built on Salesforce, I created facts, dimensions, and a star schema representation for the data mart. I worked extensively on the mainframe developing code, analyzed the source data (Veeva), gathered business users' requirements, and prepared technical specifications to build source system mappings that load data into various tables in line with the business rules.

I was involved in all phases of the SDLC, from the requirement definition, system design, development, testing, and training, to the rollout and warranty support for the production environment.

I also collaborated in preparing the plan and effort estimations required to execute the project and played a key role in helping manage the team and work allocations.

Finally, I was in charge of designing and building Informatica solutions and PDO where required, tuning the performance of the data warehouse operations using Teradata, and developing mappings and reusable transformations in Informatica to facilitate the timely loading of star schema data (a simplified sketch follows).
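
A minimal sketch of that star schema derivation in pandas (the real pipeline ran in Informatica against Teradata; names are illustrative): the dimension carries surrogate keys, and the fact references those keys rather than the natural keys.

    import pandas as pd

    calls = pd.DataFrame({
        "rep_name": ["Asha", "Ben", "Asha"],
        "product": ["X", "X", "Y"],
        "calls_made": [4, 2, 5],
    })

    # Dimension: distinct reps with generated surrogate keys.
    dim_rep = calls[["rep_name"]].drop_duplicates().reset_index(drop=True)
    dim_rep["rep_key"] = dim_rep.index + 1

    # Fact: measures keyed by the surrogate key, not the natural key.
    fact_calls = calls.merge(dim_rep, on="rep_name")[
        ["rep_key", "product", "calls_made"]
    ]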

Looker Developer and Data Migration Developer

The client is one of the largest skincare and makeup eCommerce businesses in the world. It owns a number of brands across the globe.

We do internal reporting on these brands, sales, and conversion through our custom web interface. We are now looking to move our custom reporting solution to the Looker BI solution. The project aims to build a dashboard, reports, and views in Looker from various data sources in BigQuery (SQL) to sunset our current custom reporting interface.
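
One Looker-bound view in BigQuery might be published along these lines; a sketch using the google-cloud-bigquery client, with placeholder project, dataset, and column names:

    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")
    client.query(
        """
        CREATE OR REPLACE VIEW reporting.daily_brand_sales AS
        SELECT brand,
               DATE(order_ts) AS order_date,
               SUM(amount) AS revenue,
               COUNTIF(converted) AS conversions
        FROM sales.orders
        GROUP BY brand, order_date
        """
    ).result()  # block until the DDL job completes

Looker then models the view in LookML, keeping the SQL in one governed place.
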
Education

2006 - 2010

Bachelor's Degree in Computer Science

Jawaharlal Nehru Technological University - Visakhapatnam, India

Certifications

JANUARY 2021 - PRESENT

GCP Data Engineer

Google Cloud

JANUARY 2015 - PRESENT

Informatica Developer

Informatica

Skills

Libraries/APIs

Pandas, PySpark

Tools

Informatica ETL, Apache Airflow, Excel 2016, Bitbucket, BigQuery, SSAS, Microsoft Power BI, Looker, PyCharm, Tableau, Git, TFS, Control-M, Qlik Sense, SQL Server BI

Languages

Python, Pine Script, SQL, R, Stored Procedure

Paradigms

ETL, REST, Business Intelligence (BI)

Platforms

Google Cloud Platform (GCP), Amazon Web Services (AWS), Oracle, Salesforce, Azure, Visual Studio Code (VS Code)

Storage

SQL Server 2000, Teradata, Azure SQL, NoSQL, MySQL, PostgreSQL, PL/SQL, Data Integration, MSSQLCE, Microsoft SQL Server, SQL Server Integration Services (SSIS), Google Cloud SQL, Data Pipelines, SSAS Tabular

Frameworks

ASP.NET, Hadoop, Apache Spark

Other

Agile Sprints, Data Modeling, Data Architecture, Data Warehousing, Data Engineering, MSBI, Informatica, SAP, SOAP, Veeva, ELT
