Kireeti Basa

Data Engineer and Developer in Sydney, New South Wales, Australia

Member since December 20, 2021
Kireeti is a data engineer with over 12 years of experience developing data warehouses and ELT pipelines on cloud services including AWS, GCP, and Azure. He has worked on data integration and data migration projects, delivering complex solutions to high-profile clients. Kireeti is an expert in data modeling, data lake and data vault architectures, and modernizing existing data models, and he excels at writing complex SQL queries and Python code.

Portfolio

  • ING Group
    Informatica ETL, SQL Server 2000, Tableau, Python...
  • Qantas
    Teradata, PL/SQL, Data Modeling, Control-M, Informatica, SQL, PostgreSQL...
  • Colruyt Group
    Informatica, SAP, Control-M, Oracle, Tableau, SQL Server BI...

Experience

Location

Sydney, New South Wales, Australia

Availability

Part-time

Preferred Environment

VS Code, PyCharm, Informatica, Google Cloud Platform (GCP), Azure

The most amazing...

...projects I've worked on involved building data warehouses, integrating CRM data, and implementing a data warehouse on GCP using Apache Airflow.
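An Airflow-based warehouse build like the one mentioned above comes down to running tasks in dependency order. As an illustration only (the task names are hypothetical, and Python's stdlib `graphlib` stands in for Airflow's scheduler), a minimal sketch:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies,
# mirroring how an Airflow DAG orders warehouse-load steps.
tasks = {
    "extract_crm": set(),
    "extract_sales": set(),
    "stage_to_gcs": {"extract_crm", "extract_sales"},
    "load_bigquery": {"stage_to_gcs"},
    "build_reports": {"load_bigquery"},
}

def run_pipeline(tasks):
    """Execute tasks in dependency order and return the order used."""
    order = list(TopologicalSorter(tasks).static_order())
    for task in order:
        # A real DAG would invoke an operator here; we just record the step.
        print(f"running {task}")
    return order

run_pipeline(tasks)
```

Extracts run first, staging waits for both, and reporting runs last, which is the scheduling guarantee an orchestrator provides.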

Employment

  • Senior Data Engineer

    2019 - 2021
    ING Group
    • Integrated a new data source into the existing data model and built end-to-end data pipelines.
    • Created Apache Airflow jobs to load data from a data warehouse into a data lake.
    • Composed Tableau reports and created a data quality framework.
    • Built Looker reports in a GCP environment, producing complex visualizations.
    Technologies: Informatica ETL, SQL Server 2000, Tableau, Python, Google Cloud Platform (GCP), Apache Airflow, BigQuery, Git, Pandas, SQL, Azure, Azure SQL, Looker, Agile Sprints, ETL, Bitbucket, ASP.NET, Control-M, PySpark, Apache Spark, Data Engineering
  • Senior ETL Developer and Data Analyst

    2018 - 2019
    Qantas
    • Built the data migration plan and successfully migrated the data from the mainframe system to a new system.
    • Managed the implementation of the enterprise BI architecture and provided technical guidance to design the BI solution's roadmap.
    • Improved the performance and quality of the system after the implementation.
    Technologies: Teradata, PL/SQL, Data Modeling, Control-M, Informatica, SQL, PostgreSQL, PySpark, SQL Server Integration Services (SSIS), MSBI, Microsoft Power BI, MySQL, NoSQL, ETL, ELT, Bitbucket, Agile Sprints, Looker
  • Lead Consultant and Team Lead

    2017 - 2018
    Colruyt Group
    • Built a data integration plan and data pipelines using Informatica.
    • Implemented an error handling framework for the Colruyt project.
    • Collaborated in designing and building a data warehouse solution to handle large data volume and addressed complex business data requirements using Informatica and Oracle.
    Technologies: Informatica, SAP, Control-M, Oracle, Tableau, SQL Server BI, Data Architecture, Data Integration, SQL, SSIS, ETL, Agile Sprints, Bitbucket, Apache Airflow
  • Informatica Developer and Tech Business Analyst

    2015 - 2017
    GlaxoSmithKline
    • Built the data model and designed the framework, which helped other teams.
    • Implemented an error handling mechanism as part of my role.
    • Designed and built Informatica solutions and pushdown optimization (PDO) where required.
    • Tuned the performance of the data warehouse operations using Teradata.
    • Developed mappings and reusable transformations in Informatica to facilitate the timely loading of star schema data.
    Technologies: Informatica, Teradata, Veeva, Salesforce, Qlik Sense, SQL
  • Teradata Developer

    2013 - 2015
    JPMorgan Chase
    • Implemented ETL processes for the extraction of data from Oracle systems.
    • Prepared and maintained TPT scripts in Teradata as part of my role.
    • Participated in gathering and analysis of data requirements.
    • Formulated processes for maintenance and tuning of the application performance.
    • Supported data migration tasks for Teradata and DB2 systems.
    • Evaluated and documented technical, business, and design requirements.
    Technologies: Teradata, Control-M, Oracle, Stored Procedure, MSSQLCE, Python, SQL
  • ETL Developer

    2011 - 2013
    AXA XL
    • Developed the designs and solutions to complex business scenarios using SQL and Informatica.
    • Implemented slowly changing dimension (SCD) logic and created mapping transformations such as expressions, lookups, and filters.
    • Gathered requirements for change requests, other project modules, and sub-modules.
    • Designed and developed several Informatica components such as mappings, sessions, and tasks.
    Technologies: Informatica, Teradata, Oracle, PL/SQL, Microsoft SQL Server, SAP, SQL
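Several of the roles above involve slowly changing dimension handling, such as the mapping work at AXA XL. As a sketch only — the table, the column names, and the choice of Type 2 (close the old row, open a new current row) are assumptions for illustration, not details from this profile — the core pattern in Python with SQLite:

```python
import sqlite3

# Hypothetical customer dimension with validity dates; valid_to = NULL
# marks the current row. Names are illustrative, not from a real project.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        valid_from  TEXT,
        valid_to    TEXT,
        is_current  INTEGER
    )
""")
conn.execute(
    "INSERT INTO dim_customer VALUES (1, 'Sydney', '2020-01-01', NULL, 1)"
)

def apply_scd2(conn, customer_id, new_city, change_date):
    """If the tracked attribute changed, expire the current row and insert a new one."""
    row = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,),
    ).fetchone()
    if row and row[0] != new_city:
        # Close out the previously current row...
        conn.execute(
            "UPDATE dim_customer SET valid_to=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1",
            (change_date, customer_id),
        )
        # ...and open a new current row carrying the changed attribute.
        conn.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
            (customer_id, new_city, change_date),
        )
    conn.commit()

apply_scd2(conn, 1, "Melbourne", "2021-06-01")
```

After the update the dimension holds both rows, so history is preserved while joins against `is_current = 1` always see the latest attributes.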

Experience

  • Data Integration Collibris Project for Retail

    The project aimed to integrate financial data from a newly acquired source into the existing data warehouse. It was a data integration project with multiple sources.

    I was responsible for implementing the enterprise BI architecture and providing technical guidance to design the solution roadmap. This included migrating data to the Workday system using Informatica and exploring, understanding, and evaluating project requirements to build enterprise data warehouse systems following Agile and Scrum methodologies.

    I also designed and built the data warehouse solution to handle large data volumes and address complex business data requirements using Informatica and Teradata, and documented the solution.

    I analyzed the source data and gathered requirements from the business users. I then prepared the technical specifications to develop source system mappings to load data into various tables adhering to the business rules and extensively used XML, web services, and message queues. Next, I designed and created Informatica solutions and Informatica advanced tools. I also implemented the visualizations in Looker.

  • Data Warehouse for a Bank

    In this project I:
    • Created a data warehouse in Microsoft SQL Server.
    • Built end-to-end data pipelines using the Informatica ETL tool.
    • Implemented a snowflake schema and data marts to create Tableau reports.
    • Provided the data reporting functions to the business and technical community using SQL to enable the team to gain insights into various parts of the company.
    • Worked extensively on KYC data remediation, was involved in data modeling activities, and suggested design changes.
    • Cooperated with the solution architect and senior testers, exploring, understanding, and evaluating project requirements to build enterprise data warehouse systems following Agile and Scrum methodologies.
    • Provided design and expertise in developing the company's visualization methodology and the enterprise BI architecture.
    • Addressed complex business data requirements using Informatica and MSSQL, analyzed the source data, and gathered requirements from the business users.
    • Prepared technical specifications to develop source system mappings to load data into various tables adhering to the business rules.
    • Collaborated in preparing the plan and effort estimations required to execute the project.

  • Data Analytics for GSK | HealthCare

    The project aimed to integrate the Salesforce CRM data into the existing data warehouse built on Teradata, with end reports generated in Qlik Sense for each sales representative's KPIs.

    By understanding the existing source system built on Salesforce, I created facts, dimensions, and a star schema representation for the data mart. I worked extensively on the mainframe developing code, analyzed source data (Veeva), gathered business users' requirements, and prepared technical specifications to build source system mappings to load data into various tables adhering to the business rules.

    I was involved in all phases of the SDLC, from the requirement definition, system design, development, testing, and training, to the rollout and warranty support for the production environment.

    I also collaborated in preparing the plan and effort estimations required to execute the project and played a key role in helping manage the team and work allocations.

    Finally, I was in charge of designing and building Informatica solutions and PDO where required, tuning the performance of the data warehouse operations using Teradata, and developing mappings and reusable transformations in Informatica to facilitate the timely loading of star schema data.
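The fact-and-dimension modeling described in these projects can be sketched with a toy star schema. Everything below is invented for illustration — the table names, representatives, and figures are not from the actual engagement, and SQLite stands in for Teradata:

```python
import sqlite3

# Toy star schema: one fact table keyed to one dimension, plus the kind of
# per-representative KPI rollup a Qlik Sense report would surface.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_rep (rep_key INTEGER PRIMARY KEY, rep_name TEXT);
    CREATE TABLE fact_sales (
        rep_key INTEGER REFERENCES dim_rep(rep_key),
        amount  REAL
    );
    INSERT INTO dim_rep VALUES (1, 'Alice'), (2, 'Bob');
    INSERT INTO fact_sales VALUES (1, 100.0), (1, 250.0), (2, 80.0);
""")

# KPI query: total sales per representative, joining the fact to its dimension.
kpis = conn.execute("""
    SELECT d.rep_name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_rep d ON d.rep_key = f.rep_key
    GROUP BY d.rep_name
    ORDER BY d.rep_name
""").fetchall()
print(kpis)  # [('Alice', 350.0), ('Bob', 80.0)]
```

Keeping measures in the fact table and descriptive attributes in dimensions is what lets reporting tools aggregate along any dimension attribute with a single join.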

Skills

  • Languages

    Python, Pine Script, SQL, R, Stored Procedure
  • Libraries/APIs

    Pandas, PySpark
  • Tools

    Informatica ETL, Apache Airflow, Excel 2016, Bitbucket, BigQuery, Microsoft Power BI, Looker, VS Code, PyCharm, Tableau, Git, TFS, Control-M, Qlik Sense, SQL Server BI
  • Paradigms

    ETL, REST
  • Platforms

    Google Cloud Platform (GCP), Oracle, Salesforce, Azure
  • Storage

    SQL Server 2000, Teradata, Azure SQL, NoSQL, MySQL, PostgreSQL, BigQuery, PL/SQL, Data Integration, MSSQLCE, Microsoft SQL Server, SQL Server Integration Services (SSIS), Google Cloud SQL
  • Other

    Agile Sprints, Data Modeling, Data Architecture, SSIS, Data Warehousing, Data Engineering, MSBI, AWS, Informatica, SAP, SOAP, Veeva, ELT
  • Frameworks

    ASP.NET, Hadoop, Apache Spark

Education

  • Bachelor's Degree in Computer Science
    2006 - 2010
    Jawaharlal Nehru Technological University - Visakhapatnam, India

Certifications

  • GCP Data Engineer
    JANUARY 2021 - PRESENT
    Google Cloud
  • Informatica Developer
    JANUARY 2015 - PRESENT
    Informatica
