Verified Expert in Engineering
Data Engineer and Developer
Kireeti is a data engineer with over 12 years of experience developing data warehouses and ELT pipelines using different cloud services, including AWS, GCP, and Azure. He's worked on data integration and data migration projects, delivering complex solutions to high-profile clients. Kireeti is an expert in data modeling, data lake and data vault architectures, and the modernization of existing data models, and he excels at writing complex SQL queries and Python code.
Visual Studio Code (VS Code), PyCharm, Informatica, Google Cloud Platform (GCP), Azure
The most amazing...
...projects I've worked on involved building data warehouses, integrating CRM data, and implementing a data warehouse in GCP using Apache Airflow.
Senior Data Analyst
- Integrated a new data source into the existing data model and built end-to-end data pipelines.
- Created Apache Airflow jobs to load data from a data warehouse into a data lake.
- Composed Tableau reports and created a data quality framework.
- Built Looker reports in a GCP environment and generated complex visualizations.
Senior ETL Developer and Data Analyst
- Built the data migration plan and successfully migrated the data from the mainframe system to a new system.
- Managed the implementation of the enterprise BI architecture and provided technical guidance to design the BI solution's roadmap.
- Improved the performance and quality of the system after the implementation.
Lead Consultant and Team Lead
- Built a data integration plan and data pipelines using Informatica.
- Implemented an error handling framework for the Colruyt project.
- Collaborated in designing and building a data warehouse solution to handle large data volume and addressed complex business data requirements using Informatica and Oracle.
Informatica Developer and Tech Business Analyst
- Built the data model and designed the framework that helped other teams.
- Implemented an error-handling mechanism as part of my role.
- Designed and built Informatica solutions and pushdown optimization (PDO) where required.
- Tuned the performance of the data warehouse operations using Teradata.
- Developed mappings and reusable transformations in Informatica to facilitate the timely loading of star schema data.
- Implemented ETL processes for the extraction of data from Oracle systems.
- Prepared and maintained TPT scripts in Teradata as part of my role.
- Participated in gathering and analysis of data requirements.
- Formulated processes for maintenance and tuning of the application performance.
- Supported data migration tasks for Teradata and DB2 systems.
- Evaluated and documented technical, business, and design requirements.
- Developed the designs and solutions to complex business scenarios using SQL and Informatica.
- Implemented slowly changing dimension (SCD) logic and created mapping transformations such as expressions, lookups, and filters.
- Gathered requirements for change requests, other project modules, and sub-modules.
- Designed and developed several Informatica components such as mappings, sessions, and tasks.
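The SCD handling mentioned above can be illustrated with a Type 2 pattern (an assumption for illustration; the work itself was done in Informatica mappings, and the column names below are hypothetical). In Type 2, a changed attribute expires the current dimension row and inserts a new version rather than overwriting history.

```python
from datetime import date

def scd2_upsert(dim_rows, incoming, key, tracked, today=None):
    """Type 2 slowly changing dimension: expire the current row
    when a tracked attribute changes, then insert a new version."""
    today = today or date.today().isoformat()
    current = {r[key]: r for r in dim_rows if r["is_current"]}
    out = list(dim_rows)
    for rec in incoming:
        cur = current.get(rec[key])
        if cur and all(cur[c] == rec[c] for c in tracked):
            continue  # no change: keep the current version as-is
        if cur:  # change detected: close out the old version
            cur["is_current"] = False
            cur["end_date"] = today
        out.append({**rec, "start_date": today,
                    "end_date": None, "is_current": True})
    return out

# Usage: a customer moves city, producing a second version of the row.
dim = [{"cust_id": 1, "city": "Austin",
        "start_date": "2023-01-01", "end_date": None, "is_current": True}]
dim = scd2_upsert(dim, [{"cust_id": 1, "city": "Dallas"}],
                  key="cust_id", tracked=["city"], today="2024-06-01")
print(len(dim))  # 2 rows: expired Austin version + current Dallas version
```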
Data Integration Collibris Project for Retail
I was responsible for implementing the enterprise BI architecture and providing technical guidance to design the solution roadmap. This included migrating data to the Workday system using Informatica and exploring, understanding, and evaluating project requirements to build enterprise data warehouse systems following Agile and Scrum methodologies.
I also designed and built the data warehouse solution to handle large data volumes and address complex business data requirements using Informatica and Teradata, and documented the solution.
I analyzed the source data and gathered requirements from the business users. I then prepared the technical specifications to develop source system mappings to load data into various tables adhering to the business rules, making extensive use of XML, web services, and message queues. Next, I designed and created solutions using Informatica and its advanced tools. I also implemented the visualizations in Looker.
Data Warehouse for a Bank
• Created a data warehouse in Microsoft SQL Server.
• Built end-to-end data pipelines using the Informatica ETL tool.
• Implemented a snowflake schema and data marts to create Tableau reports.
• Provided the data reporting functions to the business and technical community using SQL to enable the team to gain insights into various parts of the company.
• Worked extensively on KYC data remediation, was involved in data modeling activities, and suggested design changes.
• Cooperated with the solution architect and senior testers, exploring, understanding, and evaluating project requirements to build enterprise data warehouse systems following Agile and Scrum methodologies.
• Provided design and expertise in developing the company's visualization methodology and the enterprise BI architecture.
• Addressed complex business data requirements using Informatica and MSSQL. Analyzed the source data and gathered requirements from the business users.
• Prepared technical specifications to develop source system mappings to load data into various tables adhering to the business rules.
• Collaborated in preparing the plan and effort estimations required to execute the project.
Data Analytics for GSK | HealthCare
By understanding the existing source system built on Salesforce, I created facts, dimensions, and a star schema representation for the data mart. I worked extensively on the mainframe developing code, analyzed the source data (Veeva), gathered business users' requirements, and prepared technical specifications to build source system mappings to load data into various tables adhering to the business rules.
I was involved in all phases of the SDLC, from requirements definition, system design, development, testing, and training through to rollout and warranty support for the production environment.
I also collaborated in preparing the plan and effort estimations required to execute the project and played a key role in helping manage the team and work allocations.
Finally, I was in charge of designing and building Informatica solutions and PDO where required, tuning the performance of the data warehouse operations using Teradata, and developing mappings and reusable transformations in Informatica to facilitate the timely loading of star schema data.
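The fact/dimension split behind the star schema work above can be sketched in plain Python. This is a minimal illustration under assumed inputs: the source fields (`customer_name`, `sale_date`, `amount`) are hypothetical stand-ins for the Salesforce/Veeva data, and the real loads ran through Informatica mappings.

```python
def build_star(source_rows):
    """Split flat source records into a dimension table (unique
    customers with surrogate keys) and a fact table referencing it."""
    dim_customer, fact_sales = {}, []
    for row in source_rows:
        nk = row["customer_name"]  # natural key from the source system
        if nk not in dim_customer:
            # Assign a surrogate key the first time the customer appears.
            dim_customer[nk] = {"customer_sk": len(dim_customer) + 1,
                                "customer_name": nk}
        fact_sales.append({"customer_sk": dim_customer[nk]["customer_sk"],
                           "sale_date": row["sale_date"],
                           "amount": row["amount"]})
    return list(dim_customer.values()), fact_sales

dims, facts = build_star([
    {"customer_name": "Acme", "sale_date": "2024-01-05", "amount": 100.0},
    {"customer_name": "Acme", "sale_date": "2024-01-09", "amount": 40.0},
    {"customer_name": "Globex", "sale_date": "2024-01-09", "amount": 75.0},
])
print(len(dims), len(facts))  # 2 dimension rows, 3 fact rows
```

Facts reference dimensions only through surrogate keys, which is what keeps the star schema's joins narrow and fast.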
Looker Developer and Data Migration Developer
We do internal reporting on brands, sales, and conversions through our custom web interface and are now moving that custom reporting solution to the Looker BI platform. The project aims to build dashboards, reports, and views in Looker from various data sources in BigQuery (SQL) so we can sunset the current custom reporting interface.
Python, Pine Script, SQL, R, Stored Procedure
Informatica ETL, Apache Airflow, Excel 2016, Bitbucket, BigQuery, SSAS, Microsoft Power BI, Looker, PyCharm, Tableau, Git, TFS, Control-M, Qlik Sense, SQL Server BI
ETL, REST, Business Intelligence (BI)
Google Cloud Platform (GCP), Amazon Web Services (AWS), Oracle, Salesforce, Azure, Visual Studio Code (VS Code)
SQL Server 2000, Teradata, Azure SQL, NoSQL, MySQL, PostgreSQL, BigQuery, PL/SQL, Data Integration, MSSQLCE, Microsoft SQL Server, SQL Server Integration Services (SSIS), Google Cloud SQL, Data Pipelines, SSAS Tabular
Agile Sprints, Data Modeling, Data Architecture, Data Warehousing, Data Engineering, MSBI, Informatica, SAP, SOAP, Veeva, ELT
ASP.NET, Hadoop, Apache Spark
Bachelor's Degree in Computer Science
Jawaharlal Nehru Technological University - Visakhapatnam, India
GCP Data Engineer