Satyanarayana Annepogu
Verified Expert in Engineering
Database Developer
Satya is a senior data engineer with over 15 years of IT experience designing and developing data warehouses for banking and insurance clients. He specializes in designing and building modern data pipelines and streams using the AWS and Azure data engineering stacks, and he is an expert at modernizing enterprise data solutions with AWS and Azure cloud data technologies.
Preferred Environment
Data Engineering, Amazon Web Services (AWS), Azure, Databricks, Python, PySpark, Hadoop, Snowflake, Data Warehousing, ETL Tools
The most amazing...
...project I've done is designing, developing, and supporting cloud-based and traditional data warehouse applications.
Work Experience
Data Engineer
Millicom
- Led the implementation of AWS Glue for automated ETL processes, reducing data processing time and improving data accuracy for telecom network performance data, customer interactions, and billing information.
- Utilized AWS Lambda functions to develop serverless data pipelines, facilitating seamless integration between CRM systems, network infrastructure, IoT devices, and external sources within the telecom ecosystem.
- Architected solutions using Amazon S3 (AWS S3) to optimize data storage and retrieval, implementing cost-effective and scalable data lakes to accommodate large volumes of network performance data, customer interactions, and operational metrics.
- Orchestrated complex workflows using AWS Step Functions, ensuring efficient coordination and execution of multi-step data processing tasks for proactive network health monitoring and dynamic service provisioning.
- Leveraged Amazon Redshift as a data warehousing solution, enabling high-performance analytical queries to derive actionable insights into network performance, customer behavior, and service usage patterns.
- Integrated AWS Data Pipeline to automate data movement and transformation, streamlining operational processes and enhancing data availability for real-time decision-making and strategic planning.
- Implemented robust security measures using AWS Identity and Access Management (IAM) and Amazon VPC, ensuring data privacy and regulatory compliance for sensitive network performance data, customer interactions, and billing information.
- Leveraged AWS Lambda functions to create serverless data pipelines, ensuring seamless integration between disparate systems and services within the BN Bank Norway ecosystem.
- Optimized data storage and retrieval by architecting solutions using Amazon S3, implementing cost-effective and scalable data lakes to accommodate the bank's growing data volumes.
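A serverless ingestion step like the Lambda pipelines described above can be sketched as follows. This is an illustrative outline only: the event shape follows the standard S3 trigger, while the CSV schema, field names, and the `curated-telecom-data` bucket are placeholder assumptions, not the actual Millicom implementation.

```python
import csv
import io
import json


def normalize_cdrs(csv_text):
    """Normalize raw call-detail records (illustrative schema) into
    JSON-serializable dicts: trim fields, cast duration to int, and
    drop rows missing a subscriber id."""
    rows = csv.DictReader(io.StringIO(csv_text))
    out = []
    for row in rows:
        subscriber = (row.get("subscriber_id") or "").strip()
        if not subscriber:
            continue  # skip unusable rows rather than failing the batch
        out.append({
            "subscriber_id": subscriber,
            "cell_id": (row.get("cell_id") or "").strip(),
            "duration_s": int(row.get("duration_s") or 0),
        })
    return out


def handler(event, context):
    """AWS Lambda entry point: read the uploaded CSV from S3, normalize
    it, and write JSON lines to a curated bucket (names are placeholders)."""
    import boto3  # imported lazily so the pure transform stays testable offline
    s3 = boto3.client("s3")
    bucket = event["Records"][0]["s3"]["bucket"]["name"]
    key = event["Records"][0]["s3"]["object"]["key"]
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    records = normalize_cdrs(body)
    s3.put_object(
        Bucket="curated-telecom-data",  # placeholder target bucket
        Key=key.replace(".csv", ".json"),
        Body="\n".join(json.dumps(r) for r in records),
    )
    return {"records_written": len(records)}
```

Keeping the transformation in a pure function separate from the boto3 calls makes the business logic unit-testable without AWS credentials.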
Data Analyst with Azure Data Factory expertise
Heimstaden Services AB
- Designed and developed data ingestion pipelines using ADF and the processing layer using Databricks notebooks with PySpark.
- Led the planning, design, development, testing, implementation, documentation, and support of data pipelines.
- Implemented pause and resume of Azure SQL Data Warehouse using ADF and built various ADF pipelines with business-rule use cases as reusable assets.
- Used Azure Key Vault to store connection strings and certificates, referencing the vault from Azure Data Factory linked services. Orchestrated and automated the pipelines.
- Implemented SCD Type 1 and Type 2 logic in daily, weekly, and monthly batches. Built POCs with Apache Spark using PySpark and Spark SQL for complex data transformation requirements.
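The SCD Type 2 pattern mentioned above can be sketched in plain Python (the actual work used PySpark; the column names and row shape here are illustrative assumptions): a changed business key gets its current row expired and a new current version appended, while unchanged keys are left alone.

```python
from datetime import date


def apply_scd2(dim_rows, incoming, today=None):
    """Minimal SCD Type 2 merge. dim_rows is the existing dimension
    (dicts with key, attrs, effective_from, effective_to, is_current);
    incoming maps business key -> latest attribute dict. Changed keys
    get their current row closed and a new current row appended; new
    keys are inserted; unchanged keys are untouched."""
    today = today or date.today().isoformat()
    result = list(dim_rows)
    current = {r["key"]: r for r in result if r["is_current"]}
    for key, attrs in incoming.items():
        row = current.get(key)
        if row is not None and row["attrs"] == attrs:
            continue  # no change: keep the existing current version
        if row is not None:
            row["is_current"] = False   # expire the old version
            row["effective_to"] = today
        result.append({
            "key": key,
            "attrs": attrs,
            "effective_from": today,
            "effective_to": None,       # open-ended current row
            "is_current": True,
        })
    return result
```

Note that the sketch mutates the expired rows in place; an SCD Type 1 batch would simply overwrite `attrs` on the current row instead of appending a new version.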
Data Engineer (Azure) and Tech Lead
IBM
- Designed and developed data ingestion pipelines using ADF and the processing layer using Databricks notebooks with PySpark; led the planning, design, development, testing, implementation, documentation, and support of data pipelines.
- Implemented pause and resume of Azure SQL Data Warehouse using ADF; built various ADF pipelines with business-rule use cases as reusable assets; ingested CSV, fixed-width, and Excel files.
- Used Azure Key Vault to store connection strings and certificates, referencing the vault from Azure Data Factory linked services; automated pipeline-failure email notifications using the Web activity.
- Orchestrated and automated the pipelines; built POCs with Apache Spark using PySpark and Spark SQL for complex data transformation requirements; wrote PowerShell scripts to automate pipelines.
- Collaborated with both client and IBM ETL teams, analyzed on-premise Informatica-based ETL solutions, and designed replacement ETL solutions using Azure Data Factory pipelines and Azure Databricks with PySpark and Spark SQL.
- Tuned pipeline performance in Azure Data Factory and Azure Databricks.
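The pause/resume step described above is typically driven through the Azure management REST API, for example from an ADF Web activity. The helper below builds the request URL; the subscription, resource group, server, and database names are placeholders, and the endpoint shape follows the dedicated SQL pool (formerly Azure SQL Data Warehouse) management API as an assumption.

```python
def sqldw_action_url(subscription_id, resource_group, server, database,
                     action, api_version="2021-11-01"):
    """Build the Azure management REST URL used to pause or resume a
    dedicated SQL pool. In ADF this URL would be POSTed from a Web
    activity, typically authenticated with managed identity."""
    if action not in ("pause", "resume"):
        raise ValueError("action must be 'pause' or 'resume'")
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.Sql"
        f"/servers/{server}/databases/{database}"
        f"/{action}?api-version={api_version}"
    )
```

In practice the Web activity would POST to this URL and a subsequent activity would poll the pool's status until it reports `Paused` or `Online`.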
Team Lead, Sr. ETL Consultant
IBM India
- Developed solutions in a highly demanding environment and provided hands-on guidance to other team members. Led complex ETL requirements and design. Implemented an Informatica-based ETL solution that met stringent performance requirements.
- Collaborated with product development teams and senior designers to develop architectural requirements, ensuring client satisfaction with the product.
- Assessed requirements for completeness and accuracy, determined whether they were actionable for the ETL team, and conducted impact assessments to size the effort.
- Developed full SDLC project plans to implement ETL solutions and identified resource requirements. Played an active, leading role in shaping and enhancing the overall Informatica ETL architecture.
- Identified, recommended, and implemented ETL process and architecture improvements. Assisted with and verified the solution design and the production of all design-phase deliverables.
- Managed the build phase and quality-assured code to ensure it fulfilled requirements and adhered to the ETL architecture. Resolved difficult design and development issues. Gave the team a clear vision of project objectives.
- Ensured discussions and decisions led toward closure, maintained healthy group dynamics, and made sure the team addressed all relevant issues within specifications and applicable standards.
- Familiarized the team with customer needs, specifications, design targets, the development process, design standards, techniques, and tools to support task performance.
Sr. Informatica Designer
IBM Netherlands – Apeldoorn and IBM India - Hyderabad
- Headed functional knowledge-transfer sessions with modelers. Led technical design meetings for designing individual layers. Analyzed functional design documents and prepared analysis sheets for individual layers.
- Worked extensively on the technical design document set, amending it as appropriate for each release.
- Achieved 100% transition sign-off for all four releases, ramped up post-transition, delivered projects successfully during steady state, and contributed process improvements and suggestions. Cross-trained resources across all four iterations.
- Identified training needs for teams and organized training to fill knowledge gaps. Received recognition from both the client and IBM, including monetary and certification awards.
Senior ETL Developer
Genisys Integrating Systems Pvt.Ltd., Bangalore, India
- Developed mappings for Type 2 dimensions, updating existing rows and inserting new rows in targets. Developed Actuate reports such as drill-up, drill-down, series, and parallel reports.
- Worked on Actuate to format reports for different processes. Developed dashboards covering generated, failed, waiting, and scheduled reports by quarter hour, hour, day, month, and year.
- Analyzed the numbers of generated, failed, scheduled, and waiting reports and built the reports above on those metrics.
Senior ETL Developer
Magna Infotech Pvt.Ltd., Bangalore, India
- Developed mappings for Type 2 dimensions, updating existing rows and inserting new rows in targets. Worked on Actuate to format reports for different processes.
- Developed Actuate reports such as drill-up, drill-down, series, and parallel reports. Analyzed the numbers of generated, failed, scheduled, and waiting reports and built the reports above on those metrics.
- Developed dashboards covering generated, failed, waiting, and scheduled reports by quarter hour, hour, day, month, and year.
- Gained hands-on experience in dimensional modeling through ETL design.
ETL Lead Developer
TechnoSpine Solutions, Bangalore, India
- Developed mappings for Type 2 dimensions, updating existing rows and inserting new rows in targets. Worked on Actuate to format reports for different processes.
- Developed Actuate reports such as drill-up, drill-down, series, and parallel reports. Analyzed the numbers of generated, failed, scheduled, and waiting reports and built the reports above on those metrics.
- Developed dashboards covering generated, failed, waiting, and scheduled reports by quarter hour, hour, day, month, and year.
- Gained hands-on experience in dimensional modeling through ETL design.
Experience
Data Engineer (Azure) and Tech Lead
- Implemented real-time analytics in Azure Databricks for actionable insights.
- Integrated seamlessly with Azure data services.
- Established robust data governance and compliance measures.
- Enhanced the performance of data processing workflows.
Skills
Languages
Python, SQL, Snowflake, Excel VBA, Scala
Frameworks
Spark, Hadoop
Libraries/APIs
PySpark, Azure Blob Storage API
Tools
Autosys, Microsoft Power BI, Spark SQL, AWS Glue, Amazon Athena, Apache Airflow, Amazon Elastic MapReduce (EMR), Terraform
Paradigms
ETL
Platforms
Amazon Web Services (AWS), Azure, Databricks, Oracle, Unix, Azure Synapse Analytics, AWS Lambda, Azure Synapse, Apache Kafka
Storage
SQL Stored Procedures, Data Pipelines, Amazon S3 (AWS S3), Redshift, Data Integration, Microsoft SQL Server, Database Architecture, IBM Db2, PL/SQL, Netezza
Other
Data Engineering, Data Warehousing, ETL Tools, Informatica, Azure Data Factory, Azure Databricks, APIs, Big Data, Data Transformation, Big Data Architecture, Amazon RDS, Message Queues, Financial Services, Technical Leadership, Data Processing, Data, Data Analysis, Data Analytics, Data Visualization, Large-scale Projects, Teamwork, Data Modeling, ELT, Microsoft Dynamics 365, Data Build Tool (dbt), Azure Data Lake, Unix Shell Scripting, Cognos TM1, PL/SQL Tuning, Azure Data Lake Analytics
Industry Expertise
Retail & Wholesale
Education
Bachelor's Degree in Technology and Electrical Engineering
Jawaharlal Nehru Technological University - Hyderabad, India
Certifications
AWS Certified Cloud Practitioner
Amazon Web Services
Azure Data Engineer
Microsoft