Yugesh Shrestha, Developer in Kathmandu, Central Development Region, Nepal

Yugesh Shrestha

Verified Expert in Engineering

Business Intelligence Developer

Location
Kathmandu, Central Development Region, Nepal
Toptal Member Since
July 15, 2021

Yugesh is a data warehouse and business intelligence developer. He is proficient in SQL for data integration and analysis and confident working with Python and Unix shell scripting for data manipulation and ETL. Yugesh is competent in leading teams, testing, and end-to-end (E2E) projects and has used Tableau, Power BI, and SSRS to build corporate dashboards and analytical reports.

Portfolio

Toptal
Snowflake, Tableau, SQL, ETL, Dashboards, Reports, Reporting, Data Analytics...
LIS Nepal Pvt
SQL, Snowflake, Python, Azure, Jira, Microsoft Power BI, Tableau, ETL...
Logic Information Systems
SQL, Snowflake, Teradata, Python, Unix Shell Scripting, Java, R, Tableau...

Experience

Availability

Part-time

Preferred Environment

Snowflake, PyCharm, DBeaver, Sublime Text, Microsoft Power BI, Tableau

The most amazing...

...project I've completed is the end-to-end implementation of a data warehousing solution, where I led the project from requirements handling to closure.

Work Experience

Data Engineer

2021 - 2022
Toptal
  • Collaborated with a data scientist to perform data transformations and aggregations in Snowflake and develop dashboards in Tableau.
  • Created interactive and multidimensional Tableau dashboards with a self-service template design.
  • Created ETL design to perform complex aggregates based on time transformations to enhance report performance.
  • Suggested different visualization ideas and built complex structures involving spider charts, geolocations, multidimensional charts, etc.
Technologies: Snowflake, Tableau, SQL, ETL, Dashboards, Reports, Reporting, Data Analytics, Business Intelligence (BI), Data Analysis, Data Visualization, Data Warehouse Design, ETL Pipelines, Database Lifecycle Management (DLM), Big Data, ETL Tools, Cloud

Senior Principal Engineer

2019 - 2021
LIS Nepal Pvt
  • Served as the technical lead on implementing DW Solutions, including data model development, ETL pipelines built using native Snowflake features and Azure Cloud Services, and Python-based scripts.
  • Collaborated with the client to understand BI requirements, provide consultation, and develop BI dashboards and analytical reports using Power BI, Tableau, and SSRS.
  • Spearheaded technical conversations in business meetings to provide insights on solutions and current technologies, discuss challenges as part of pre-sales activities, and conduct POC exercises to develop solution proposals.
Technologies: SQL, Snowflake, Python, Azure, Jira, Microsoft Power BI, Tableau, ETL, Data Engineering, Data Warehousing, Database Design, Dashboards, Reports, Reporting, Data Analytics, Business Intelligence (BI), Data Analysis, Data Visualization, Data Warehouse Design, ETL Pipelines, Database Lifecycle Management (DLM), Big Data, ETL Tools, Cloud

Implementation Consultant

2017 - 2019
Logic Information Systems
  • Worked onsite with retail business clients to understand requirements and provide data analytics solutions, using Python and R for data processing and analysis and Google Cloud and Azure for cloud computing.
  • Served as a technical consultant with different clients to build Python-based ETL scripts and schedules for data warehouse implementation, predominantly focused on replacing on-premises systems with a Snowflake-based cloud data warehouse.
  • Customized an Oracle Xstore POS application for client implementation that essentially required Java programming using Spring framework.
Technologies: SQL, Snowflake, Teradata, Python, Unix Shell Scripting, Java, R, Tableau, Microsoft Power BI, Data Analytics, Data Warehousing, Data Engineering, ETL, Business Intelligence (BI), Data Analysis, Data Visualization, eCommerce, ETL Pipelines, Database Lifecycle Management (DLM), Big Data, ETL Tools

QA Consultant

2015 - 2017
Oracle
  • Contracted onsite to Oracle India as a quality assurance consultant for the Oracle Retail application from the lens of an Oracle Retail BI developer.
  • Created functional test scenarios to check for data integrity in the back end upon performing different operations in the application. Executed the test scenarios in the form of system testing, smoke testing, end-to-end, and regression testing.
  • Created SQL scripts to feed into Selenium-based automation programs for validating the data in back-end tables after executing different operations; a simplified sketch of this validation pattern follows this role.
Technologies: Oracle Database, Oracle Retail, Oracle ADF, SQL
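
A minimal sketch of the SQL-plus-Selenium validation pattern from this role, assuming hypothetical table, column, and UI element names (the real Oracle Retail objects are not reproduced here):

  # Hypothetical sketch: verify that a value shown on an application screen
  # matches the corresponding back-end table after an operation is performed.
  import oracledb                          # python-oracledb driver for Oracle Database
  from selenium import webdriver
  from selenium.webdriver.common.by import By

  # 1. Read the expected value from the back-end table (all names are placeholders).
  conn = oracledb.connect(user="qa_user", password="***", dsn="dbhost/ORCLPDB1")
  cur = conn.cursor()
  cur.execute("SELECT order_qty FROM rms_orders WHERE order_no = :order_no", order_no=12345)
  expected_qty = cur.fetchone()[0]

  # 2. Read the value displayed on the (hypothetical) application screen via Selenium.
  driver = webdriver.Chrome()
  driver.get("https://retail-app.example.com/orders/12345")
  displayed_qty = int(driver.find_element(By.ID, "orderQtyField").text)

  # 3. Flag a defect if the UI and the database disagree.
  assert displayed_qty == expected_qty, f"UI shows {displayed_qty}, DB has {expected_qty}"

  driver.quit()
  conn.close()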

Senior Software Engineer

2013 - 2015
LIS Nepal, Pvt.
  • Led the development of a data integration solution using Oracle Data Integrator 11g for a retail order management system. The system included data integration between multiple databases, file systems such as XML, and input from Web API.
  • Built an in-house data warehousing solution based on Korn shell scripts and customized Oracle solutions for implementation across multiple clients.
  • Served as a support engineer for existing implementations of data warehousing solutions based on Oracle Retail Analytics. Provided fixes to batch failures, created reports for data reconciliation, and analyzed data queries from the business.
Technologies: SQL, Oracle Database, Teradata, Oracle Data Integrator (ODI), Python, Unix Shell Scripting, Oracle Business Intelligence Enterprise Edition 11g (OBIEE), Business Intelligence (BI), Data Analysis, Data Visualization, ETL Pipelines, ETL Tools

Data Warehouse Solution in Snowflake

• Acted as an ETL lead on building a new data warehouse solution based on the Snowflake database using native Snowflake and Azure tools.
• Collaborated with functional and technical team members on the client side to understand different data source systems and their functionalities.
• Cooperated with data architects and clients to understand current business challenges and find optimal integration solutions.
• Used native Snowflake features, such as Snowpipe, Streams, and Tasks, to integrate with Azure Data Lake Storage; a minimal sketch of this pattern follows this list.
• Created documentation for technical design, production deployments, test scenarios, and functional data documents for all integrations built.
• Provided guidance and reviewed the work of team members to ensure quality and timeliness of delivery.
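
A rough illustration of the Snowpipe, Stream, and Task pattern referenced above, issued from Python through the Snowflake connector. Stage, table, warehouse, and task names are hypothetical, and the Azure notification setup required for auto-ingest is omitted:

  # Hypothetical sketch: continuous load via Snowpipe, change capture via a Stream,
  # and a scheduled Task that moves new rows into the curated layer.
  import snowflake.connector

  conn = snowflake.connector.connect(
      account="my_account", user="etl_user", password="***",
      warehouse="ETL_WH", database="EDW", schema="STAGING",
  )
  cs = conn.cursor()

  # Auto-ingest files landed in an external (Azure) stage into a raw table.
  cs.execute("""
      CREATE PIPE IF NOT EXISTS sales_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw_sales FROM @azure_sales_stage FILE_FORMAT = (TYPE = CSV)
  """)

  # Track rows that arrive in the raw table.
  cs.execute("CREATE STREAM IF NOT EXISTS raw_sales_stream ON TABLE raw_sales")

  # Every 15 minutes, move newly arrived rows downstream when the stream has data.
  cs.execute("""
      CREATE TASK IF NOT EXISTS load_sales_task
        WAREHOUSE = ETL_WH
        SCHEDULE = '15 MINUTE'
        WHEN SYSTEM$STREAM_HAS_DATA('RAW_SALES_STREAM')
      AS
        INSERT INTO curated_sales SELECT * FROM raw_sales_stream
  """)
  cs.execute("ALTER TASK load_sales_task RESUME")
  conn.close()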

Inventory Optimization Using Data Analytics

• Created the technical structure in Best Buy's big data platform, spanning Hadoop and Google Cloud Platform, to house data from various sources, such as Oracle Retek, Teradata EDW, and logistics systems, so data could be prepared and processed efficiently.
• Processed data in Hadoop to prepare aggregations that fed downstream analytical and statistical processing and visual insights.
• Created processes and programs to integrate data between other technologies, including Hadoop, GCP, Teradata, and Oracle.
• Developed programs in R to perform statistical analysis at various levels of the retail data and create visualizations for exploratory analysis and reports; an illustrative aggregation sketch follows this list.
• Developed interactive visualizations in Google Data Studio on top of large datasets to surface valuable insights.
• Participated in business and data meetings with different departments at Best Buy to understand the quality of data availability and a viable way to process data for advanced analytics and data science.
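
The statistical work above was done in R on Hadoop/GCP; purely as an illustration of the kind of inventory metric involved, here is a small pandas sketch with hypothetical file and column names, computing average daily sales and days of supply per store and item:

  # Illustrative pandas sketch (the original work used R and Hadoop); names are placeholders.
  import pandas as pd

  sales = pd.read_csv("sales.csv")          # columns: store_id, item_id, sale_date, qty_sold
  inventory = pd.read_csv("inventory.csv")  # columns: store_id, item_id, on_hand_qty

  # Average daily units sold per store/item over the observed history.
  history_days = sales["sale_date"].nunique()
  avg_daily = (
      sales.groupby(["store_id", "item_id"])["qty_sold"].sum() / history_days
  ).rename("avg_daily_sales")

  # Days of supply = on-hand quantity divided by average daily sales.
  metrics = inventory.merge(avg_daily.reset_index(), on=["store_id", "item_id"])
  metrics["days_of_supply"] = metrics["on_hand_qty"] / metrics["avg_daily_sales"]

  # Flag potentially overstocked positions (e.g., more than 60 days of supply) for review.
  print(metrics[metrics["days_of_supply"] > 60].head())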

Data Warehouse and Power BI using Snowflake and Azure

This project is an end-to-end implementation of a Snowflake-based data warehouse and business intelligence solution for an enterprise-level finance company.

• Handled client and team communications to derive business requirements into technical design and implementation.
• Developed end-to-end Azure Data Factory pipelines to ingest data from different sources into Snowflake and automate ETL.
• Built ETL transformations and frameworks using Snowflake stored procedures for multiple domains; a minimal sketch of this pattern follows this list.
• Built interactive Power BI dashboards to monitor and analyze data quality parameters.
• Automated deployments with Azure DevOps CI/CD pipelines.
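
As an indicative sketch only, the snippet below shows the flavor of the stored-procedure-based transformations mentioned above: a Snowflake Scripting procedure wrapping a MERGE, created and invoked from Python. The table names, columns, and merge logic are hypothetical:

  # Hypothetical sketch: wrap an upsert from a staging table into a dimension
  # in a Snowflake stored procedure, then call it as one step of the ETL framework.
  import snowflake.connector

  conn = snowflake.connector.connect(
      account="my_account", user="etl_user", password="***",
      warehouse="ETL_WH", database="EDW", schema="CORE",
  )
  cs = conn.cursor()

  cs.execute("""
      CREATE OR REPLACE PROCEDURE load_dim_customer()
      RETURNS STRING
      LANGUAGE SQL
      AS
      $$
      BEGIN
        MERGE INTO dim_customer t
        USING stg_customer s ON t.customer_id = s.customer_id
        WHEN MATCHED THEN UPDATE SET t.email = s.email, t.updated_at = CURRENT_TIMESTAMP()
        WHEN NOT MATCHED THEN INSERT (customer_id, email, updated_at)
          VALUES (s.customer_id, s.email, CURRENT_TIMESTAMP());
        RETURN 'dim_customer refreshed';
      END;
      $$
  """)

  cs.execute("CALL load_dim_customer()")   # one step in the overall ETL run
  print(cs.fetchone()[0])
  conn.close()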

In-house Retail Data Warehouse Implementation

• Served as a BI developer to implement, verify, and validate a retail data model (LRDM)-based data warehouse.
• Performed unit testing and integration testing for QA validation of ETL data loading scripts.
• Developed wrapper scripts for integration of ETL loading scripts using Korn shell scripting and Teradata database.
• Used the Teradata FastLoad utility to import data from different source database systems into the Teradata database.
• Participated in mapping various retail functional areas to the RMS source system and developing the respective scripts.

Tableau and Snowflake Developer

Created a Tableau-based application to perform analytics on customer behavior. The data source was Snowflake, which sourced data from Shopify, and the application required dynamic calculation of customer segments based on user selections.
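
A simplified sketch of the kind of segmentation query behind that application, run against Snowflake from Python. The Shopify-derived table and the spend threshold are hypothetical; in the live dashboard the threshold came from a Tableau user selection:

  # Hypothetical sketch: bucket customers by total spend, with the threshold
  # supplied at query time (in Tableau this value came from a user-facing parameter).
  import snowflake.connector

  conn = snowflake.connector.connect(
      account="my_account", user="bi_user", password="***",
      database="ANALYTICS", schema="SHOPIFY",
  )
  cs = conn.cursor()

  high_value_threshold = 500  # placeholder; driven by the dashboard selection

  cs.execute(
      """
      SELECT customer_id,
             SUM(order_total) AS total_spend,
             IFF(SUM(order_total) >= %(threshold)s, 'High value', 'Standard') AS segment
      FROM orders
      GROUP BY customer_id
      """,
      {"threshold": high_value_threshold},
  )
  for customer_id, total_spend, segment in cs.fetchall():
      print(customer_id, total_spend, segment)
  conn.close()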

Oracle Xstore Implementation and Customization

• Developed new customization for new functionalities in Oracle Xstore using XML configurations and Java Spring-based codebase changes.
• Fixed bugs for defects raised by QA and business users.

Oracle RTMS Quality Assurance

• Authored and executed test cases on RMS modules (foundation hierarchy, foundation items, purchase orders, stock counts, and future cost) in Oracle Test Manager for ADF screens.
• Designed, authored, and executed detailed test cases for RMS's Oracle Retail Trade Management (Import) module.
• Executed HPQC-based QA test cases in various ADF RMS screens to find defects in the ADF application.
• Ensured the ADF application had correct GUI and functional behavior. Cross-validated the functional behavior of the ADF-based RMS application with that of conventional forms-based RMS.
• Carried out ad hoc testing in ADF-based forms to ensure the application did not show any unusual behavior and to cover tests that the predefined test cases could not cover on their own.
• Raised and managed bugs in Bug DB regarding defects found during manual test execution.
• Created different data set scenarios as input data for automation test execution on various RMS ADF screens.

Data Warehousing Implementation Using Snowflake

• Led the project, implementing the Robling Retail Data Model and BI reporting solutions using Tableau and Power BI, from requirements gathering through go-live and post-production support.
• Designed and developed ETL pipelines and scripts required for the data warehousing solution customized for the client using Python.
• Drove the requirement-gathering phase with clients to facilitate discovery mapping sessions.

Oracle Retail Analytics Support

• Supported projects consecutively for three Oracle RA clients: Hot Topic (USA), Gander Mountain (USA), and SSC (Pakistan) for monitoring and issue fixing of Oracle RA nightly batches.
• Provided fixes in any batch failures due to data or program issues and documentation on detailed analysis and fix resolutions.
• Regularly created reports to reconcile data between the source systems and the target warehouse.
• Performed data-level analysis to answer business questions about data inconsistencies in corporate reports.

Data Integration for Order Management System

• Developed a data integration system for Petco’s Teradata-based ODS system to and from different systems using the ODI 11g data integration tool.
• Led the development team in delivering the project code and prepared technical design documents for the overall integrations developed in the project.
• Performed unit, integration, and deployment testing on the integration codes developed.
• Prepared code deployment documents and assisted the QA team in deploying the delivered codes in the required environments.

Microsoft BI Suite | EDW Enhancement

This project was based on enhancements made to an existing in-house-built data warehousing environment entirely based on the Microsoft BI suite. Enhancements included building or upgrading existing components on T-SQL stored procedures, new ETL pipelines using SSIS jobs, SSRS reports, MDX queries, and SSAS Cube calculations.

My specific roles were:
• Led a team of offshore engineers through requirement gathering, project delivery planning, tracking, and quality assurance.
• Worked as a BI developer to change existing T-SQL stored procedures, SSRS reports, and MDX queries; a minimal sketch of invoking such a procedure follows this list.
• Collaborated with the business team to conduct and support UAT sessions.
• Created or updated schedules for ETL pipelines based on Automic UC4 workflows.
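
Purely for illustration of the T-SQL stored-procedure work above, the sketch below calls one such procedure from Python via pyodbc and inspects the result. The procedure, parameter, and server names are hypothetical, and the procedure is assumed to return a result set:

  # Hypothetical sketch: execute a T-SQL stored procedure and sanity-check its output.
  import pyodbc

  conn = pyodbc.connect(
      "DRIVER={ODBC Driver 17 for SQL Server};"
      "SERVER=edw-sql.example.com;DATABASE=EDW;Trusted_Connection=yes;"
  )
  cur = conn.cursor()

  # Refresh a (placeholder) reporting aggregate for a fiscal period, then check
  # how many rows the procedure returns.
  cur.execute("EXEC dbo.usp_refresh_sales_summary @fiscal_period = ?", "2020-Q4")
  rows = cur.fetchall()
  print(f"usp_refresh_sales_summary returned {len(rows)} rows for 2020-Q4")
  conn.close()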

Languages

SQL, Python, Snowflake, T-SQL (Transact-SQL), Java, R

Tools

Tableau, Microsoft Power BI, PyCharm, Sublime Text, Jira, Oracle Business Intelligence Enterprise Edition 11g (OBIEE), Plotly

Paradigms

Database Design, ETL, Business Intelligence (BI), Azure DevOps

Storage

Databases, SQL Server Reporting Services (SSRS), Database Lifecycle Management (DLM), DBeaver, Teradata, Oracle 11g, Data Integration, SSAS Tabular, Apache Hive, SQL Server Integration Services (SSIS)

Other

Data Warehousing, ETL Development, Data Engineering, Data Visualization, Data Analysis, ETL Pipelines, Programming, Unix Shell Scripting, Data Analytics, Multidimensional Expressions (MDX), ETL Tools, Dashboards, Reporting, Data Warehouse Design, Big Data, Cloud, Computer Skills, Algorithms, Digital Logic, Data Integrity Testing, Azure Data Lake, Azure Data Factory, Fivetran, Google BigQuery, Big Data Architecture, XStore, Reports, eCommerce

Frameworks

Oracle ADF, Hadoop, Spring

Platforms

Azure, Oracle Database, Oracle Retail, Oracle Data Integrator (ODI), Google Cloud Platform (GCP), Oracle

2010 - 2013

Bachelor's Degree in Computer Engineering

Tribhuvan University, Institute of Engineering - Kathmandu, Nepal
