Motiram Ghimire
Verified Expert in Engineering
Data Engineer and Developer
Biratnagar, Province No. 1, Nepal
Toptal member since August 17, 2022
Motiram is a data engineer with over eight years of technical experience, focused primarily on designing, developing, and implementing on-premises and cloud data warehouse systems. With his broad knowledge, experience, and expertise in retail processes, data, and systems, Motiram can translate business requirements into optimal technical solutions, analyze potential impacts on other business processes, and deliver solutions that add maximum value to the business.
Experience
- SQL - 8 years
- ETL - 6 years
- Python - 6 years
- Database Development - 5 years
- Snowflake - 5 years
- Data Warehouse Design - 4 years
- Database Architecture - 3 years
- Microsoft Power BI - 3 years
Preferred Environment
Python, Amazon Web Services (AWS), Snowflake, ETL, Data Warehouse Design, BI Reporting, Data Processing Automation, SQL, Microsoft Power BI, Tableau
The most amazing...
...project I've done is the real-time sales integration for stores to track inventory and sales trends and generate hourly out-of-stock alert reports.
Work Experience
Solution Architect | Offshore Delivery Lead
LIS Nepal Pvt. Ltd.
- Architected and implemented end-to-end data warehousing solutions for multiple retail, healthcare, and pet industry clients, leveraging cloud services like AWS, Azure, and GCP and database technologies like Snowflake, Microsoft SQL Server, MySQL, and Redshift.
- Collaborated closely with stakeholders to gather requirements and design scalable data models tailored to industry-specific needs, resulting in improved inventory management, boosted revenue streams, and optimized supply chain management.
- Developed interactive dashboards and reports using Power BI, Tableau, and other tools, enabling stakeholders to visualize and analyze data effectively, leading to improved decision-making, enhanced customer satisfaction, and faster time-to-insights.
- Utilized Python, Azure Data Factory, and Batch services to build data pipelines for large datasets from various sources, orchestrated with Apache Airflow as a central source of truth, reducing data processing time and improving data accuracy and access.
- Implemented cloud-based data warehousing solutions on AWS and Azure platforms, optimizing cost and performance and ensuring compliance with industry-specific data regulations to protect individuals' privacy, such as HIPAA, GDPR, and CCPA.
- Mentored cross-functional teams on best practices in data warehousing, data modeling, and reporting, fostering a culture of innovation and collaboration tailored to each client's unique requirements.
BI Implementation Consultant | Data Architect
Logic Information Systems
- Spearheaded the design and implementation of business intelligence solutions for clients across various industries, leveraging cloud technologies such as AWS Services, Google Cloud, Azure, Snowflake, and Oracle.
- Worked closely with clients to gather requirements, conduct data analysis, and develop data models tailored to their specific needs, utilizing data warehousing concepts such as dimensional modeling, star schemas, and data lakes.
- Utilized leading BI tools such as Power BI, Tableau, QlikView, and Looker to create interactive dashboards and reports, providing stakeholders with actionable insights into key performance metrics.
- Optimized ETL processes using tools like Apache Spark, Talend, Informatica, and Apache Airflow to streamline data integration and transformation, ensuring data consistency and accuracy across multiple sources.
- Collaborated with cross-functional teams, including business analysts, developers, and project managers, to deliver BI solutions on time and within budget while continuously evaluating and adopting new technologies to improve efficiency and scalability.
Senior BI Developer
LIS Nepal Pvt. Ltd.
- Developed and implemented end-to-end business intelligence solutions for various clients, utilizing a wide range of database technologies such as SQL Server, Oracle, MySQL, PostgreSQL, Teradata, Netezza, and Redshift.
- Collaborated closely with business stakeholders to understand requirements and translate them into scalable data models and actionable insights, driving business growth and efficiency.
- Implemented ETL processes using SSIS, Azure Data Factory, Talend, Python, and other tools, ensuring data integrity and consistency across multiple sources and systems.
- Leveraged Power BI, Tableau, Looker, SSRS, MicroStrategy, and other reporting tools to create interactive dashboards and reports, empowering stakeholders to visualize and analyze data effectively and make informed decisions.
- Provided technical leadership and guidance to junior developers and cross-functional teams, fostering a culture of collaboration and innovation in BI solution development and implementation.
Software Developer
Yomari Inc.
- Provided level-two batch support for Oracle Retail Analytics batches for multiple clients.
- Developed ETL and history conversion scripts for Oracle Retail Analytics.
- Created and validated OBIEE and BIP reports as per the requirements.
- Developed custom Oracle Data Integrator packages for customer ETL integration with the Oracle ecosystem.
Experience
Azure-based Healthcare Application for Inventory Management
Additionally, an automated data ingestion framework and enhanced reporting through Power BI on Azure resulted in a 40% reduction in processing time and a 20% increase in data-driven decision-making. The streamlined Azure setup achieved a 15% reduction in operational costs, showcasing the tangible impact on efficiency, accuracy, and cost-effectiveness in healthcare inventory management.
Tableau for Customer Self Service Analytics
Moreover, this transition drove sales impact by providing stakeholders with real-time insights into customer behavior and preferences. The implementation met business needs and empowered users with self-service capabilities, allowing them to make data-informed decisions on demand. This evolution from Excel-based reports to easily accessible, standardized Tableau reports streamlined end-of-month and end-of-week reporting and improved sales efficiency by enabling data-driven decision-making at any time and from any location.
AWS Data Pipeline for Customer Analytics
I led technical requirements gathering and integration design sessions and played a vital role in shaping the project's success, contributing to a 25% acceleration in the development timeline. The culmination of these efforts, spanning 8+ months, resulted in a centralized data foundation for customer analytics. This transformative initiative enabled the client to track customer key performance indicators (KPIs) in conjunction with sales and marketing KPIs, providing a comprehensive, single source of truth. The standardized marketing reporting platform further streamlined processes, fostering a more agile and strategic approach to future growth and customer-centric strategies, leading to a 20% increase in overall operational efficiency.
Real-time Sales Integration
The system ingested sales and inventory data every 15 minutes, reducing reporting latency by 25%. Built with Python, Snowflake, and Power BI, the streamlined pipeline ensured the timely generation and publication of hourly reports. Power BI Premium or Embedded capacity enabled the required refresh intervals, and critical data security features strengthened privacy, leading to a 20% increase in user confidence. The project significantly improved store managers' visibility into available items, empowering proactive inventory management and informed decisions while streamlining processes by 30%.
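The core of the hourly out-of-stock alerting described above can be sketched as follows. This is a simplified in-memory illustration only; the production pipeline ran on Snowflake with Power BI reporting, and the field names, threshold, and record structure here are assumptions.

```python
from collections import defaultdict

def out_of_stock_alerts(sales_records, inventory_records, threshold=0):
    """Flag items whose on-hand inventory, net of recent sales, has
    fallen to or below the threshold. Field names are illustrative."""
    # Total quantity sold per (store, item) in the reporting window.
    sold = defaultdict(int)
    for rec in sales_records:
        sold[(rec["store"], rec["item"])] += rec["qty"]

    alerts = []
    for rec in inventory_records:
        key = (rec["store"], rec["item"])
        remaining = rec["on_hand"] - sold.get(key, 0)
        if remaining <= threshold:
            alerts.append({"store": rec["store"], "item": rec["item"],
                           "remaining": remaining})
    return alerts
```

In a real deployment, the same set-difference logic would typically live in a Snowflake SQL query scheduled hourly, with the result set pushed to the reporting layer.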
Omnichannel Reporting for Enterprise Data Warehouse
I implemented data security measures, safeguarding customer-sensitive data within MicroStrategy reports. Scheduled report distribution improved accessibility, resulting in a 20% increase in user efficiency. The project seamlessly transitioned to MicroStrategy, providing omnichannel key performance indicator (KPI) visibility on both mobile and web platforms. When users sought additional analysis, reports were delivered in Excel as part of the scheduled process, enhancing analytical experiences.
Data Warehouse Implementation Using Snowflake and Microsoft Azure
In technical execution, I played a key role in constructing a new data warehouse on the Snowflake database. Leveraging native Snowflake features, Azure Data Lake Storage, Snowpipes, Streams, and Azure tools, the project achieved a 30% enhancement in data processing speed. Custom integrations using Snowflake and report development in Tableau and Power BI addressed unique business analytical needs, leading to a 25% increase in report relevance.
Scheduled report distribution via various channels, including email, printer, and network drive, resulted in a 20% improvement in user accessibility. This comprehensive implementation empowered decision-makers, providing a profound understanding of business-driving factors. The documented technical design, production deployments, and functional data documents ensured clarity, contributing to a 30% increase in operational efficiency. The project significantly elevated analytical capabilities, with a 20% enhancement in data-driven decision-making for all stakeholders.
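The stream-driven incremental load pattern behind a Snowpipe/Streams setup like the one above can be sketched in miniature. This is an in-memory illustration of the merge semantics only, not the actual implementation; in Snowflake this would be a `MERGE` statement consuming a table stream, and the record shapes here are assumptions.

```python
def merge_changes(target, changes, key="id"):
    """Apply a batch of change records to a target table, mimicking a
    MERGE driven by a change stream. Each change carries an 'action'
    of either 'upsert' or 'delete' plus the row's key and data."""
    rows = {row[key]: row for row in target}
    for change in changes:
        if change["action"] == "delete":
            rows.pop(change[key], None)
        else:  # upsert: insert new row or overwrite the existing one
            data = {k: v for k, v in change.items() if k != "action"}
            rows[data[key]] = data
    return list(rows.values())
```

The design choice mirrors why Streams are attractive in Snowflake: the warehouse processes only the delta since the last consumption, rather than rescanning full source tables on every load.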
Python-based Accelerator Package for Teradata ODS
In this role, I designed, developed, and tested data pipelines using Python for the ODS Teradata layer. I provided comprehensive documentation, including architecture diagrams, process flows, and technical design documents. Collaborating closely with the onsite project lead, I communicated technical issues, blockers, milestones, and plans, ensuring a cohesive development process.
Introducing the custom Python package empowered the client's tech team to incorporate new data sources with minimal effort. Post-implementation, maintenance overhead was reduced by around 60%, showcasing the project's success in optimizing efficiency and simplifying the integration of new data sources on an ongoing basis.
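A config-driven accelerator of the kind described above might look like this sketch, where adding a data source is a config entry rather than new pipeline code. The class, keys, and generated SQL shape are assumptions for illustration, not the actual package.

```python
class IngestionFramework:
    """Registers data sources from declarative configs so that new
    sources need only a config entry, not new pipeline code."""

    def __init__(self):
        self.sources = {}

    def register(self, config):
        # Validate the minimal contract every source config must meet.
        required = {"name", "source_table", "target_table"}
        missing = required - config.keys()
        if missing:
            raise ValueError(f"config missing keys: {sorted(missing)}")
        self.sources[config["name"]] = config

    def build_load_sql(self, name):
        # Generate the load statement from config alone.
        cfg = self.sources[name]
        cols = ", ".join(cfg.get("columns", ["*"]))
        return (f"INSERT INTO {cfg['target_table']} "
                f"SELECT {cols} FROM {cfg['source_table']}")
```

For example, registering `{"name": "sales", "source_table": "stg.sales", "target_table": "ods.sales"}` is all that onboarding a new feed would require; the framework derives the load SQL from the config.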
Oracle Retail Analytics Migration Project
As a tech lead, I prepared the implementation plan and upgrade strategy. I actively participated in data profiling and in report and table rationalization, prepared a mapping document between the legacy source system and RA, and provided technical support for OBIEE and BI Publisher (BIP) report migration.
In addition, I prepared Oracle RA cutover strategies to align the cutover plan with other integrated systems and put together a plan for data validation between the old and new versions of the system and across the merchandising system.
Around 8 TB of data, nearly 200 legacy reports, and around 450 manual report schedules were migrated as part of this project. We rationalized 1,500 tables in the legacy system down to around 1,000 tables in the new RA system. All dbt transformations were moved to Oracle Data Integrator (ODI), and manual report schedules were automated, reducing by around 90% the manual effort of sending report emails to each department, store, buyer, and supplier.
Oracle Retail Analytics Implementation
I was a project lead for implementing Oracle Retail Analytics for multiple clients. As part of the base implementation, I deployed the Oracle base data model, the pre-built base reports, and the integration package in ODI.
As part of the customization and depending on business needs, I developed custom integration with source systems and custom reports on BIP, OBIEE, and Power BI. These reports were also scheduled to users via different channels like email, printer, network drive, and others.
In addition, an Oracle APEX app was created on top of the warehouse management system, which runs on Oracle Database, to scan items and automatically generate correct price and currency labels.
Merchandise Reclassification Support and Performance Tuning
As a developer and data analyst, I analyzed the system process and its bottlenecks and prepared custom packages to improve the performance of reclassification processing in RA. I executed the reclassification process in analytics and built custom integration to support validation of as-is aggregation. I also added custom reporting views to enable as-is reporting.
Performance improved by around 60% with the custom integration packages for reclassification, and the client now also has visibility into as-is vs. as-was KPI performance.
Education
Bachelor's Degree in Computer Engineering
Tribhuvan University - Kathmandu, Nepal
Certifications
dbt Fundamentals
dbt Labs
Core Designer Certificate
Dataiku
Skills
Libraries/APIs
PySpark
Tools
Microsoft Excel, PyCharm, Apache Airflow, AWS Glue, AWS CLI, Amazon Elastic MapReduce (EMR), Amazon Athena, Tableau, Microsoft Power BI, Looker, Google Analytics, BigQuery, Power Query, Microsoft Access, Amazon QuickSight, Oracle Business Intelligence Enterprise Edition 11g (OBIEE), Matillion ETL for Redshift, Oracle Application Express (APEX), Microsoft Power Apps, Azure DevOps Services, dbt Cloud
Languages
Snowflake, SQL, Python, Stored Procedure, Java, C#
Paradigms
ETL, Database Development, Modular Design, Dimensional Modeling, Business Intelligence (BI), Database Design, Kanban, Scrum, Azure DevOps
Platforms
Oracle Data Integrator (ODI), Microsoft, Oracle, Unix, Microsoft Power Automate, Azure, Azure SQL Data Warehouse, Google Cloud Platform (GCP), Amazon Web Services (AWS), Jupyter Notebook, Alteryx, Dedicated SQL Pool (formerly SQL DW), Dataiku, Oracle Database, AWS Lambda, Azure Functions, Microsoft Fabric, Azure Synapse, Azure PaaS, Windows
Storage
Data Pipelines, Data Integration, Database Modeling, Databases, MySQL, RDBMS, DB, Database Architecture, Teradata, Amazon S3 (AWS S3), Redshift, SQL Performance, Teradata Databases, Data Lakes, PostgreSQL, Database Migration, SQL Stored Procedures, MongoDB, Azure SQL Databases, Microsoft SQL Server, NoSQL, JSON, Azure SQL, SQL Server Integration Services (SSIS), Database Administration (DBA)
Other
MicroStrategy, Data Warehouse Design, Scripting, Analytics, Data Analytics, BI Reporting, Data Architecture, Data Modeling, Data Analysis, Data, Reports, Reporting, Integration, Data Visualization, Interactive Dashboards, Data Engineering, Data Warehousing, Automated Data Flows, Schemas, Star Schema, Database Schema Design, Pipelines, Warehouses, ELT, Data Processing Automation, BI Reports, Relational Database Design, Unix Shell Scripting, Oracle BIP, Agile Sprints, Customer Analytics, Dashboards, APIs, Data-level Security, Job Schedulers, Scheduling, CSV, MicroStrategy Visual Insight, Online Sales, Online Shops, Web Dashboards, Azure Data Lake, Azure Data Factory, Data Migration, Migration, Cloud Migration, Amazon RDS, Consulting, Fivetran, Data Build Tool (dbt), Big Data, Documentation, Design, Data Reporting, Teamwork, Decision Support Systems, API Integration, Machine Learning, Google BigQuery, Startups, Data Science, Metabase, Microsoft Azure, Azure Databricks, Verification, Data Integrity Testing, Amazon Seller Central