Motiram Ghimire, Developer in Biratnagar, Province No. 1, Nepal

Motiram Ghimire

Verified Expert in Engineering

Data Engineer and Developer

Location
Biratnagar, Province No. 1, Nepal
Toptal Member Since
August 17, 2022

Motiram is a data engineer with over eight years of technical experience, focused primarily on designing, developing, and implementing on-premises and cloud data warehouse systems. With his broad knowledge, experience, and expertise in retail processes, data, and systems, Motiram can translate business requirements into optimal technical solutions, analyze potential impacts on other business processes, and deliver solutions that add maximum value to the business.

Portfolio

LIS Nepal Pvt. Ltd.
Relational Database Design, SQL, Snowflake, Data Pipelines, Apache Airflow...
Logic Information Systems
ETL, Data Architecture, MicroStrategy, Snowflake, PyCharm, Python...
LIS Nepal Pvt. Ltd.
Oracle, Teradata, SQL, Database Development, ETL, Scripting, BI Reports...

Experience

Availability

Part-time

Preferred Environment

Python, Amazon Web Services (AWS), Snowflake, ETL, Data Warehouse Design, BI Reporting, Data Processing Automation, SQL, Microsoft Power BI, Tableau

The most amazing...

...project I've done is the real-time sales integration for stores to track inventory and sales trends and generate hourly out-of-stock alert reports.

Work Experience

Solution Architect | Offshore Delivery Lead

2021 - PRESENT
LIS Nepal Pvt. Ltd.
  • Architected and implemented end-to-end data warehousing solutions for multiple retail, healthcare, and pet industry clients, leveraging cloud services like AWS, Azure, and GCP and database technologies like Snowflake, Microsoft SQL Server, MySQL, and Redshift.
  • Collaborated closely with stakeholders to gather requirements and design scalable data models tailored to industry-specific needs, resulting in improved inventory management, boosted revenue streams, and optimized supply chain management.
  • Developed interactive dashboards and reports using Power BI, Tableau, and other tools, enabling stakeholders to visualize and analyze data effectively, leading to improved decision-making, enhanced customer satisfaction, and faster time-to-insights.
  • Utilized Python, Azure Data Factory, and Batch services to build a data pipeline for large datasets from various sources, orchestrated with Apache Airflow as a central source of truth, reducing data processing time and improving data accuracy and data access.
  • Implemented cloud-based data warehousing solutions on AWS and Azure platforms, optimizing cost and performance and ensuring compliance with industry-specific privacy regulations such as HIPAA, GDPR, and CCPA.
  • Mentored cross-functional teams on best practices in data warehousing, data modeling, and reporting, fostering a culture of innovation and collaboration tailored to unique requirements.
Technologies: Relational Database Design, SQL, Snowflake, Data Pipelines, Apache Airflow, Tableau, Amazon Athena, Customer Analytics, Redshift, Python, ETL, Data Warehouse Design, Data Processing Automation, Database Development, BI Reports, Database Architecture, Scripting, Analytics, Data Integration, Data Analytics, AWS Lambda, PySpark, Amazon Elastic MapReduce (EMR), AWS CLI, AWS Glue, Amazon S3 (AWS S3), Data Modeling, Modular Design, Agile Sprints, Scrum, Kanban, Data Architecture, BI Reporting, Database Modeling, Data Analysis, Dashboards, APIs, Microsoft Power BI, Data-level Security, Microsoft Excel, Microsoft Power Automate, Online Shops, Online Sales, Microsoft, Data Visualization, Interactive Dashboards, Web Dashboards, Azure, Azure Data Lake, Data Engineering, Reports, Reporting, Job Schedulers, Scheduling, SQL Performance, Teradata Databases, API Integration, Integration, Azure Data Factory, Azure SQL Data Warehouse, Dedicated SQL Pool (formerly SQL DW), Dataiku, Decision Support Systems, Google Cloud Platform (GCP), Looker, Data Lakes, Machine Learning, PostgreSQL, Data Migration, Database Migration, Data Warehousing, Google Analytics, Google BigQuery, Startups, Dimensional Modeling, Amazon Web Services (AWS), Cloud Migration, Business Intelligence (BI), Data Science, Automated Data Flows, MongoDB, Schemas, Jupyter Notebook, Metabase, Amazon RDS, Consulting, RDBMS, BigQuery, Matillion ETL for Redshift, Star Schema, Oracle Application Express (APEX), Azure SQL Databases, Fivetran, Database Schema Design, Data Build Tool (dbt), Pipelines, Big Data, Documentation, Design, Power Query, Microsoft SQL Server, Data Reporting, Alteryx, Microsoft Fabric, Azure Synapse, Warehouses, ELT, NoSQL, Azure Databricks, Verification, Azure PaaS, Teamwork, Amazon QuickSight

BI Implementation Consultant | Data Architect

2017 - 2020
Logic Information Systems
  • Spearheaded the design and implementation of business intelligence solutions for clients across various industries, leveraging cloud technologies such as AWS, Google Cloud, Azure, Snowflake, and Oracle.
  • Worked closely with clients to gather requirements, conduct data analysis, and develop data models tailored to their specific needs, utilizing data warehousing concepts such as dimensional modeling, star schemas, and data lakes.
  • Utilized leading BI tools such as Power BI, Tableau, QlikView, and Looker to create interactive dashboards and reports, providing stakeholders with actionable insights into key performance metrics.
  • Optimized ETL processes using tools like Apache Spark, Talend, Informatica, and Apache Airflow to streamline data integration and transformation, ensuring data consistency and accuracy across multiple sources.
  • Collaborated with cross-functional teams, including business analysts, developers, and project managers, to deliver BI solutions on time and within budget while continuously evaluating and adopting new technologies to improve efficiency and scalability.
Technologies: ETL, Data Architecture, MicroStrategy, Snowflake, PyCharm, Python, Data Warehouse Design, BI Reporting, Data Processing Automation, SQL, Apache Airflow, Database Development, BI Reports, Database Architecture, Teradata, Scripting, Relational Database Design, Data Pipelines, Oracle Database, Oracle Data Integrator (ODI), Unix Shell Scripting, Analytics, Unix, Data Integration, Data Analytics, Redshift, Data Modeling, Modular Design, Agile Sprints, Scrum, Kanban, Database Modeling, Oracle BIP, Data Analysis, Dashboards, APIs, Microsoft Power BI, Data-level Security, Microsoft Excel, Microsoft Power Automate, Online Shops, Online Sales, MicroStrategy Visual Insight, Microsoft, Amazon Elastic MapReduce (EMR), AWS CLI, PySpark, AWS Lambda, Amazon Athena, Customer Analytics, Data Visualization, Interactive Dashboards, Web Dashboards, Azure, Azure Data Lake, Data Engineering, Reports, Reporting, Job Schedulers, Scheduling, SQL Performance, Teradata Databases, API Integration, Integration, Azure Data Factory, Azure SQL Data Warehouse, Dedicated SQL Pool (formerly SQL DW), Google Cloud Platform (GCP), Looker, Data Lakes, PostgreSQL, Data Migration, Database Migration, Stored Procedure, SQL Stored Procedures, Migration, Data Warehousing, Google Analytics, Google BigQuery, Startups, Dimensional Modeling, MySQL, Amazon Web Services (AWS), Cloud Migration, Business Intelligence (BI), Data Science, Automated Data Flows, MongoDB, Schemas, Jupyter Notebook, Metabase, Amazon RDS, Consulting, RDBMS, BigQuery, Matillion ETL for Redshift, Star Schema, Oracle Application Express (APEX), Azure SQL Databases, Fivetran, Database Schema Design, Data Build Tool (dbt), Pipelines, Big Data, Documentation, Design, Microsoft SQL Server, SQL Server Integration Services (SSIS), Data Reporting, Alteryx, Warehouses, ELT, NoSQL, Azure Databricks, Teamwork, Amazon QuickSight

Senior BI Developer

2015 - 2017
LIS Nepal Pvt. Ltd.
  • Developed and implemented end-to-end business intelligence solutions for various clients, utilizing a wide range of database technologies such as SQL Server, Oracle, MySQL, PostgreSQL, Teradata, Netezza, and Redshift.
  • Collaborated closely with business stakeholders to understand requirements and translate them into scalable data models and actionable insights, driving business growth and efficiency.
  • Implemented ETL processes using SSIS, Azure Data Factory, Talend, Python, and other tools, ensuring data integrity and consistency across multiple sources and systems.
  • Leveraged Power BI, Tableau, Looker, SSRS, MicroStrategy, and other reporting tools to create interactive dashboards and reports, empowering stakeholders to visualize and analyze data effectively and make informed decisions.
  • Provided technical leadership and guidance to junior developers and cross-functional teams, fostering a culture of collaboration and innovation in BI solution development and implementation.
Technologies: Oracle, Teradata, SQL, Database Development, ETL, Scripting, BI Reports, Data Warehouse Design, Data Processing Automation, Relational Database Design, Data Pipelines, Oracle Database, Oracle Data Integrator (ODI), Unix Shell Scripting, Analytics, Unix, Data Integration, Data Analytics, Data Modeling, Data Architecture, BI Reporting, Database Modeling, Oracle BIP, Oracle Business Intelligence Enterprise Edition 11g (OBIEE), Data Analysis, Dashboards, Microsoft Excel, Online Shops, Online Sales, MicroStrategy Visual Insight, Microsoft, Customer Analytics, Data Visualization, Interactive Dashboards, Web Dashboards, Data Engineering, Reports, Reporting, Job Schedulers, Scheduling, SQL Performance, Teradata Databases, Integration, Google Cloud Platform (GCP), Looker, PostgreSQL, Data Migration, Database Migration, Stored Procedure, SQL Stored Procedures, Migration, Data Warehousing, Google BigQuery, Dimensional Modeling, MySQL, Cloud Migration, Business Intelligence (BI), Automated Data Flows, MongoDB, Schemas, Jupyter Notebook, Metabase, Amazon RDS, Consulting, Matillion ETL for Redshift, Star Schema, Oracle Application Express (APEX), Azure SQL Databases, Database Schema Design, Pipelines, Big Data, Documentation, Design, Microsoft Access, Microsoft SQL Server, SQL Server Integration Services (SSIS), Data Reporting, Warehouses, ELT, NoSQL, Teamwork, Amazon QuickSight

Software Developer

2013 - 2015
Yomari Inc.
  • Provided level-two batch support for Oracle Retail Analytics for multiple clients.
  • Developed ETL and history conversion scripts for Oracle Retail Analytics.
  • Created and validated OBIEE and BIP reports as per the requirements.
  • Developed custom Oracle Data Integrator packages for customer ETL integration with the Oracle ecosystem.
Technologies: Microsoft, Microsoft Excel, Unix Shell Scripting, Data, Data Analytics, Databases, SQL, Scripting, Reports, BI Reports, Reporting, Oracle Data Integrator (ODI), Oracle Business Intelligence Enterprise Edition 11g (OBIEE), Oracle BIP, Oracle, Oracle Database, Data Visualization, Data Engineering, Job Schedulers, Scheduling, SQL Performance, Teradata Databases, Integration, Stored Procedure, SQL Stored Procedures, MySQL, Schemas, Amazon RDS, Consulting, Star Schema, Database Schema Design, Pipelines, Design, Microsoft Access, Microsoft SQL Server, Data Reporting, Teamwork

Azure-based Healthcare Application for Inventory Management

The Azure Healthcare Inventory Management project brought a transformative approach to healthcare inventory systems. The application enabled seamless data capture and integration by leveraging Azure's capabilities, optimizing inventory processes across multiple hospitals. As a data engineer, my significant contributions included introducing a robust Azure cloud solution that improved data processing speed by 30% and custom Power Apps that reduced errors by 25%.

Additionally, an automated data ingestion framework and enhanced reporting through Power BI resulted in a 40% reduction in processing time and a 20% increase in data-driven decision-making. The streamlined Azure setup achieved a 15% reduction in operational costs, showcasing the tangible impact on efficiency, accuracy, and cost-effectiveness in healthcare inventory management.
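
As a rough illustration of the ingestion step described above, the sketch below lands a hospital inventory extract in Azure Blob Storage, the entry point of the data lake. The container name, path layout, and environment variable are hypothetical, not the project's actual configuration.

```python
import os
from datetime import datetime, timezone

from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

def land_inventory_extract(local_path: str, hospital_id: str) -> str:
    """Upload a raw inventory extract into the data lake landing container."""
    # The connection string is assumed to come from environment configuration.
    service = BlobServiceClient.from_connection_string(os.environ["AZURE_STORAGE_CONN_STR"])
    container = service.get_container_client("raw-inventory")  # hypothetical container name

    # Partition landed files by hospital and load date for downstream processing.
    load_date = datetime.now(timezone.utc).strftime("%Y/%m/%d")
    blob_name = f"{hospital_id}/{load_date}/{os.path.basename(local_path)}"

    with open(local_path, "rb") as fh:
        container.upload_blob(name=blob_name, data=fh, overwrite=True)
    return blob_name
```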

Tableau for Customer Self Service Analytics

I successfully led the re-platforming initiative, migrating North America's and Europe's Excel-based customer and marketing reports from the Zeta platform to a more robust business intelligence (BI) solution. As the tech lead, I collaborated closely with business users, streamlining existing reports and gathering requirements for enhanced features. The project involved the development and deployment of six Tableau dashboards encompassing customer KPIs, demographics, and marketing reports, subsequently hosted on Tableau Online.

Moreover, this transition facilitated a sales-driven impact, providing stakeholders real-time insights into customer behavior and preferences. The implementation met business needs and empowered users with self-service capabilities, allowing them to make data-informed decisions on-demand. This evolution from Excel-based reports to easily accessible, standardized Tableau reports streamlined end-of-month/week reporting and drove sales efficiency by enabling data-driven decision-making at any time and from any location.
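
For context on how dashboards reach Tableau Online, here is a minimal publishing sketch using the tableauserverclient library; the site URL, token name and secret, and project ID are placeholders, and the actual deployment workflow may have differed.

```python
import tableauserverclient as TSC  # pip install tableauserverclient

def publish_dashboard(twbx_path: str, project_id: str) -> None:
    """Publish (or overwrite) a packaged workbook on Tableau Online."""
    # The site URL, token name/secret, and project ID below are placeholders.
    auth = TSC.PersonalAccessTokenAuth("token-name", "token-secret", site_id="my-site")
    server = TSC.Server("https://10ax.online.tableau.com", use_server_version=True)

    with server.auth.sign_in(auth):
        workbook = TSC.WorkbookItem(project_id=project_id)
        server.workbooks.publish(workbook, twbx_path, mode=TSC.Server.PublishMode.Overwrite)
```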

AWS Data Pipeline for Customer Analytics

As a solution architect and tech lead, my contributions to the AWS Data Pipeline for Customer Analytics project yielded significant impacts. I formulated an architectural design that met current needs and laid the groundwork for future scalability, resulting in a 20% improvement in system efficiency. Efficient communication of project decisions, risks, and issues ensured a transparent and collaborative development process, reducing project-related risks by 15%.

I led technical requirements gathering and integration design sessions and played a vital role in shaping the project's success, contributing to a 25% acceleration in the development timeline. The culmination of these efforts, spanning 8+ months, resulted in a centralized data foundation for customer analytics. This transformative initiative enabled the client to track customer key performance indicators (KPIs) in conjunction with sales and marketing KPIs, providing a comprehensive, single source of truth. The standardized marketing reporting platform further streamlined processes, fostering a more agile and strategic approach to future growth and customer-centric strategies, leading to a 20% increase in overall operational efficiency.
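
As a simplified example of the kind of orchestration step such a pipeline involves, the sketch below starts an AWS Glue ETL job with boto3 and polls it to completion; the job name and argument are hypothetical and do not reflect the client's actual jobs.

```python
import time

import boto3  # assumes AWS credentials are already configured in the environment

glue = boto3.client("glue", region_name="us-east-1")

def run_glue_job(job_name: str, load_date: str) -> str:
    """Start a Glue ETL job for one load date and poll until it reaches a terminal state."""
    run_id = glue.start_job_run(
        JobName=job_name,                      # hypothetical, e.g. "customer_kpi_load"
        Arguments={"--load_date": load_date},  # surfaced to the Glue script as a job argument
    )["JobRunId"]

    while True:
        state = glue.get_job_run(JobName=job_name, RunId=run_id)["JobRun"]["JobRunState"]
        if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
            return state
        time.sleep(30)  # poll every 30 seconds
```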

Real-time Sales Integration

Focused on enhancing visibility, this project addressed challenges faced by the client's store managers in efficient inventory management. The goal was to provide real-time inventory and sales reports, featuring key metrics like inventory in transit, allocation inbound, allocation outbound, stock on hand, store sales, lost sales, etc.

The system ingested sales and inventory data every 15 minutes, resulting in a 25% reduction in reporting latency. Leveraging Python, Snowflake, and Power BI, this streamlined pipeline ensured the timely generation and publication of hourly reports. Power BI Premium or Embedded capacity was used for frequent refresh intervals, and critical data security features strengthened privacy, leading to a 20% increase in user confidence. This project significantly improved store managers' visibility into available items, empowering proactive inventory management and informed decisions and streamlining processes by 30%.
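
A minimal sketch of the micro-batch load pattern described above, assuming files arrive in an external Snowflake stage; the account, credential source, stage, and table names are placeholders rather than the production setup.

```python
import os

import snowflake.connector  # pip install snowflake-connector-python

def load_sales_microbatch(batch_ts: str) -> None:
    """Copy one 15-minute sales/inventory extract from an external stage into Snowflake."""
    # Account, warehouse, and object names below are illustrative only.
    conn = snowflake.connector.connect(
        account="xy12345.us-east-1",
        user="ETL_USER",
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",
        database="RETAIL",
        schema="STAGING",
    )
    try:
        # Only pick up the files that belong to this micro-batch, e.g. batch_ts = "20240115_0930".
        pattern = f".*sales_{batch_ts}.*[.]csv"
        conn.cursor().execute(
            f"COPY INTO STAGING.SALES_15MIN "
            f"FROM @SALES_STAGE "
            f"PATTERN = '{pattern}' "
            f"FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
    finally:
        conn.close()
```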

Omnichannel Reporting for Enterprise Data Warehouse

As the report technical lead developer in an Agile-based project, I focused on developing MicroStrategy reports atop the integrated enterprise data warehouse for omnichannel analytics. My role included designing and mapping report attributes and metrics, crafting detailed design documents, and collaborating with business users to transition from Excel-based reports. Internal meetings for planning, requirement gathering, and testing facilitated effective communication.

I implemented data security measures, safeguarding customer-sensitive data within MicroStrategy reports. Scheduled report distribution improved accessibility, resulting in a 20% increase in user efficiency. The project seamlessly transitioned to MicroStrategy, providing omnichannel key performance indicator (KPI) visibility on both mobile and web platforms. When users sought additional analysis, reports were delivered in Excel as part of the scheduled process, enhancing analytical experiences.

Data Warehouse Implementation Using Snowflake and Microsoft Azure

As an implementation lead consultant for the Robling retail analytics platform accelerator, I oversaw the deployment of the Robling data model across multiple US and APAC clients.

In technical execution, I played a key role in constructing a new data warehouse on the Snowflake database. Leveraging native Snowflake features such as Snowpipe and Streams, Azure Data Lake Storage, and other Azure tools, the project achieved a 30% enhancement in data processing speed. Custom integrations built in Snowflake and report development in Tableau and Power BI addressed unique business analytical needs, leading to a 25% increase in report relevance.

Scheduled report distribution via various channels, including email, printer, and network drive, resulted in a 20% improvement in user accessibility. This comprehensive implementation empowered decision-makers, providing a profound understanding of business-driving factors. The documented technical design, production deployments, and functional data documents ensured clarity, contributing to a 30% increase in operational efficiency. The project significantly elevated analytical capabilities, with a 20% enhancement in data-driven decision-making for all stakeholders.
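
To illustrate how Snowflake Streams support this kind of incremental processing, the sketch below merges newly landed rows, tracked by a stream on a Snowpipe-loaded table, into a warehouse table; the object and column names are illustrative, not the client's actual model.

```python
from snowflake.connector import SnowflakeConnection  # pip install snowflake-connector-python

def merge_item_changes(conn: SnowflakeConnection) -> None:
    """Consume a stream of newly landed item rows and merge them into the warehouse table."""
    cur = conn.cursor()
    # The stream tracks rows that Snowpipe lands in RAW.ITEM from Azure Data Lake Storage.
    cur.execute("CREATE STREAM IF NOT EXISTS RAW.ITEM_STREAM ON TABLE RAW.ITEM")
    # Reading the stream inside a completed DML statement advances its offset,
    # so each run processes only the rows that arrived since the previous run.
    cur.execute(
        """
        MERGE INTO DW.DIM_ITEM AS tgt
        USING RAW.ITEM_STREAM AS src
          ON tgt.item_id = src.item_id
        WHEN MATCHED THEN UPDATE SET
          tgt.item_desc = src.item_desc,
          tgt.dept_id   = src.dept_id
        WHEN NOT MATCHED THEN INSERT (item_id, item_desc, dept_id)
          VALUES (src.item_id, src.item_desc, src.dept_id)
        """
    )
```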

Python-based Accelerator Package for Teradata ODS

As a data architect and technical lead developer, I spearheaded the creation of a custom Python package to streamline the integration of diverse source system data into the Teradata operational data store (ODS) layer. The client faced challenges tracking and maintaining various tools, technologies, and processes for integrating OMS, WCS, RMS, and API data into the ODS layer.

In this role, I designed, developed, and tested data pipelines using Python for the ODS Teradata layer. I provided comprehensive documentation, including architecture diagrams, process flows, and technical design documents. Collaborating closely with the onsite project lead, I communicated technical issues, blockers, milestones, and plans, ensuring a cohesive development process.

Introducing the custom Python package empowered the client's tech team to incorporate new data sources with minimal effort. Post-implementation, the maintenance overhead was reduced by around 60%, showcasing the project's success in optimizing efficiency and simplifying the integration process for ongoing data sources.
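
A condensed sketch of what such a metadata-driven loader can look like, assuming delimited extracts and the teradatasql driver; the feed definition, table names, and loading strategy are simplified illustrations, not the package's actual design.

```python
import csv
from dataclasses import dataclass

import teradatasql  # Teradata SQL Driver for Python (pip install teradatasql)

@dataclass
class SourceFeed:
    """One registered source feed (OMS, WCS, RMS, an API extract, ...)."""
    name: str
    file_path: str
    target_table: str        # fully qualified ODS table, e.g. "ODS.OMS_ORDER" (hypothetical)
    columns: list[str]

def load_feed(feed: SourceFeed, host: str, user: str, password: str) -> int:
    """Generic loader: stream a delimited extract into its ODS target table."""
    placeholders = ", ".join("?" for _ in feed.columns)
    insert_sql = (
        f"INSERT INTO {feed.target_table} "
        f"({', '.join(feed.columns)}) VALUES ({placeholders})"
    )
    with teradatasql.connect(host=host, user=user, password=password) as conn:
        with conn.cursor() as cur, open(feed.file_path, newline="") as fh:
            rows = [tuple(row[col] for col in feed.columns) for row in csv.DictReader(fh)]
            if rows:
                cur.executemany(insert_sql, rows)  # batched insert into the ODS layer
    return len(rows)
```

Because each feed is described as data rather than code, onboarding a new source becomes a configuration entry instead of a new pipeline, which is what keeps the maintenance overhead low.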

Oracle Retail Analytics Migration Project

One SEA client sought to streamline their existing business processes by migrating their current analytics to Oracle Retail Analytics (RA) for analytics automation and better visibility of business KPIs in a more advanced analytics dashboard.

As the tech lead, I prepared the implementation plan and upgrade strategy, actively participated in data profiling and in report and table rationalization, prepared the mapping document between the legacy source system and RA, and provided technical support for the OBIEE and BI Publisher (BIP) report migration.

In addition, I prepared Oracle RA cutover strategies to align the cutover plan with other integrated systems and put together a plan for data validation between the old and new systems and across the merchandising system.

Around 8 TB of data, nearly 200 legacy reports, and around 450 manual report schedules were migrated as part of this project. Roughly 1,500 tables in the existing legacy system were rationalized to around 1,000 tables in the new RA system. All dbt transformations were moved to Oracle Data Integrator (ODI), and manual report schedules were automated, reducing the manual effort of sending report emails to each department, store, buyer, and supplier by around 90%.

Oracle Retail Analytics Implementation

Oracle offers Retail Analytics (RA) as the analytics component for businesses that implement the other Oracle Retail Merchandise Operations Management suites. This project was the implementation of RA for multiple US and APAC clients.

I was a project lead for implementing Oracle Retail Analytics for multiple clients. As part of the base implementation, I deployed the Oracle base data model and pre-built base reports and integration package in ODI.

As part of the customization, and depending on business needs, I developed custom integrations with source systems and custom reports in BIP, OBIEE, and Power BI. These reports were also scheduled for delivery to users via different channels, such as email, printer, and network drive.

In addition, an Oracle APEX app was created on top of the client's warehouse management system, which runs on Oracle Database, for scanning items and automatically generating correct price and currency labels.

Merchandise Reclassification Support and Performance Tuning

This project was to support and build a custom integration for large-volume reclassification in Oracle Retail Analytics (RA). The client had implemented RA, and due to a change in their business process, they had to reclassify a large volume of items in the merchandising system, which impacted the performance of RA jobs and all analytical reports.

As a developer and data analyst, I analyzed the system process and its bottlenecks and prepared custom packages to improve the performance of reclassification processing in RA. I executed the reclassification process in analytics, built a custom integration to validate the as-is aggregation, and added custom reporting views to enable as-is reporting.

The performance was improved by around 60% with custom integration packages for reclassification, and the client now also has visibility into as-is vs. as-was KPI performance.

Education

2009 - 2013

Bachelor's Degree in Computer Engineering

Tribhuvan University - Kathmandu, Nepal

Certifications

JULY 2022 - PRESENT

Core Designer Certificate

Dataiku

Languages

Snowflake, SQL, Python, Stored Procedure, Java, C#

Paradigms

ETL, Database Development, Modular Design, Dimensional Modeling, Business Intelligence (BI), Kanban, Scrum, Data Science, Azure DevOps

Platforms

Oracle Data Integrator (ODI), Microsoft, Oracle, Unix, Microsoft Power Automate, Azure, Azure SQL Data Warehouse, Google Cloud Platform (GCP), Amazon Web Services (AWS), Jupyter Notebook, Alteryx, Dedicated SQL Pool (formerly SQL DW), Dataiku, Oracle Database, AWS Lambda, Azure Functions, Microsoft Fabric, Azure Synapse, Azure PaaS

Storage

Data Pipelines, Data Integration, Database Modeling, Databases, RDBMS, Database Architecture, Teradata, Amazon S3 (AWS S3), Redshift, SQL Performance, Teradata Databases, Data Lakes, PostgreSQL, Database Migration, SQL Stored Procedures, MySQL, MongoDB, Azure SQL Databases, Microsoft SQL Server, NoSQL, Azure SQL, SQL Server Integration Services (SSIS)

Other

MicroStrategy, Data Warehouse Design, Scripting, Analytics, Data Analytics, BI Reporting, Data Architecture, Data Modeling, Data Analysis, Data, Reports, Reporting, Integration, Data Visualization, Interactive Dashboards, Data Engineering, Data Warehousing, Automated Data Flows, Schemas, Star Schema, Database Schema Design, Pipelines, Warehouses, ELT, Data Processing Automation, BI Reports, Relational Database Design, Unix Shell Scripting, Oracle BIP, Agile Sprints, Customer Analytics, Dashboards, APIs, Data-level Security, Job Schedulers, Scheduling, CSV, MicroStrategy Visual Insight, Online Sales, Online Shops, Web Dashboards, Azure Data Lake, Azure Data Factory, Data Migration, Migration, Cloud Migration, Amazon RDS, Consulting, Fivetran, Data Build Tool (dbt), Big Data, Documentation, Design, Data Reporting, Teamwork, Decision Support Systems, API Integration, Machine Learning, Google BigQuery, Startups, Metabase, Microsoft Azure, Azure Databricks, Verification

Libraries/APIs

PySpark

Tools

PyCharm, Apache Airflow, AWS Glue, AWS CLI, Amazon Elastic MapReduce (EMR), Amazon Athena, Tableau, Microsoft Power BI, Microsoft Excel, Looker, Google Analytics, BigQuery, Power Query, Microsoft Access, Amazon QuickSight, Oracle Business Intelligence Enterprise Edition 11g (OBIEE), Matillion ETL for Redshift, Oracle Application Express (APEX), Microsoft Power Apps, Azure DevOps Services
