
Arun Kumar Basnet

Verified Expert in Engineering

Data Specialist and Developer

Location
Kathmandu, Bagmati Province, Nepal
Toptal Member Since
August 1, 2022

Arun is a data engineer with four years of business intelligence and data warehousing expertise. His proficiency spans managing databases, crafting complex SQL queries, leveraging cloud platforms, designing and running ETL pipelines, and delivering impactful BI reports. Arun's extensive reporting and data warehousing experience has honed his ability to excel in collaborative teams while maintaining strong client relationships.

Portfolio

Elite Cloud Ops Pvt Ltd.
ADF, SQL, PostgreSQL, Blob Storage, Azure, Python 3, Data Architecture...
LIS Nepal Pvt
Data Warehouse Design, Databases, Snowflake, Oracle, MicroStrategy...
Logitix - Main
Azure Data Factory, Azure, Snowflake, Data, SQL, Python, Data Engineering, ETL...

Experience

Availability

Part-time

Preferred Environment

Oracle Retail, Snowflake, Azure Data Factory, Databases, Oracle Business Intelligence Enterprise Edition 11g (OBIEE), Tableau, SQL, Data Warehousing, ADF, Python

The most amazing...

...thing I've done is design a client's data mart and ETL framework and set up the production environment with user-level security.

Work Experience

Lead Data Engineer

2023 - PRESENT
Elite Cloud Ops Pvt Ltd.
  • Developed a Python script to scrape multiple websites, download the data, and upload it to Azure Blob Storage.
  • Created an ETL pipeline in Azure Data Factory to load the data from Azure Blob Storage into a PostgreSQL database.
  • Designed and developed an ETL pipeline to orchestrate the web-scraping scripts and the data loading into the Azure PostgreSQL database.
  • Designed and implemented the database architecture for multiple datasets in the Azure PostgreSQL database.
Technologies: ADF, SQL, PostgreSQL, Blob Storage, Azure, Python 3, Data Architecture, Modeling, Database Migration, ETL Pipelines, Database Lifecycle Management (DLM), Big Data, Data Science, Amazon S3 (AWS S3), Entity Relationships, BigQuery, GitHub, DAX, Data Transformation
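
The first two bullets above describe a scrape-and-load flow. A minimal sketch of its shape, with the Blob Storage upload injected as a callable so the path logic stays testable (in production that callable would wrap azure-storage-blob's BlobClient.upload_blob — an assumption about tooling, not the author's actual code):

```python
from datetime import date
from typing import Callable
from urllib.request import urlopen


def blob_path(site: str, run_date: date) -> str:
    """Partition raw scrapes by site and run date, e.g. raw/example-com/2024-01-15.html."""
    slug = (site.replace("https://", "").replace("http://", "")
                .rstrip("/").replace(".", "-").replace("/", "_"))
    return f"raw/{slug}/{run_date.isoformat()}.html"


def scrape_to_blob(site: str, run_date: date,
                   upload: Callable[[str, bytes], None]) -> str:
    """Download one site and hand the payload to the blob uploader."""
    with urlopen(site, timeout=30) as resp:
        payload = resp.read()
    path = blob_path(site, run_date)
    upload(path, payload)
    return path
```

Partitioning blobs by site and date keeps the downstream ADF copy activity simple: each daily pipeline run only has to pick up one folder per source.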

Senior Software Engineer

2021 - PRESENT
LIS Nepal Pvt
  • Led an offshore support team handling deliverables for a retail client, using the Oracle BI suite, including OBIEE, BIP, ODI, and MFP.
  • Acted as technical lead on a project delivering a new data model, data warehouse architecture, and data load logic, automating it with Azure Data Factory and Snowflake tasks.
  • Served as the subject matter expert (SME) on a project, assisting team members with issues and deliverables.
  • Developed a data warehouse to a client's requirements, wrote ETL scripts to load targets using Snowflake procedures, set up and maintained the production Snowflake environment, and implemented role-based access.
  • Analyzed existing Tableau reports and recreated them in Power BI as part of a report migration.
Technologies: Data Warehouse Design, Databases, Snowflake, Oracle, MicroStrategy, Oracle Business Intelligence Enterprise Edition 11g (OBIEE), Oracle BIP, ETL, ETL Tools, Integration, Reporting, Data Validation, Data Visualization, Business Intelligence (BI), Database Design, Oracle ODI, Microsoft Power BI, SQL, Data Warehousing, ADF, Data Analysis, Jira, Oracle Retail, Data Engineering, Azure, Python, Oracle Cloud, Google Sheets, Migration, SQL DML, Data Queries, SQL Performance, Dimensional Modeling, OLAP, Looker, Stored Procedure, SQL Stored Procedures, Data Migration, Scripting Languages, Apache Airflow, Data Build Tool (dbt), Data Modeling, Data Aggregation, BI Reporting, Database Development, Data Architecture, Dashboards, Qualitative Research, R, Modeling, Database Migration, ETL Pipelines, Database Lifecycle Management (DLM), Big Data, Data Science, Amazon S3 (AWS S3), Entity Relationships, Linux, DAX, Data Transformation

Data Engineer

2023 - 2023
Logitix - Main
  • Developed an application to handle daily data exports to clients based on in-house configuration tables. Integrated the Dagster framework to run the jobs and export data from the config tables to each client's target destination.
  • Created an ETL job to extract data from the Spotify Web API and store it in the Snowflake database. Developed a Dagster workflow to schedule and orchestrate the ETL pipeline and keep the data in sync as required.
  • Modified and reorchestrated existing Azure Data Factory pipelines to optimize pipeline execution.
Technologies: Azure Data Factory, Azure, Snowflake, Data, SQL, Python, Data Engineering, ETL, Apache Spark, Kibana, Dagster, Spotify API, R, Modeling, Database Migration, ETL Pipelines, Database Lifecycle Management (DLM), Big Data, Data Science, Entity Relationships, GitHub, Data Transformation

Software Engineer

2019 - 2021
LIS Nepal Pvt
  • Developed, tested, and monitored a new integration to load customer data from a source CRM into RA using KSH scripts. Designed, developed, tested, and deployed the entire ETL module.
  • Contributed to the conversion of historical data from the legacy system, developed the ETL scripts for the daily batch, validated the data, and implemented Python scripts on the Snowflake database.
  • Implemented the reporting side of the BI system using the MicroStrategy (MSTR) reporting tool, building schema objects, public objects, reports, and dashboards in MSTR.
Technologies: Data Warehousing, Databases, Snowflake, Oracle, MicroStrategy, Oracle Business Intelligence Enterprise Edition 11g (OBIEE), Oracle BIP, ETL, Reporting, Integration, Business Intelligence (BI), Data Visualization, SQL, Data Validation, Data Analysis, Data Warehouse Design, Jira, Oracle Retail, Data Engineering, Python, PostgreSQL, Oracle Cloud, Google Sheets, Azure Data Lake, Migration, SQL DML, Data Queries, SQL Performance, Sales, Sales Reports, Dimensional Modeling, OLAP, Looker, Stored Procedure, SQL Stored Procedures, Data Migration, Scripting Languages, Apache Airflow, Data Build Tool (dbt), Data Modeling, Data Aggregation, BI Reporting, Database Development, Azure, Data Architecture, Dashboards, R, ETL Pipelines, Big Data, Data Science, Entity Relationships, Linux

Associate Software Engineer

2018 - 2019
LIS Nepal Pvt
  • Worked with the Oracle Retail BI suite (RA and RI), acquiring functional and technical knowledge of ODI, OBIEE, BIP, and shell scripts.
  • Contributed to the Oracle Retail XStore POS solution in Java, modifying existing features per requirements and adding new ones.
  • Acted as a support developer for OBIEE reporting: report development and metadata modification of RPD files, plus support for report schedules and agents.
Technologies: Oracle Business Intelligence Enterprise Edition 11g (OBIEE), Oracle BIP, Java, Oracle, Snowflake, MySQL, SQL, Data Warehousing, Data Validation, Oracle Retail, Data Engineering, Pandas, Google Sheets, Migration, SQL DML, Data Queries, SQL Performance, Sales, Sales Reports, OLAP, Stored Procedure, SQL Stored Procedures, Data Migration, Scripting Languages, Apache Airflow, Data Build Tool (dbt), Data Modeling, Data Aggregation, BI Reporting, Data Architecture, Data Science, Entity Relationships, Linux

CRM to Data Warehouse Integration

The project's scope was to develop an ETL module to load the customer data into the data warehouse from the source.

I designed the ETL module on my own, developed the load logic from the mapping documents, handled the SCD changes on the customer dimension, and validated the data. I also provided batch support for the new module.
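
Handling SCD changes on a customer dimension typically follows the Type 2 pattern: expire the current row version and insert a new one. A minimal illustrative sketch — the column names and tracked attributes are assumptions, not the client's actual schema:

```python
from dataclasses import dataclass, replace
from datetime import date
from typing import Optional


@dataclass
class DimRow:
    customer_id: int
    name: str
    city: str
    valid_from: date
    valid_to: Optional[date]  # None marks the open-ended current version
    is_current: bool


def apply_scd2(dim: list[DimRow], incoming: dict, load_date: date) -> list[DimRow]:
    """SCD Type 2: expire the current row for a changed customer, append a new version."""
    out, changed = [], False
    for row in dim:
        if (row.is_current and row.customer_id == incoming["customer_id"]
                and (row.name, row.city) != (incoming["name"], incoming["city"])):
            # Attribute change detected: close the old version as of the load date.
            out.append(replace(row, valid_to=load_date, is_current=False))
            changed = True
            continue
        out.append(row)
    is_new = all(r.customer_id != incoming["customer_id"] for r in dim)
    if changed or is_new:
        out.append(DimRow(incoming["customer_id"], incoming["name"],
                          incoming["city"], load_date, None, True))
    return out
```

In the warehouse itself this logic would run as set-based SQL (MERGE plus an expiry UPDATE) rather than row-by-row Python; the sketch just makes the versioning rule explicit.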

Timesheet Data Mart

An ETL integration to source Excel files and load them into the Snowflake database.

I used Azure Data Factory to orchestrate the extraction, and Snowflake procedures and tasks to handle the loading on a daily schedule. To meet the reporting requirements, I developed various reporting views and later enhanced the project by orchestrating the entire data flow from Azure Data Factory. I also developed a module to send an email notification on batch failure.

As part of the report migration, I also built new Power BI reports and dashboards on top of the data warehouse, replacing existing Tableau reports that had been built directly on Excel files.
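
A batch-failure email notification like the one described can be sketched with the Python standard library. The sender address, SMTP host, and recipients below are placeholders, and in the real pipeline the trigger would come from ADF or Snowflake task status rather than a direct call:

```python
import smtplib
from email.message import EmailMessage


def build_failure_email(pipeline: str, error: str, recipients: list[str]) -> EmailMessage:
    """Compose the alert message for a failed batch run."""
    msg = EmailMessage()
    msg["Subject"] = f"[BATCH FAILURE] {pipeline}"
    msg["From"] = "etl-alerts@example.com"  # placeholder sender
    msg["To"] = ", ".join(recipients)
    msg.set_content(f"Pipeline {pipeline!r} failed.\n\nError:\n{error}\n")
    return msg


def notify_failure(pipeline: str, error: str, recipients: list[str],
                   smtp_host: str = "localhost") -> None:
    """Send the alert; assumes an SMTP relay reachable from the ETL host."""
    with smtplib.SMTP(smtp_host) as smtp:
        smtp.send_message(build_failure_email(pipeline, error, recipients))
```

Separating message construction from sending keeps the alert content unit-testable without a mail server.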

Project Migration from On-premise to Cloud

I contributed to the BIP and OBIEE report migration from the on-premise Oracle 11g suite to cloud ADW on Oracle 12c. This included migrating the reports and schedules using back-end tables, validating them, and deploying the schedules to the production environment in phases.

MicroStrategy (MSTR) Reporting

An MSTR-based BI report development project.

I worked as a report developer for a client, developing the MSTR schema objects, public objects, report datasets, and dashboards. I also created the deployment packages to deploy the report objects across different environments.

Scheduler Tool for Data Export

As a back-end developer, I built an application to handle daily data exports to multiple clients based on in-house configuration tables. I used the open source Dagster framework to schedule cron jobs and export the data to each destination according to the config tables.
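
The config-table-driven dispatch at the core of such a tool can be sketched as plain Python; Dagster would wrap this in a scheduled job, and the config columns shown are illustrative rather than the actual table layout:

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class ExportConfig:
    client: str
    dataset: str
    destination: str  # e.g. "s3", "sftp", "email"


def run_exports(configs: list[ExportConfig],
                exporters: dict[str, Callable[[ExportConfig], None]]) -> list[str]:
    """Dispatch each configured export to its destination handler;
    return a log of what ran, skipping unknown destinations."""
    log = []
    for cfg in configs:
        handler = exporters.get(cfg.destination)
        if handler is None:
            log.append(f"SKIP {cfg.client}/{cfg.dataset}: no handler for {cfg.destination}")
            continue
        handler(cfg)
        log.append(f"OK {cfg.client}/{cfg.dataset} -> {cfg.destination}")
    return log
```

Keeping destinations behind a handler map means adding a new export target is a config row plus one function, with no change to the scheduling code.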

Spotify Data Extractor

I developed an ETL job to extract artist data from the Spotify Web API and store it, along with artist metadata and details, in the Snowflake database. I also created a Dagster workflow to schedule and orchestrate the ETL pipeline and keep the data in sync as required.
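
The transform step of such a job — flattening the Spotify Web API artist payload into rows suitable for a Snowflake table — might look like this. The field names follow the public API's artist object; the extract itself would call the API with an OAuth token, which is omitted here:

```python
def flatten_artist(payload: dict) -> dict:
    """Map one item from a Spotify /v1/artists response to a flat row."""
    return {
        "artist_id": payload["id"],
        "name": payload["name"],
        "popularity": payload.get("popularity"),
        "followers": payload.get("followers", {}).get("total"),
        "genres": ",".join(payload.get("genres", [])),  # list collapsed for a VARCHAR column
    }
```

Alternatively, the raw JSON could be landed in a Snowflake VARIANT column and flattened in SQL; doing it in Python keeps the target table strongly typed from the start.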
2014 - 2018

Bachelor's Degree in Computer Engineering

Kathmandu University - Dhulikhel, Kavre, Nepal

MARCH 2023 - MARCH 2025

SnowPro Core Certification

Snowflake

NOVEMBER 2022 - PRESENT

Microsoft Certified: Azure Data Fundamentals

Microsoft

NOVEMBER 2022 - PRESENT

Microsoft Certified: Azure Fundamentals

Microsoft

Libraries/APIs

Pandas, REST APIs, Spotify API

Tools

Microsoft Power BI, Jira, Oracle Business Intelligence Enterprise Edition 11g (OBIEE), Tableau, GitHub, Google Sheets, Looker, Apache Airflow, Google Forms, Kibana, BigQuery

Languages

Snowflake, SQL, Python, SQL DML, Java, Stored Procedure, T-SQL (Transact-SQL), Python 3, C++, C, R

Storage

Databases, Data Integration, Data Validation, PostgreSQL, Oracle Cloud, SQL Performance, SQL Stored Procedures, Database Migration, Database Lifecycle Management (DLM), Amazon S3 (AWS S3), MySQL, Azure Blobs, Data Pipelines, Azure SQL

Paradigms

ETL, OLAP, Data Science, Business Intelligence (BI), Dimensional Modeling, Database Development, Requirements Analysis, Database Design, Oracle ODI, Qualitative Research

Frameworks

ADF, Apache Spark

Platforms

Oracle Retail, Oracle, Azure, Linux, Oracle Database, Oracle Data Integrator (ODI), Azure PaaS, Amazon Web Services (AWS), Docker

Other

Oracle BIP, Data Warehouse Design, Data Engineering, Data Queries, Data Migration, BI Reporting, Data Architecture, Dashboards, ETL Pipelines, Entity Relationships, Data Extraction, ETL Tools, Data Warehousing, Data Analysis, Data Visualization, Integration, Reporting, Data Analytics, Migration, Performance Tuning, Sales Reports, ELT, Scripting Languages, Data Build Tool (dbt), Data Modeling, Data Aggregation, Modeling, APIs, DAX, Data Transformation, Azure Data Factory, MicroStrategy, Shell Scripting, ETL Testing, BI Reports, Programming, Computer Graphics, Networking, Data Structures, Algorithms, Machine Learning, Artificial Intelligence (AI), Azure Data Lake, Sales, Cloud, Blob Storage, Data, Dagster, Big Data
