Nitesh Shankarlal Lohar, Developer in Ahmedabad, Gujarat, India

Nitesh Shankarlal Lohar

Verified Expert in Engineering

Data Engineering Developer

Location
Ahmedabad, Gujarat, India
Toptal Member Since
June 18, 2020

Nitesh is a data steward and technology leader with an accomplished and demonstrated history in data modeling, data warehousing, business intelligence, and business process reengineering. Nitesh is also quite comfortable developing, designing, and architecting solutions in the big data and data science landscape.

Portfolio

Ridgeant Technologies
MySQL, Oracle, Microsoft SQL Server, Redshift, Big Data, Domo...
LegalWiz Analytics
Redshift, Microsoft SQL Server, Microsoft Power BI, Tableau

Experience

Availability

Part-time

Preferred Environment

Amazon Web Services (AWS), Snowflake, Databricks, Large Language Models (LLMs), Business Intelligence (BI)

The most amazing...

...project: accelerating time to market, enhancing flexibility, reducing complexity, and minimizing redundancy by introducing and managing dynamic workflow rules.

Work Experience

Data Architect

2012 - PRESENT
Ridgeant Technologies
  • Drove continuous improvement of data quality by implementing metrics-based decision making (a minimal sketch of such a check follows this role's technology list).
  • Stabilized the applications and environment by driving logging, monitoring, analysis, access control limits, and test/stage tech stacks.
  • Advised the corporate, sales, product, and legal teams in strategy, design, machine learning, process, data protection, and logical flow.
  • Worked hands-on with a wide range of databases, from building them from scratch to consulting on mature systems.
  • Applied a strong background in data mining/analysis, probability, statistics, and computer science.
  • Developed and deployed dashboards, visualizations, and autonomous and dynamic reporting interfaces to be distributed to stakeholders via the Power BI reporting platform, web portal, mobile, tablet devices, widgets, and email.
  • Designed data transformations in Power BI and wrote DAX formulas to calculate complex KPIs.
Technologies: MySQL, Oracle, Microsoft SQL Server, Redshift, Big Data, Domo, Microsoft Power BI, SQL, Informatica Intelligent Cloud Services (IICS), Data Engineering, Data Modeling, PL/SQL, ETL
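
A minimal sketch of the kind of metrics-based data quality check referenced above, in plain SQL; the staging_orders table, its columns, and the metrics are hypothetical placeholders rather than the client's actual schema:

    -- Each result row is one data quality metric that can be tracked over time
    -- and compared against an agreed threshold before a load is promoted.
    SELECT 'null_customer_id' AS metric, COUNT(*) AS violations
    FROM staging_orders
    WHERE customer_id IS NULL
    UNION ALL
    SELECT 'negative_order_amount', COUNT(*)
    FROM staging_orders
    WHERE order_amount < 0
    UNION ALL
    SELECT 'duplicate_order_id', COUNT(*) - COUNT(DISTINCT order_id)
    FROM staging_orders;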

Data Visualization Engineer

2019 - 2020
LegalWiz Analytics
  • Gathered business requirements, defined and designed data sourcing and data flows, analyzed data quality, and worked in conjunction with the data warehouse architect on the development of logical data models.
  • Derived data designs, identified opportunities to reengineer data structures, and recommended efficiencies in data storage and retrieval.
  • Developed a data warehousing-centered data analysis and data modeling strategic direction for the department.
  • Participated as a senior data analyst—gathering information on source systems, processing logic, content, and operating system usage.
  • Worked with business partners to understand business objectives, develop value-added reporting, and provide ad hoc data extracts and analysis.
  • Developed and deployed dashboards, visualizations, and autonomous and dynamic reporting interfaces to be distributed to stakeholders via the Power BI reporting platform, web portal, mobile, tablet devices, widgets and email.
  • Worked on various advanced analytics, including predictive analytics, big data analytics, and visualization innovation, in support of marketing and enterprise objectives; this included advanced reporting such as churn rate and LTV reports, marketing dashboards, and daily, weekly, and monthly KPI monitoring dashboards (a minimal sketch of such a report follows this role's technology list).
Technologies: Redshift, Microsoft SQL Server, Microsoft Power BI, Tableau
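
A minimal sketch of the kind of lifetime value (LTV) report mentioned above, in plain SQL; the orders table and its columns are hypothetical placeholders, while the production reports were built on Redshift and SQL Server and surfaced through Power BI and Tableau:

    -- Illustrative per-customer lifetime value summary.
    SELECT
        customer_id,
        MIN(order_date)   AS first_order,
        MAX(order_date)   AS last_order,
        COUNT(*)          AS order_count,
        SUM(order_amount) AS lifetime_revenue,
        AVG(order_amount) AS avg_order_value
    FROM orders
    GROUP BY customer_id
    ORDER BY lifetime_revenue DESC;

A churn-rate report follows the same pattern, dividing the customers lost in a period by the customers active at its start.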

Bouqs

Role: BI and Data Architect

• Consulted and provided "value-added" insights in the form of analysis, interpretation, and advice on campaign reporting.
• Designed, executed, and measured high-impact direct-marketing experiments.
• Continuously improved the campaign design and delivery by leveraging analytical tools and techniques.

Meundies

Role: Data Analyst

• Worked with business users and business analysts for requirements gathering and business analysis.
• Designed and customized data models for a data warehouse supporting data from multiple sources in real-time.
• Integrated various sources into the staging area of a data warehouse to consolidate and cleanse data.
• Contributed to the build of a data warehouse, which included designing a data mart using a star schema (a minimal sketch follows this list).
• Designed and developed CAC, LTV, and executive dashboards.
• Developed daily, monthly, and yearly data models and dashboards for reviewing day-to-day campaigns and optimizing campaign costs.
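
A minimal sketch of the star schema pattern used for the data mart mentioned above; table and column names are illustrative placeholders rather than the actual model:

    -- One fact table keyed to conformed dimension tables.
    CREATE TABLE dim_date (
        date_key   INTEGER PRIMARY KEY,   -- e.g., 20200618
        full_date  DATE NOT NULL,
        month      SMALLINT NOT NULL,
        year       SMALLINT NOT NULL
    );

    CREATE TABLE dim_customer (
        customer_key        INTEGER PRIMARY KEY,   -- surrogate key
        customer_id         VARCHAR(50) NOT NULL,  -- natural/business key
        acquisition_channel VARCHAR(100)
    );

    CREATE TABLE fact_orders (
        order_key    BIGINT PRIMARY KEY,
        date_key     INTEGER NOT NULL REFERENCES dim_date (date_key),
        customer_key INTEGER NOT NULL REFERENCES dim_customer (customer_key),
        order_amount DECIMAL(12,2) NOT NULL
    );

Measures such as CAC and LTV then become aggregations over fact_orders joined to the dimensions.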

Itemize

Role: Database Administrator (DBA)

• Administered and managed the entire development, QA, and production environment.
• Increased database performance by utilizing MySQL config changes, multiple instances, and hardware upgrades.
• Assisted with sizing, query optimization, buffer tuning, backup and recovery, installations, upgrades, security, and other administrative functions as part of the profiling plan.
• Ensured that the production data was being replicated into a data warehouse without any data anomalies from the processing databases.
• Worked with the engineering team to implement new design systems of databases used by the company.
• Created data extracts as part of data analysis and exchanged them with internal staff.
• Performed MySQL replication setup and administration for master-slave and master-master topologies (a minimal sketch follows this list).
• Documented all servers and databases.
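
A minimal sketch of a master-slave replication setup using standard MySQL statements; host names, credentials, and binary log coordinates are placeholders:

    -- On the master: create a dedicated replication user.
    CREATE USER 'repl'@'%' IDENTIFIED BY '********';
    GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%';

    -- On the replica: point it at the master using the coordinates reported
    -- by SHOW MASTER STATUS, then start and verify replication.
    CHANGE MASTER TO
        MASTER_HOST = 'master.example.internal',
        MASTER_USER = 'repl',
        MASTER_PASSWORD = '********',
        MASTER_LOG_FILE = 'mysql-bin.000042',
        MASTER_LOG_POS = 154;
    START SLAVE;
    SHOW SLAVE STATUS\G

A master-master topology repeats the same steps in the opposite direction, taking care to avoid conflicting writes (for example, by offsetting auto-increment settings).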

Value IQ

Role: ETL/BI Developer

In the webshop data load project, the client had their own system, called front-controller, that exported data as flat files. The project required an ETL job to read these files and transform the data per the business requirements before inserting it into a database.

In the product/category load project, the front-controller system uploaded product/category information as CSV files, and an ETL job read these files and stored the data in a database after applying the business rules.

For housekeeping, I created an ETL job that cleaned up data in the database based on business rules.
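
A minimal sketch of that flat-file load and housekeeping pattern in MySQL-flavored SQL; the file path, table names, and business rules below are illustrative placeholders:

    -- 1) Load the raw CSV into a staging table.
    LOAD DATA INFILE '/data/incoming/products.csv'
    INTO TABLE staging_products
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES
    (product_code, category_code, product_name, price)
    SET loaded_at = NOW();

    -- 2) Apply illustrative business rules while moving rows to the target table.
    INSERT INTO products (product_code, category_code, product_name, price)
    SELECT product_code, category_code, TRIM(product_name), price
    FROM staging_products
    WHERE price > 0                  -- reject invalid prices
      AND product_code IS NOT NULL;  -- reject rows without a business key

    -- 3) Housekeeping: purge staging rows older than the retention window.
    DELETE FROM staging_products
    WHERE loaded_at < NOW() - INTERVAL 30 DAY;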

Pingan Bank

Role: BI/Big Data Developer

• Worked with business users and business analysts for requirements gathering and business analysis.
• Customized Saiku to export to Excel in an optimized way and implemented access management for users.
• Optimized Mondrian for a large number of users in Pentaho.
• Cached Saiku reports using an ETL job and the Saiku APIs for performance tuning.
• Optimized cubes and redesigned them for better maintainability and performance.
• Implemented CDC on a Pentaho Application Server cluster.
• Created the design architecture of the system.
• Implemented an SSO and customized Saiku.

PepsiCo

Role: Oracle Database Administrator and Developer

• Analyzed, designed, and developed database objects and interfaces.
• Understood business needs and translated them into a functional database design.
• Wrote PL/SQL procedures for processing business logic in the database.
• Tuned SQL queries for better performance (a small sketch follows this list).
• Composed Unix scripts to perform various routine tasks.
• Handled release preparations and customer communications on the help-desk system.
• Developed technical design documents and test cases.
• Provided customer support for various issues and change tickets.
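
A minimal sketch of the kind of SQL tuning referenced above, against a hypothetical Oracle table; the original queries, tables, and indexes were client-specific:

    -- Before: wrapping the indexed column in a function prevents index use
    -- and forces a full table scan.
    SELECT order_id, order_amount
    FROM orders
    WHERE TRUNC(order_date) = DATE '2015-05-01';

    -- After: an equivalent range predicate lets an index on order_date be used.
    SELECT order_id, order_amount
    FROM orders
    WHERE order_date >= DATE '2015-05-01'
      AND order_date <  DATE '2015-05-02';

    -- Supporting index, if not already present.
    CREATE INDEX orders_order_date_ix ON orders (order_date);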

Retail Product-based ERP Solution | SPEC INDIA

Zoom is a retail industry product with interactive back-office reports for analyzing sales, forecasting sales and product demand, and breaking down sales by region, salesperson, and area, among other things.

• Contributed to the requirements gathering and business analysis.
• Developed standard reports, cross-tab reports, charts, and drill-through Pentaho reports and dashboards.
• Generated and updated documentation of processes and systems.
• Performed unit testing and tuned for better performance.
• Was involved in user acceptance testing.

International News Agency | Australia

• Worked with business users and business analysts for requirements gathering and business analysis.
• Converted business requirements into high-level and low-level designs.
• Designed and customized data models for a data warehouse supporting data from multiple sources in real time.
• Integrated various sources into the staging area of a data warehouse to consolidate and cleanse data.
• Contributed to the building of a data warehouse which included a datamart design using star schema.
• Extracted data from different XML files, flat files, MS Excel, and web services.
• Transformed the data based on user requirements using Pentaho Kettle and loaded the data into the target by scheduling jobs.
• Worked on Pentaho Kettle, data warehouse design, transformations, and jobs.
• Developed Pentaho Kettle jobs and tuned the transformation for better performance.

LegalWiz Analytics

• Gathered business requirements.
• Defined and designed the data sourcing and data flows.
• Analyzed data quality.
• Worked in conjunction with the data warehouse architect on the development of logical data models.
• Connected heterogeneous data sources like Google Analytics.
• Worked with the operational database and affiliate sites, populating data into a centralized data warehouse using Talend integration.
• Developed advanced analytics reports, e.g., customer lifetime values, churn rate, marketing dashboards, affiliate performance reports, and resource dashboards.

Languages

SQL, Snowflake, Python

Tools

Microsoft Excel, Pentaho Data Integration (Kettle), Microsoft Power BI, Domo, Tableau, Looker, AWS Glue, Periscope, Sisense, Jenkins

Paradigms

Database Design, ETL Implementation & Design, Business Intelligence (BI), ETL

Platforms

Amazon Web Services (AWS), Pentaho, Amazon EC2, Talend, Linux, Oracle, Databricks

Storage

MySQL, Oracle 10g, Redshift, Oracle 11g, Microsoft SQL Server, PL/SQL

Other

Data Warehouse Design, ETL Development, Pentaho Dashboard, Pentaho Reports, Data Visualization, Data Engineering, Data Modeling, Excel 365, DAX, Finance, Informatica Intelligent Cloud Services (IICS), Large Language Models (LLMs), Big Data, Multidimensional Expressions (MDX)

2007 - 2012

Master of Science Degree in Computer Science

North Gujarat University - Patan, India

JULY 2022 - JULY 2024

SnowPro Advanced: Architect

Snowflake

JANUARY 2022 - JANUARY 2024

SnowPro Core Certification

Snowflake

AUGUST 2016 - PRESENT

R Developer

Coursera

JULY 2016 - PRESENT

MongoDB DBA Certification

MongoDB

JULY 2015 - PRESENT

Oracle Certified SQL Developer

Oracle

MAY 2015 - PRESENT

Pentaho Data Integration

Pentaho
