
Nitesh Shankarlal Lohar
Verified Expert in Engineering
Data Engineering Developer
Ahmedabad, Gujarat, India
Toptal member since June 18, 2020
Nitesh is a data steward and technology leader with an accomplished and demonstrated history in data modeling, data warehousing, business intelligence, and business process reengineering. Nitesh is also quite comfortable developing, designing, and architecting solutions in the big data and data science landscape.
Portfolio
Experience
- Data Engineering - 10 years
- Snowflake - 6 years
- Databricks - 5 years
- Large Language Models (LLMs) - 2 years
Availability
Preferred Environment
Amazon Web Services (AWS), Snowflake, Databricks, Large Language Models (LLMs), Business Intelligence (BI)
The most amazing...
...project: accelerating time to market, enhancing flexibility, reducing complexity, and minimizing redundancy by introducing and managing dynamic workflow rules.
Work Experience
Data Architect
Ridgeant Technologies
- Drove continuous improvement of data quality by implementing metrics-based decision-making.
- Stabilized the applications and environment by driving logging, monitoring, analysis, access control limits, and test/stage tech stacks.
- Advised the corporate, sales, product, and legal teams in strategy, design, machine learning, process, data protection, and logical flow.
- Worked with a wide range of database types, from building systems from scratch to consulting on mature deployments.
- Applied a strong background in data mining/analysis, probability, statistics, and computer science.
- Developed and deployed dashboards, visualizations, and autonomous and dynamic reporting interfaces to be distributed to stakeholders via the Power BI reporting platform, web portal, mobile, tablet devices, widgets, and email.
- Designed data transformations in Power BI and formulas using DAX to solve various complex KPIs.
Data Visualization Engineer
LegalWiz Analytics
- Gathered business requirements; defined and designed the data sourcing and data flows; analyzed data quality; and worked in conjunction with the data warehouse architect on the development of logical data models.
- Derived data designs, identified opportunities to reengineer data structures, and recommended efficiencies as they related to data storage and retrieval.
- Developed a data warehousing-centered data analysis and data modeling strategic direction for the department.
- Participated as a senior data analyst—gathering information on source systems, processing logic, content, and operating system usage.
- Worked with business partners to understand business objectives and develop value-added reporting and provide ad-hoc data extracts and analysis.
- Developed and deployed dashboards, visualizations, and autonomous and dynamic reporting interfaces to be distributed to stakeholders via the Power BI reporting platform, web portal, mobile, tablet devices, widgets, and email.
- Worked on various advanced analytics, including predictive analytics, big data analytics, and visualization innovation in support of marketing and enterprise objectives, with advanced reporting such as churn rate and LTV reports, marketing dashboards, and daily, weekly, and monthly KPI monitoring dashboards.
Experience
Bouqs
• Consulted and provided "value-added" insights in the form of analysis, interpretation, and advice on campaign reporting.
• Designed, executed, and measured high-impact direct-marketing experiments.
• Continuously improved the campaign design and delivery by leveraging analytical tools and techniques.
Meundies
• Worked with business users and business analysts for requirements gathering and business analysis.
• Designed and customized data models for a data warehouse supporting data from multiple sources in real time.
• Integrated various sources into the staging area of a data warehouse to consolidate and cleanse data.
• Contributed to the build of a data warehouse, which included the design of a data mart using a star schema.
• Designed and developed CAC, LTV, and executive dashboards.
• Developed daily, monthly, and yearly data models and dashboards for reviewing day-to-day campaigns and optimizing campaign costs.
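The CAC and LTV dashboards above rest on simple unit-economics formulas. A minimal Python sketch of the underlying calculations, with all figures and the simple multiplicative LTV model chosen purely for illustration (the actual dashboards used the company's own data and definitions):

```python
# Illustrative CAC and LTV calculations like those behind the dashboards
# described above. All inputs and the simple multiplicative LTV model
# are assumptions for demonstration, not taken from the actual project.

def cac(marketing_spend: float, customers_acquired: int) -> float:
    """Customer acquisition cost: total spend / new customers acquired."""
    return marketing_spend / customers_acquired

def ltv(avg_order_value: float, orders_per_year: float,
        gross_margin: float, years_retained: float) -> float:
    """Simple lifetime value: gross margin per year * years retained."""
    return avg_order_value * orders_per_year * gross_margin * years_retained

# Example: $50,000 spend acquiring 1,000 customers -> CAC of $50.
print(cac(50_000, 1_000))   # 50.0
# $40 AOV, 4 orders/year, 60% margin, 3-year retention -> LTV of $288.
print(ltv(40, 4, 0.6, 3))   # 288.0
```

A dashboard would typically compare LTV to CAC per acquisition channel; an LTV/CAC ratio well above 1 indicates the channel pays for itself.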
Itemize
• Administered and managed the entire development, QA, and production environment.
• Increased database performance by utilizing MySQL config changes, multiple instances, and hardware upgrades.
• Assisted with sizing, query optimization, buffer tuning, backup and recovery, installations, upgrades, and security including other administrative functions as part of the profiling plan.
• Ensured that the production data was being replicated into a data warehouse without any data anomalies from the processing databases.
• Worked with the engineering team to implement new design systems of databases used by the company.
• Created data extracts as part of data analysis and exchanged them with internal staff.
• Performed MySQL replication setup and administration in master-slave and master-master topologies.
• Documented all servers and databases.
Value IQ
In a webshop data-load project, the client had an in-house system called front-controller that exported data as flat files. The project required an ETL job to read these files and transform the data per business requirements before inserting it into a database.
In the product/category load project, the front-controller system uploaded product/category information in a CSV file, and an ETL job read these files and stored the data in a database after applying the business rules.
For housekeeping, I created an ETL job to purge data from the database based on business rules.
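The flat-file jobs above follow a common read-transform-load pattern. A minimal, self-contained Python sketch of that pattern using SQLite; the file layout, column names, and the business rule are invented for illustration (the actual project used the client's front-controller files and rules):

```python
# Hedged sketch of the flat-file ETL pattern described above: read a
# front-controller CSV export, apply a business rule, load into a database.
# Columns, table schema, and the "skip records without a price" rule are
# illustrative assumptions only.
import csv
import io
import sqlite3

def load_products(csv_text: str, conn: sqlite3.Connection) -> int:
    """Parse CSV rows, apply business rules, and insert the survivors."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS products (sku TEXT, category TEXT, price REAL)"
    )
    rows = []
    for rec in csv.DictReader(io.StringIO(csv_text)):
        # Business rule (illustrative): skip records with no price.
        if not rec["price"]:
            continue
        # Transform: normalize the category, cast the price.
        rows.append((rec["sku"], rec["category"].strip().lower(), float(rec["price"])))
    conn.executemany("INSERT INTO products VALUES (?, ?, ?)", rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
data = "sku,category,price\nA1,Shoes ,19.99\nA2,Bags,\n"
print(load_products(data, conn))  # 1 -- A2 is skipped (no price)
```

In production, a dedicated ETL tool (Pentaho Kettle or Talend, both listed above) typically replaces hand-rolled parsing, but the read-transform-load shape is the same.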
Pingan Bank
• Worked with business users and business analysts for requirements gathering and business analysis.
• Customized Saiku to export to Excel in an optimized way and implemented access management for users.
• Optimized Mondrian for a large number of users in Pentaho.
• Cached the Saiku reports using an ETL and Saiku APIs for performance tuning.
• Optimized cubes and redesigned them for better maintenance and performance.
• Implemented CDC on a Pentaho Application Server cluster.
• Created the design architecture of the system.
• Implemented an SSO and customized Saiku.
PepsiCo
• Analyzed, designed, and developed database objects and interfaces.
• Translated business needs into a functional database design.
• Wrote PL/SQL procedures for processing business logic in the database.
• Tuned SQL queries for better performance.
• Composed Unix scripts to perform various routine tasks.
• Handled release preparations and customer communications on the help-desk system.
• Developed technical design documents and test cases.
• Provided customer support for various issues and change tickets.
Retail Product-based ERP Solution | SPEC INDIA
• Contributed to the requirements gathering and business analysis.
• Developed standard reports, cross-tab reports, charts, and drill-through Pentaho reports and dashboards.
• Generated and updated documentation of processes and systems.
• Performed unit testing and tuned for better performance.
• Was involved in user acceptance testing.
International News Agency | Australia
• Converted business requirements into high-level and low-level designs.
• Designed and customized data models for a data warehouse supporting data from multiple sources in real time.
• Integrated various sources into the staging area of a data warehouse to consolidate and cleanse data.
• Contributed to building a data warehouse, which included a data mart design using a star schema.
• Extracted data from different XML files, flat files, MS Excel, and web services.
• Transformed the data based on user requirements using Pentaho Kettle and loaded the data into the target by scheduling jobs.
• Worked on Pentaho Kettle, data warehouse design, transformations, and jobs.
• Developed Pentaho Kettle jobs and tuned the transformation for better performance.
LegalWiz Analytics
• Defined and designed the data sourcing and data flows.
• Analyzed data quality.
• Worked in conjunction with the data warehouse architect on the development of logical data models.
• Connected heterogeneous data sources like Google Analytics.
• Worked with the operational database, affiliate sites, and populated data to a centralized data warehouse using Talend Integration.
• Developed advanced analytics reports, e.g., customer lifetime values, churn rate, marketing dashboards, affiliate performance reports, and resource dashboards.
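Of the reports mentioned above, churn rate has a particularly simple definition: the fraction of period-start customers lost during the period. A small illustrative sketch in Python (the inputs are hypothetical):

```python
# Illustrative churn-rate calculation behind reports like those above;
# the figures are hypothetical examples.

def churn_rate(customers_at_start: int, customers_lost: int) -> float:
    """Fraction of period-start customers lost during the period."""
    return customers_lost / customers_at_start

# 2,000 customers at the start of the month, 100 cancelled -> 5% churn.
print(churn_rate(2_000, 100))  # 0.05
```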
Education
Master's Degree in Computer Science
North Gujarat University - Patan, Gujarat, India
Bachelor's Degree in Computer Science
North Gujarat University - Patan, India
Certifications
SnowPro Advanced: Data Scientist
Snowflake
SnowPro Advanced: Architect
Snowflake
SnowPro Core Certification
Snowflake
R Developer
Coursera
MongoDB DBA Certification
MongoDB
Oracle Certified SQL Developer
Oracle
Pentaho Data Integration
Pentaho
Skills
Tools
Microsoft Excel, Pentaho Data Integration (Kettle), Microsoft Power BI, Domo, Tableau, Looker, AWS Glue, Periscope, Sisense, Jenkins
Languages
SQL, Snowflake, Python
Paradigms
Database Design, ETL Implementation & Design, Business Intelligence (BI), ETL
Platforms
Amazon Web Services (AWS), Pentaho, Amazon EC2, Talend, Linux, Oracle, Databricks
Storage
MySQL, Oracle 10g, Redshift, Oracle 11g, Microsoft SQL Server, PL/SQL
Other
Data Warehouse Design, ETL Development, Pentaho Dashboard, Pentaho Reports, Data Visualization, Data Engineering, Data Modeling, Excel 365, DAX, Finance, Informatica Intelligent Cloud Services (IICS), Large Language Models (LLMs), Artificial Intelligence (AI), Big Data, Multidimensional Expressions (MDX), Data Science, Computer Science