Verified Expert in Engineering
Vipen is a highly experienced data warehousing, development, and performance professional with comprehensive knowledge of OLAP and OLTP databases. He has worked as a data architect, data modeler, ETL developer, database developer, and data analyst using Vertica, Teradata, Netezza, Oracle, and SQL Server, as well as Snowflake and Python.
Multi-dimensional Databases, Git, Windows
The most amazing...
...things I've done were designing multiple management systems and resolving performance issues involving various database technologies.
- Developed a schema for calculating and displaying school scores broken down by students' race. Missing data objects could be added as empty objects to keep the processing uniform.
- Built procedures to ingest, transform, and calculate various data points for the web front end, maintaining slowly changing dimensions while processing the data.
- Overcame schema inconsistencies between different data releases by designing the process to accept extended data at any time.
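The slowly changing dimensions maintained during processing can be sketched as a Type 2 dimension with validity flags. This is a minimal illustration using SQLite in place of the production database; the table, column names, and data are hypothetical, not the actual schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE dim_school (
        school_id INTEGER,
        name TEXT,
        score REAL,
        is_current INTEGER DEFAULT 1,
        valid_from TEXT,
        valid_to TEXT
    )
""")

def upsert_scd2(cur, school_id, name, score, as_of):
    """Close out the current row if attributes changed, then insert a new version."""
    row = cur.execute(
        "SELECT name, score FROM dim_school WHERE school_id=? AND is_current=1",
        (school_id,),
    ).fetchone()
    if row == (name, score):
        return  # no change: keep the current version
    if row is not None:
        cur.execute(
            "UPDATE dim_school SET is_current=0, valid_to=? "
            "WHERE school_id=? AND is_current=1",
            (as_of, school_id),
        )
    cur.execute(
        "INSERT INTO dim_school (school_id, name, score, is_current, valid_from) "
        "VALUES (?, ?, ?, 1, ?)",
        (school_id, name, score, as_of),
    )

upsert_scd2(cur, 1, "Lincoln High", 82.0, "2023-01-01")
upsert_scd2(cur, 1, "Lincoln High", 88.5, "2023-06-01")  # score changed: new version
history = cur.execute(
    "SELECT score, is_current FROM dim_school WHERE school_id=1 ORDER BY valid_from"
).fetchall()
print(history)  # [(82.0, 0), (88.5, 1)]
```

The expired row keeps its history while the current flag always points at exactly one version per key.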
Senior Data Engineer
Leadgen Data LLC
- Set up database schema and stored procedure executions via table trigger to transform data into relevant dimensions.
- Designed the schema so that data loads quickly and the relevant transformations finish in the allotted time.
- Worked on data sharding design on Citus to support quick data processing on multiple nodes and faster query execution.
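The trigger-driven transformation described above can be sketched with SQLite standing in for the production database: an insert into the raw landing table fires a trigger that keeps a derived dimension in sync, so no separate batch step is needed. Table and column names here are illustrative assumptions.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Raw landing table and a derived dimension table (illustrative names).
cur.executescript("""
    CREATE TABLE raw_leads (id INTEGER PRIMARY KEY, email TEXT, region TEXT);
    CREATE TABLE dim_region (region TEXT PRIMARY KEY, lead_count INTEGER DEFAULT 0);

    -- Fires on every insert into the landing table and keeps the
    -- dimension current without a separate batch transform.
    CREATE TRIGGER trg_leads_insert AFTER INSERT ON raw_leads
    BEGIN
        INSERT OR IGNORE INTO dim_region (region, lead_count) VALUES (NEW.region, 0);
        UPDATE dim_region SET lead_count = lead_count + 1 WHERE region = NEW.region;
    END;
""")

cur.executemany(
    "INSERT INTO raw_leads (email, region) VALUES (?, ?)",
    [("a@x.com", "west"), ("b@x.com", "west"), ("c@x.com", "east")],
)
counts = dict(cur.execute("SELECT region, lead_count FROM dim_region").fetchall())
print(counts)  # {'west': 2, 'east': 1}
```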
DBA for Health Technology Company
UpHealth Inc. - IT
- Worked on a PostgreSQL replication and solved PostgreSQL and Microsoft SQL Server performance issues.
- Tracked and fixed performance and code bugs via Jira.
- Converted processes from Microsoft SQL Server to PostgreSQL. Tuned monthly processing and dashboard reports.
- Designed a data engineering solution to curate data from the Department of Education to be reported via a front-end website.
- Joined data from different data sources to form coherent information.
- Streamlined the process so that future data can be added at will, with performance kept in mind throughout the design so data is accessed quickly.
- Completed the project in a short time with full documentation.
Database Migration Engineer
- Converted physical machines to virtual ones, migrating the application and thick client along with them. Performed a lift and shift to provide a disaster recovery option in case of catastrophic failure of the original machines or their drives.
- Exported and migrated Oracle 10g to the latest Oracle version. Analyzed the access and roles associated with the application/thick-client software. Established a seamless integration between the database and the application.
- Coordinated with the existing support teams to get the testing done before making it production-ready.
DBA Engineer for Threat Analytics and Hunting Platform
- Performed in-depth analysis of query patterns and query rewrite for performance reasons.
- Conducted in-depth server analysis, load, and tweaks for optimal MySQL functioning.
- Designed partitioning on large data tables, did index analysis, removed redundant indexes, and provided SQL scripts for automated partitioning and transforming big monolithic tables to partitioned tables, data movement, and index building.
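A script for automating the monolithic-to-partitioned conversion described above might generate MySQL-style RANGE-partition DDL per month. The table name, column, and window below are hypothetical placeholders, not the production configuration.

```python
from datetime import date

def month_starts(start, months):
    """Yield the first day of `months` consecutive months beginning at `start`."""
    y, m = start.year, start.month
    for _ in range(months):
        yield date(y, m, 1)
        y, m = (y + 1, 1) if m == 12 else (y, m + 1)

def partition_ddl(table, column, start, months):
    """Build MySQL-style RANGE-partition DDL with one partition per month,
    plus a MAXVALUE catch-all for rows beyond the planned window."""
    bounds = list(month_starts(start, months + 1))
    clauses = [
        f"  PARTITION p{lo:%Y%m} VALUES LESS THAN (TO_DAYS('{hi:%Y-%m-%d}'))"
        for lo, hi in zip(bounds, bounds[1:])
    ]
    clauses.append("  PARTITION pmax VALUES LESS THAN MAXVALUE")
    return (
        f"ALTER TABLE {table}\n"
        f"PARTITION BY RANGE (TO_DAYS({column})) (\n"
        + ",\n".join(clauses)
        + "\n);"
    )

ddl = partition_ddl("events", "created_at", date(2023, 1, 1), 3)
print(ddl)
```

Generating the DDL this way keeps partition naming and boundaries consistent across many tables instead of hand-editing each ALTER statement.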
SQL Azure DBA for Automotive System
Department 210, LLC
- Analyzed the current schema design and refactored database objects/stored procedures and functions for maximum efficiency and lower cloud cost on Azure.
- Analyzed query plans for various complex joins involving medium and very large tables, keeping pace with the development team for new code deployments and preempting any potential performance problems.
- Firefought performance issues, troubleshooting and isolating bad query plans so immediate action could be taken on the fly.
Senior MySQL Engineer
Martingale Media, LLC
- Analyzed existing queries, structures, and data and provided feasible solutions for deep analytics.
- Provided solutions on hot queries with large datasets by refactoring the queries and tables for performance.
- Engineered solutions for the edge cases where simple tuning was not working or couldn't be crafted.
DB Performance Engineer
- Analyzed data demographics and handled performance tuning, data refactoring, data deduplication, and space savings.
- Conducted what-if analysis of performance and space usage under the assumption of users in the millions.
- Reduced space usage by 80% across the board on very large tables with a complex and dynamic SQL procedure built for conditional data de-duping.
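Conditional deduplication of the kind described above can be sketched with SQLite: the key columns are parameters, so one routine can service many tables, which is the same motivation behind a dynamic SQL procedure. Table, columns, and the keep-newest rule are illustrative assumptions.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INT, action TEXT, ts TEXT)"
)
cur.executemany(
    "INSERT INTO events (user_id, action, ts) VALUES (?, ?, ?)",
    [
        (1, "login", "2023-01-01"),
        (1, "login", "2023-01-02"),  # duplicate business key: keep the newest row
        (2, "login", "2023-01-01"),
    ],
)

def dedupe(cur, table, key_cols, keep="MAX(id)"):
    """Delete rows whose business key repeats, keeping one row per key.

    The table and key columns are parameters, so the SQL is assembled
    dynamically rather than hard-coded per table.
    """
    keys = ", ".join(key_cols)
    cur.execute(
        f"DELETE FROM {table} WHERE id NOT IN "
        f"(SELECT {keep} FROM {table} GROUP BY {keys})"
    )

dedupe(cur, "events", ["user_id", "action"])
rows = cur.execute("SELECT user_id, ts FROM events ORDER BY user_id").fetchall()
print(rows)  # [(1, '2023-01-02'), (2, '2023-01-01')]
```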
Postgres DB Performance Engineer
- Analyzed performance of the application and the Postgres DB, refactored the tables, data, and indices, and achieved sub-second SQL query execution.
- Advised on overall best practices for data storage.
- Tracked and resolved existing bugs using a ticketing platform.
- Validated results and performance metrics while the website was in action with full and stressed loads.
Snowflake Data Engineer
- Migrated ETL across platforms from Oracle Data Integrator to dbt + Snowflake.
- Developed test cases for data validation following dbt and Snowflake best practices. Performance-tuned the new ETL.
- Tracked and successfully fixed bugs found within the system.
Telecommunication & Content Provider
- Supported database environments, data warehousing, data cleaning, backups/recovery, access management, and reporting, ensuring user access compliance with company policies.
- Managed performance tuning and upgrades of MPP and cloud DBs. Led full-scale testing of newer upgrades and roll-out after deep analysis of performance/features/user connectivity and user acceptance.
- Developed tools to augment testing, equipping the offshore team to provide effective off-hours support for clients and loads. Mentored junior team members to enable them to handle critical situations with a fair amount of ease.
- Designed data collection and management reporting systems around database usage and ETL SLAs, forming a basis for cost center billing of respective departments. Designed and developed ETLs in Informatica and other tools.
- Handled clickstream data and data lakes. Developed processes to efficiently transfer terabytes of data between old and new databases during the cross-database migration.
- Analyzed query plans across various database technologies and built solutions with efficient, hands-on SQL and performance tuning. Augmented the DB system with additional performance metrics of queries to tune various DB parameters.
- Wrote complex SQL to handle large data extraction, analysis, and uploading. Handled hierarchical data and output flat datasets and vice versa.
- Analyzed data and did dimensional data modeling for large objects with terabytes of data. Geared logical and physical design separation toward performance and ease of development. Created a unified data model.
- Worked on data mapping between different data sources for matching, lineage, and consistency. Maintained data marts for reporting and support for data feeding to downward systems. Built lineage logic between distributed data sets from various feeds.
- Managed the Snowflake database and users, built pipelines and maintained slowly changing dimensions, and handled dimensional modeling.
- Contributed to bug fixing and enhancements on the database side to support user functions in the web application.
- Created the web application while keeping in mind the retail industry functions and requirements. The application is currently being used by jewelry vendors.
- Designed the UI so it can be built on any stack and connected to any DB technology that supports stored procedures at the database or application tier.
- Reduced page response times to less than three seconds and ensured growth would not negatively impact response times.
- Refactored a clinical auditing data system's data structures to make it easy for the UI to show what changed at the column level.
- Replaced the older partitioning scheme with a new composite equipartitioning scheme for faster record retrieval on a multi-terabyte data set.
- Wrote complex SQL for handling a large XML dataset for analysis and uploading with native database technologies.
- Created performant queries/programs to analyze hierarchical data for a trail on auditing data.
- Performed migration and switch from older clinical trial systems data to new systems. Gathered requirements for the work assessment.
- Created data mapping scripts for successful initial data transfer as per the new system requirement from older systems. The process was designed to ease the migration from any system.
Lead Software Engineer
- Led application/database/SQL performance tuning. Heavily modified ASPEN, written in XSLT, to generate a performance-oriented SQL signature for Oracle and SQL Server; the change was incorporated and used in future product development.
- Periodically reviewed SQL and processes for refactoring, improving the application by running less code and adopting new DB features, for example replacing a recursive process with a CONNECT BY PRIOR clause that does the same task.
- Tested new releases and incorporated new enhancements into the product as a contributing member of the engineering group that owned the product's new direction and roadmap.
- Provided many useful fixes for the overall health of the product in our data center servers.
- Resolved performance issues around web servers, DB servers, and applications with live client sessions. Identified the new hot areas for potential problems and contributed to pre-sales stress testing.
- Wrote SOP for best practices around the LMS application usage, how to troubleshoot, what to gather, what to look for, and how to document it for routing it to sustained engineering.
- Provided expert guidance and fixes for the data synchronization tool to run in under a couple of hours instead of days. Earned power-of-one and other awards during my employment.
- Designed queries for handling hierarchical data and output flat datasets and vice versa.
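Hierarchical-to-flat output of the kind described above can be sketched in Python: walk the tree and emit one flat row per node with its full path and level. The tree shape, field names, and path encoding are purely illustrative assumptions.

```python
def flatten(node, parent_path=""):
    """Walk a hierarchy and emit one flat row (path, name, level) per node."""
    path = f"{parent_path}/{node['name']}"
    rows = [(path, node["name"], path.count("/"))]
    for child in node.get("children", []):
        rows.extend(flatten(child, path))
    return rows

org = {
    "name": "HQ",
    "children": [
        {"name": "Sales", "children": [{"name": "East"}, {"name": "West"}]},
        {"name": "Engineering"},
    ],
}

flat = flatten(org)
print(flat)
# [('/HQ', 'HQ', 1), ('/HQ/Sales', 'Sales', 2), ('/HQ/Sales/East', 'East', 3),
#  ('/HQ/Sales/West', 'West', 3), ('/HQ/Engineering', 'Engineering', 2)]
```

The flat rows load directly into a relational table, and the inverse direction (flat rows back to a tree) follows by grouping on the path prefix.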
- Developed software solutions for power companies using Oracle advanced queues. Worked on modules to determine the optimal power throughput through the transmission lines.
- Led Oracle SQL and PL/SQL performance tuning. Optimized packages and procedures.
- Used the Oracle queues to implement a data pipeline with fault tolerance.
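The fault-tolerant queue pipeline above can be illustrated with a minimal stdlib stand-in. Oracle AQ provides retry counts and an exception queue natively; here both are simulated with Python's queue module, and the handler and message values are hypothetical.

```python
import queue

def run_pipeline(messages, handler, max_retries=3):
    """Drain a queue, retrying failed messages and dead-lettering the rest."""
    q = queue.Queue()
    for msg in messages:
        q.put((msg, 0))  # (payload, attempts so far)
    done, dead = [], []
    while not q.empty():
        msg, attempts = q.get()
        try:
            done.append(handler(msg))
        except Exception:
            if attempts + 1 < max_retries:
                q.put((msg, attempts + 1))  # requeue for another try
            else:
                dead.append(msg)  # analogue of AQ's exception queue

    return done, dead

calls = {}
def flaky(msg):
    """Simulated handler that fails transiently on one message."""
    calls[msg] = calls.get(msg, 0) + 1
    if msg == "b" and calls[msg] < 2:
        raise RuntimeError("transient failure")
    return msg.upper()

done, dead = run_pipeline(["a", "b", "c"], flaky)
print(done, dead)  # ['A', 'C', 'B'] []
```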
Lead Software Engineer
State of Washington
- Gathered requirements for reporting needs. Performed initial data analysis and gauged the work needed to provide the reporting solution. Set up data structures and cleanup needed for the data collection.
- Modeled a reporting database and designed management reports, architected the data warehouse for the management reporting database, and collected data and optimized reports.
- Triaged the reporting issues around missing data, data volume changes, and performance.
- Managed performance tuning issues arising out of the reports from time to time with database and post-upgrade features as well as user change requirements. Augmented existing reports with additional datasets for management.
- Handled day-to-day Oracle database administration. Monitored the database and made sure it was working optimally. Performance-tuned the database, application, and queries.
- Designed a user access security system for a web-based medical publications application. The system was easy to follow and used tiered privileges.
- Analyzed data structures, provided code reviews, and tested SQL and PL/SQL in a pre-production environment for sizeable data to have better confidence in moving the deployment to production.
- Implemented Oracle database replication, a simple master-to-offsite setup.
- Supported the credit analysis of product development for companies.
- Developed a complex Oracle PL/SQL procedure for doing financial what-if analysis for determining credit scores and limits for companies.
- Created processes for building complex financial models.
- Worked with customers for a successful transition from old systems with data transfers and complex data migration.
- Oversaw SQL performance tuning for the data transfer interfaces written for loading data from operational systems to application tables for analysis.
- Provided solutions for performance issues arising out of the product for customers.
- Helped with data mapping between different data sources for matching, lineage, and consistency.
- Worked on the migration of data from older applications and flat files for successful customer adoption of the new product.
Performance Analysis and Tuning
Tuning Nightly Loads
I maintained data loading SLAs and resolved data conflicts and operational issues.
Netezza to Vertica Migration
Enhancing Starlims Application
I replaced the original table partitioning with an enhanced composite partitioning scheme, inventing an equipartitioning formula that targets an average number of rows per partition. I also flattened the XML data, extracted the column-level changes between adjacent records' XML, and stored the deltas in UI tables for easy reporting.
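The column-level XML diff described above can be sketched in Python: flatten each record's XML payload to a column-to-value map, then diff adjacent versions so the UI only has to render the deltas. The record shape and field names here are illustrative assumptions.

```python
import xml.etree.ElementTree as ET

def column_diff(old_xml, new_xml):
    """Return {column: (old_value, new_value)} for columns that changed
    between two versions of a record's XML payload."""
    def to_map(xml_text):
        return {el.tag: (el.text or "") for el in ET.fromstring(xml_text)}
    old, new = to_map(old_xml), to_map(new_xml)
    return {
        col: (old.get(col), new.get(col))
        for col in old.keys() | new.keys()
        if old.get(col) != new.get(col)
    }

v1 = "<record><status>draft</status><owner>amy</owner></record>"
v2 = "<record><status>final</status><owner>amy</owner></record>"
diff = column_diff(v1, v2)
print(diff)  # {'status': ('draft', 'final')}
```

Storing only these deltas keeps the UI tables small and makes "what changed" queries trivial.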
Purchase/Materials/Financial Management System
Financial What-if Analysis for Credit Check for Companies
Stored Procedure, SQL, Bash, T-SQL (Transact-SQL), Python, Snowflake, C, Active Server Pages (ASP), PHP, XML
Cron, Toad, Spotfire, Amazon Athena, Microsoft Access, AWS Glue, MySQL Performance Tuning, BigQuery, Teradata SQL Assistant, Informatica ETL, DbVisualizer, Erwin, Turbo Pascal, Excel 2013, Subversion (SVN), Git, Citus
ETL, Database Design, Dimensional Modeling, Refactoring, OLAP, Test-driven Development (TDD), Application Architecture
Oracle, Oracle Database, KornShell, Amazon Web Services (AWS), Google Cloud Platform (GCP), Linux, Windows, Azure, Apache Kafka
Database Architecture, DB, Database Modeling, Databases, SQL Stored Procedures, Relational Databases, Database Transactions, Database Triggers, Data Pipelines, Oracle SQL, Oracle 12c, PL/SQL, Oracle RDBMS, SQL Performance, Database Performance, Vertica, Netezza, Oracle PL/SQL, MySQL, OLTP, Microsoft SQL Server, PostgreSQL, Database Administration (DBA), SQL Server 2017, Redshift, Oracle Multitenant, MongoDB, Amazon S3 (AWS S3), Azure SQL, Teradata, SQL Server DBA, MariaDB, Database Lifecycle Management (DLM), Cassandra, MemSQL, SQL Server Integration Services (SSIS), Amazon Aurora, SQL Server 2005, Dynamic SQL, Distributed Databases
Informatica, Star Schema, Data Engineering, Database Optimization, Database Table Optimization, Query Optimization, Data Warehouse Design, Big Data, Data Architecture, Architecture, ETL Pipelines, PL/SQL Tuning, Performance Tuning, ETL Development, Data Warehousing, Data Modeling, Data Analysis, Complex Data Analysis, Data Cleaning, Data Cleansing, Data Migration, Reverse Engineering, Data Analytics, Database Schema Design, Reporting, Integration, Server Migration, IBM PureData System for Analytics, Data Transformation, Data Management, Data Marts, Debugging, Electronic Data Interchange (EDI), ETL Tools, Google BigQuery, Computer Science, Mathematics, Physics, Data Auditing, Apache Cassandra, Oracle OEM 12c, Analytics, Big Data Architecture, SingleStore, Amazon RDS, Microsoft Data Transformation Services (now SSIS), Data Build Tool (dbt), Partitioning, Relational Database Design, Oracle Forms & Reports, Snowpipe, Data Loading, SQL Server 2019, Indexing, Performance Analysis, Data Scientist, Real-time Data
Presto DB, .NET
Master of Computer Applications Degree in Computer Science
Himachal Pradesh University - Shimla, India
Master of Science Degree in Mathematics
University of Jammu - Jammu, India
Bachelor of Science Degree in Physics, Chemistry, Mathematics, English
University of Jammu - Jammu, India