
Gaurav Kumar

Verified Expert in Engineering

Data Manager and Developer

Bengaluru, India

Toptal member since March 1, 2023

Bio

Gaurav brings 12+ years of experience in data governance, data stewardship, data quality, data management implementation methodology, and metadata management. He has a proven track record of operationalizing data governance and risk data aggregation programs, with hands-on experience in Collibra, Alation, Ab Initio Express>IT, Metadata Hub, SQL, Oracle, Jira, Confluence, and Tableau. Gaurav excels at leading and mentoring high-performing teams to execute enterprise-level strategies.

Portfolio

Commonwealth Bank of Australia
Data Governance, Data Management, Data Quality Management, Metadata, Collibra...
Wells Fargo
Data Governance, Data Quality Management, Data Management, Data Profiling...
Synechron
Data Quality, Data Quality Analysis, Quality Assurance (QA), Product Testing...

Experience

Availability

Part-time

Preferred Environment

Collibra, Data Governance, Data Management, Data Quality Management, Metadata, Data Stewardship, Agile, Data Quality, Stakeholder Engagement

The most amazing...

...experience I've had as a member of the data office team was playing an instrumental role in establishing data governance and data quality processes.

Work Experience

Data Governance Manager

2022 - PRESENT
Commonwealth Bank of Australia
  • Worked as a data governance and steward manager in the consumer finance data domain, leading and mentoring a high-performing team to execute enterprise-level strategies.
  • Led a team of six from the data management implementation team, working on data quality controls, metadata management, data governance, and LOB risk controls while monitoring these processes across the enterprise.
  • Gained solid hands-on experience with tools supporting data governance, data quality, data stewardship, metadata management, data lineage, data profiling, and data cataloging, including Collibra, Alation, Ab Initio Express>IT, Metadata Hub, and Tableau.
  • Managed critical data elements for various data consumers by reviewing the data quality process and defining data quality rules, business rules, data profiling, and data quality monitoring and reporting (an illustrative rule sketch follows this role's technology list).
  • Served as a focal point of data management processes between data producers and consumers to enhance the governance and controls used across the LOB.
  • Contributed to stakeholder management initiatives, collaborating with the L1 risk team on control decisions and governance needs for functional areas and participating in and taking ownership of BMT/MBLTs with multiple groups for review and sign-off.
  • Provided data practitioners and the broader data and analytics community with the knowledge, skills, and behaviors to adopt new data engagement techniques and build data capability and expertise across the group.
  • Suggested and implemented multiple process improvements, such as a standardized data quality requirement template, the introduction of data profiling, and a consolidated CDE tracker for better transparency and availability.
  • Took an active role in the data governance forum, participating in drop-in sessions, reviews, and monthly discussions with the functional general manager and executive managers to prioritize effort.
Technologies: Data Governance, Data Management, Data Quality Management, Metadata, Collibra, Risk, Ab Initio, SQL, Confluence, Data Stewardship, Agile, Data Classification, User Acceptance Testing (UAT), Root Cause Analysis, Data Quality, Stakeholder Engagement, Data Lineage, Data Profiling, Jira, Master Data Management (MDM), Ataccama, Data Engineering
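
To illustrate the kind of data quality rule referenced in this role, the following is a minimal SQL sketch of a completeness check on a critical data element. The table and column names are hypothetical and are not taken from any real system.

    -- Hypothetical completeness check for a critical data element (CDE).
    -- Table and column names are illustrative only.
    SELECT 'customer_account.account_open_date' AS cde_name,
           COUNT(*) AS total_records,
           SUM(CASE WHEN account_open_date IS NULL THEN 1 ELSE 0 END) AS failed_records,
           ROUND(100.0 * SUM(CASE WHEN account_open_date IS NULL THEN 1 ELSE 0 END) / COUNT(*), 2) AS failure_pct
    FROM customer_account;

A check like this can be scheduled and its failure percentage compared against an agreed tolerance threshold as part of data quality monitoring and reporting.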

Lead Data Management Consultant

2018 - 2022
Wells Fargo
  • Served as a data quality lead and SME in the commercial lending data domain, leading a team of seven members from the enterprise's data quality control implementation team.
  • Handled requirement gathering and analysis, artifact creation, design, implementation, support, and work estimation.
  • Collaborated with business stakeholders to understand and enhance the rule requirement for critical data elements (CDE).
  • Created control checks and validation procedures for business rules using Ab Initio Express>IT, with the results being published in the enterprise Metadata Hub and Collibra.
  • Contributed to data governance, quality, and profiling using Oracle SQL Developer and Aqua Data Studio.
  • Worked on detective controls, preventive controls, data governance, and the data movement catalog update and review processes.
  • Used Collibra for data lineage, metadata management, and the data dictionary.
  • Delivered a Tableau report analyzing mismatches and gaps in control checks between PROD and RSWB. Published two project ideas to save $33,900 annually.
  • Drove innovation by improving the existing SQL creation process, primary key identification procedures, release tagging, and test data management to enhance data quality.
  • Trained multiple DQ analysts and control analysts on cross-functional modules.
Technologies: Data Governance, Data Quality Management, Data Management, Data Profiling, Data Lineage, Metadata, Stakeholder Engagement, Collibra, Oracle, SQL, Tableau, Jira, Confluence, Ab Initio, Data Stewardship, Agile, Data Classification, User Acceptance Testing (UAT), Root Cause Analysis, Data Quality, Risk, Master Data Management (MDM), Ataccama, Data Engineering

Lead | Quality Control

2015 - 2018
Synechron
  • Served as a functional test lead on a business payroll systems project.
  • Created the test plan, test strategy, ambiguity review documents, test scenarios, test cases, test scripts, and multiple complex functionality process documents to close gaps in the regression repository.
  • Led the OpComp analysis effort for the regression suite to optimize more than 1,000 old test cases. Worked on fixing data quality issues and implementing preventive controls.
  • Managed the automation factory effort for the functional and non-automated regression suites, established the entry and exit criteria, and provided sign-off for the project.
  • Collaborated with companies to understand and document taxation. Extensively tested all taxation components and helped companies by creating complex queries to run as part of UAT tests.
  • Played a vital role in the BPS migration within the PHOENIX project by mapping ADP requirements. Interacted with ADP business analysts, converted QA mapping fields to align with ADP needs, and submitted them for WF business approval.
  • Used Microsoft SQL Server 2012 for database verification and back-end data validation.
  • Worked with external vendors and networks, such as the Automated Clearing House (ACH), Symmetry Software, MasterTax, and Avalara, on tax payments and money movement from employer to employee via business payroll and tax agencies.
  • Trained multiple QA developers on cross-functional modules.
Technologies: Data Quality, Data Quality Analysis, Quality Assurance (QA), Product Testing, QA Test Plan Management, Testing Strategy, Testing, SAP Payroll, Functional Testing, Functional Analysis, Root Cause Analysis, Agile, Scrum Testing, Ab Initio, Oracle, SQL, Data Management, User Acceptance Testing (UAT), Stakeholder Engagement, Data Lineage, Data Profiling, Jira, Master Data Management (MDM)

Senior Software Engineer

2013 - 2015
Accenture
  • Helped drive the offshore test strategy from inception to completion and conducted the entire system integration testing effort with multiple teams across different streams.
  • Functioned as the upstream team's SME for the integration framework and flow model.
  • Established build verification testing, functional testing, system integration testing, regression testing, user acceptance testing, back-end data validation, requirement analysis, and root cause analysis (RCA) processes.
  • Created high-level project requirements, provided feedback on the FSD, and annotated UI wireframes.
  • Helped the development team create an integrated unit test suite and scripted the subsequent test cases using Jasmine, a JavaScript testing framework.
  • Handled XML validation of payments and taxes by the payee system.
  • Owned multiple project modules, worked on end-to-end remediation, and created high-level test scenarios and detailed test scripts.
  • Conducted daily Scrum meetings to manage team status and challenges.
Technologies: Functional Testing, Manual Software Testing, QA Test Plan Management, Testing Strategy, Quality Assurance (QA), User Acceptance Testing (UAT), Oracle, SQL, Agile, Root Cause Analysis, Stakeholder Engagement

QA Analyst

2010 - 2013
Google
  • Conducted functional and manual testing, product testing, requirement analysis, root cause analysis, impact analysis, and country launch testing processes.
  • Contributed to Google Map Maker's 18 product version releases and 32 country launches.
  • Served as the key associate lead in pre- and post-launch activities for several countries, coordinating with product managers, engineers, community teams, country managers, and localization PMs.
  • Handled 7+ in-house projects and processes, covering the operations teams in India and the USA.
  • Managed technical issues, such as bugs and feature requests (FRs), troubleshooting, and access-level control for the Map Maker project, coordinating with developers, product managers, and cross-functional teams across the globe, including India, the USA, Dublin, and Poland.
  • Provided engineers with improvement ideas, insights on machine learning use, and operational input to drive higher user engagement and a better user experience.
  • Suggested multiple enhancements in the Google Map Maker UI, which were incorporated into the tool and are now in production.
  • Validated and prioritized global users and Ops escalations for Map Maker by conducting bug triage meetings with developers and product managers.
  • Created high-level test scenarios, process documents for system testing, and hundreds of test cases for the Map Maker application.
Technologies: Product Testing, Functional Testing, Manual QA, QA Testing, Software QA, Agile, Root Cause Analysis

Projects

Chief Data & Analytic Office DMIM | Data Management Chapter

The Chief Data & Analytic Office (CDAO) established the data management chapter for different data domains within the bank, focusing on data governance and data quality processes. This required the involvement of multiple business stakeholders, including the risk and compliance team, to define data governance priorities and the data quality controls on critical data elements, based on risk management obligations to regulatory and compliance authorities.

I worked as a data governance and steward manager in the consumer finance data domain to run the group's data management and governance processes.

Business Payroll System | Wells Fargo & Co.

Business Payroll System is a complete payroll solution from Wells Fargo & Co., designed for small-scale clients to manage payroll responsibilities. It is available on Windows, the web, and mobile for clients and their users, the employees.

I worked as the functional test lead on the Business Payroll System projects.

Enterprise Data Quality Monitoring | eDQM

Data Management Insights (DMI) established the eDQM team to work with ABOs to assess existing controls against target requirements and develop the necessary detective data quality controls. ABOs or SMEs provided the business cases and rules, and the eDQM drove development using core toolsets.

I developed and deployed data quality practices in alignment with the enterprise data strategy, leveraging industry-wide standards and internal company policies. These practices enabled sound decision-making by improving data quality through consistent measurement and monitoring across authorized provision points (APPs) and APP sources.

Additionally, as a data quality lead and SME in the commercial lending data domain, I managed a team of seven delivering data quality controls.
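
As a rough illustration of the kind of detective data quality control described above, here is a minimal SQL sketch of a validity check on a commercial lending attribute. The table name, column names, and threshold are hypothetical and are not drawn from the actual environment.

    -- Hypothetical detective control: flag loan records whose commitment
    -- amount is missing, negative, or above an agreed business-rule limit.
    SELECT loan_id,
           commitment_amount
    FROM commercial_loan
    WHERE commitment_amount IS NULL
       OR commitment_amount < 0
       OR commitment_amount > 1000000000;

Records returned by such a check would be surfaced through the monitoring toolset for review and remediation.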

Treasury Integration Project | Microsoft – Nokia

The treasury integration project aimed to develop a single, standard, and centralized set of systems for the treasury and payments department to handle payments, taxes, treasury operations, and payee onboarding for Microsoft-Nokia.

Google Map Maker | Google Maps

Google Map Maker was one of the emerging and sought-after products provided by Google, alongside Google Maps and Google Earth. It allowed users to add and update geographic information for millions of users and display it on Google Maps and Google Earth. Once reviewed and approved, a user's updates appeared online for people worldwide to see.

I was responsible for functional and manual testing, product testing, requirement analysis, root cause analysis, impact analysis, and country launch testing processes.

Education

2006 - 2010

Bachelor's Degree in Computer Science

ICFAI University - Dehradun, India

Certifications

JANUARY 2023 - PRESENT

Alation Data Governance Advocate

Alation

SEPTEMBER 2020 - PRESENT

Collibra Expert II

Collibra University

JULY 2020 - PRESENT

Collibra Expert I

Collibra University

Tools

Collibra, Jira, Ab Initio, Confluence, SAP Payroll, Tableau

Paradigms

Functional Analysis, User Acceptance Testing (UAT), Testing, Functional Testing, Agile

Languages

SQL

Platforms

Oracle, Ataccama

Storage

Master Data Management (MDM)

Other

Data Management, Data Quality Management, Metadata, Data Profiling, Stakeholder Engagement, Data Stewardship, Data Governance, Computer Science, Risk, Data Lineage, Data Quality, Data Quality Analysis, Quality Assurance (QA), Product Testing, QA Test Plan Management, Testing Strategy, Root Cause Analysis, Scrum Testing, Manual Software Testing, Manual QA, QA Testing, Catalog Data Entry Services, Data Classification, Stakeholder Management, Software QA, Data Engineering
