Vijay Kumar Vadlamudi, Developer in Swindon, United Kingdom

Vijay Kumar Vadlamudi

Verified Expert in Engineering

Data Migration Developer

Location
Swindon, United Kingdom
Toptal Member Since
June 18, 2020

Over the last two decades, Vijay has built deep expertise across multiple functional domains. He has successfully mapped complex business processes to commercially available products and solutions, enabling enterprises to migrate to and adopt them to realize their goals. He's an analytical thinker and continuous learner who strives to solve business problems.

Availability

Part-time

Preferred Environment

Toad, Apache Hive, PL/SQL, SQL

The most amazing...

...data model I've designed and built is highly responsive and enabled core APIs to provide a <100 millisecond response 98% of the time.

Work Experience

Senior Data Engineer

2005 - 2019
Cisco Systems India, Pvt. Ltd.
  • Migrated the customer master from Oracle trading community to a custom model and improved the quality of data by 18%, which translated to a 6% increase in visibility to service contract renewal opportunities.
  • Migrated legacy features, including sales territory management and incentive compensation, to Oracle-based technology, which standardized processes globally and enabled the decommissioning of many regional tools and inconsistent practices.
  • Migrated case management tools and apps into a single unified CRM app for efficient case routing to the appropriate support team, thus significantly reducing issue resolution time.
  • Consolidated transaction adjustment tools into a single tool to improve the productivity of the business operations team.
  • Built canned BI reports and dashboards so that sales agents could view their pipeline in actual vs. target performance.
  • Wrote Python and Shell scripts to read and process text and data, crawl and parse websites, and run other scripts for predictive machine learning and other purposes.
  • Analyzed customer data and sales transactions to identify patterns among duplicate customers in the repository and to root-cause sales credits inaccurately assigned to sales agents.
  • Built a process to periodically read transactions captured in MongoDB and write them to Oracle Database.
  • Tuned SQL for performance in stored procedures and complex reporting queries: created and modified table indices and rewrote SQL queries to achieve better execution plans.
Technologies: Shell, Hadoop, Python, Apache Hive, SQL
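The index-tuning approach in the last bullet can be sketched in miniature. This is a hypothetical demonstration using Python's built-in sqlite3 module rather than Oracle, and the table and column names are illustrative assumptions; it shows how adding an index on a filter column changes the planner's strategy from a full scan to an index search.

```python
import sqlite3

# Illustrative table of sales-credit rows (names are assumptions).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales_credits (agent_id INTEGER, amount REAL, credit_date TEXT)")
conn.executemany(
    "INSERT INTO sales_credits VALUES (?, ?, ?)",
    [(i % 50, i * 1.5, f"2019-01-{i % 28 + 1:02d}") for i in range(1000)],
)

query = "SELECT SUM(amount) FROM sales_credits WHERE agent_id = ?"

# Without an index, the planner must scan the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (7,)).fetchone()[-1]

# An index on the filter column lets the planner seek instead of scan.
conn.execute("CREATE INDEX idx_sales_credits_agent ON sales_credits (agent_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (7,)).fetchone()[-1]

print(plan_before)  # e.g. a SCAN of sales_credits
print(plan_after)   # e.g. a SEARCH using idx_sales_credits_agent
```

The same principle applies in Oracle, where the cost-based optimizer picks an index range scan over a full table scan once a suitable index exists.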

IT Engineer

2000 - 2005
Cisco Systems, Inc.
  • Customized Oracle application modules to support evolving business requirements.
  • Built waterfall reports, refreshed weekly, showing actual bookings vs. forecast over a rolling window of the past 52 weeks and the next 65 weeks.
  • Built build-plan reports, refreshed daily, to track production targets, product backlog, and product lead times.
  • Built a custom ETL process using SQL scripts and PL/SQL stored procedures to periodically pull data matching certain criteria from a relational database, transform it, and load it into a staging area at the destination.
Technologies: Oracle, PL/SQL, SQL
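The extract-transform-load pattern from the last bullet can be sketched as follows. This is a minimal sketch using SQLite in place of Oracle; the original work used SQL scripts and PL/SQL stored procedures, and all table and column names here are illustrative assumptions.

```python
import sqlite3

# Assumed source table of order transactions.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (order_id INTEGER, status TEXT, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, "BOOKED", 100.0), (2, "CANCELLED", 50.0), (3, "BOOKED", 75.5)])

# Assumed staging table at the destination.
dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE stg_orders (order_id INTEGER, amount_cents INTEGER)")

def run_etl(source, dest):
    """Pull rows matching criteria, transform them, and load into staging."""
    # Extract: only booked orders qualify for the downstream process.
    rows = source.execute("SELECT order_id, amount FROM orders WHERE status = 'BOOKED'")
    # Transform: normalize amounts to integer cents before staging.
    staged = [(order_id, round(amount * 100)) for order_id, amount in rows]
    # Load: write into the staging area on the destination.
    dest.executemany("INSERT INTO stg_orders VALUES (?, ?)", staged)
    dest.commit()
    return len(staged)

loaded = run_etl(src, dst)
print(loaded)  # 2 rows staged
```

In the PL/SQL version, the same extract/transform/load steps would typically live in a stored procedure scheduled to run periodically.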

Customer Master Data Migration

Oracle TCA (trading community architecture) was being used to manage the enterprise customer master. The TCA data model is tightly coupled with other Oracle E-Business applications and enforces numerous validations and constraints, so response times grew long and eventually became unacceptable. The decision was made to build a decoupled customer master.

I designed the new data model for the enterprise customer master and built APIs and processes to rapidly capture new customers, update customer attributes, and search for and return existing customers. These APIs ensured timely, accurate, and consistent delivery of customer information to application screens (UI), transactions, and reporting processes, which allowed the organization to realize response-time benefits in other APIs as well.

The customer APIs now provide <100 millisecond response times for 98% of transactions. Enhancements and modifications to the enterprise customer master are made at a faster pace and with confidence because the impact is predictable.

Unified Customer Case Management System

Replaced numerous case management tools, each tied to a customer- or partner-facing application, with a unified case management system built on the Salesforce.com platform. I collated and mapped the attributes of the various legacy case management tools and finalized a consolidated attribute list, which formed the basis for the new Salesforce.com case management system.

I built new APIs to view the details of a case, create a new case via email, and update a case. I also built a pipeline to periodically extract case-related information from Salesforce.com into Oracle, plus a business intelligence dashboard showing metrics for case volume, response time, case age, case hopping between support teams, and trends that highlight which tools demand the most support.
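A periodic extraction pipeline like this commonly uses a "watermark" so each run pulls only cases changed since the last successful run. The sketch below is a hedged illustration of that pattern: fetch_cases stands in for the real Salesforce query, and all field names and sample data are assumptions.

```python
from datetime import datetime

# Assumed sample of case records, in place of a live Salesforce org.
CASES = [
    {"case_id": "C-1", "last_modified": datetime(2019, 5, 1, 9, 0)},
    {"case_id": "C-2", "last_modified": datetime(2019, 5, 2, 14, 30)},
    {"case_id": "C-3", "last_modified": datetime(2019, 5, 3, 8, 15)},
]

def fetch_cases(modified_after):
    """Stand-in for a Salesforce query filtered on a last-modified date."""
    return [c for c in CASES if c["last_modified"] > modified_after]

def extract_incremental(watermark):
    """Return new/changed cases and the advanced watermark for the next run."""
    batch = fetch_cases(watermark)
    if batch:
        watermark = max(c["last_modified"] for c in batch)
    return batch, watermark

batch, wm = extract_incremental(datetime(2019, 5, 1, 12, 0))
print([c["case_id"] for c in batch])  # only cases newer than the watermark
```

The advanced watermark would be persisted between runs so that each scheduled execution picks up exactly where the previous one stopped.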

This initiative helped the organization present a unified look and feel for case management to customers, which in turn strengthened the brand and improved customer satisfaction.

Phoenix - Migration to OTM (Oracle Territory Manager)

Companies with a global market deploy teams in strategic regions, and those teams build processes that suit their needs. Over time, these processes become inconsistent and contradictory, hindering the growth of the business. It was decided that teams across the globe would migrate to OTM (Oracle Territory Manager).

I mapped the definitions of sales territories and sales agents from the legacy system onto the OTM data model, identified gaps, and built custom enhancements to OTM. The immediate benefit was a robust system that enabled consistent processes and timelines for all teams across the globe. Previously, changes to the sales hierarchy and/or sales territories could take a couple of quarters to be fully absorbed worldwide.

Now, with the adoption of OTM, such changes can be scheduled for a pre-notified date and time and take effect simultaneously across all teams. This enables prompt, consistent, accurate, and consolidated reporting across all regions, and it also eases the movement of employees between regions, since no region-specific business process training is needed.

ICE - Incentive Compensation Excellence

This project standardized the yearly legal process of setting sales-related targets and goals for sales teams in a consistent, repeatable, accurate, and time-bound manner. It was also important to track the sales made by individuals and compute their compensation accurately. Oracle Incentive Compensation was the application chosen to achieve this.

I designed and built an enterprise data warehouse that fulfilled the need for operational reports and dashboards tracking key metrics, and I built the data replication for these data objects from the transactional data source into the warehouse. The pipeline processed millions of transaction lines daily and generated aggregations and roll-up metrics that fed data visualizations via dashboards. The sales community relied heavily on these reports to track actual sales vs. goals, as well as their compensation.
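The aggregation and roll-up step can be sketched at toy scale. This is a hedged illustration: the field names and the two-level agent-to-region hierarchy are assumptions, and the real pipeline processed millions of lines inside the data warehouse rather than in Python.

```python
from collections import defaultdict

# Assumed sample of daily transaction lines.
transactions = [
    {"agent": "a1", "region": "EMEA", "amount": 120.0},
    {"agent": "a1", "region": "EMEA", "amount": 80.0},
    {"agent": "a2", "region": "AMER", "amount": 200.0},
]

def rollup(lines):
    """Aggregate lines to per-agent totals, then roll agents up to regions."""
    per_agent = defaultdict(float)
    per_region = defaultdict(float)
    for t in lines:
        per_agent[(t["region"], t["agent"])] += t["amount"]
        per_region[t["region"]] += t["amount"]
    return dict(per_agent), dict(per_region)

agents, regions = rollup(transactions)
print(agents)   # {('EMEA', 'a1'): 200.0, ('AMER', 'a2'): 200.0}
print(regions)  # {'EMEA': 200.0, 'AMER': 200.0}
```

In the warehouse, the equivalent logic would be expressed as GROUP BY aggregations materialized into summary tables that the dashboards query directly.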

Because compensation is tied to these reports, any inaccuracy hurts the sales community's productivity: they must raise internal support tickets to get the reports corrected. The organization achieved a more than 60% reduction in the volume of support tickets raised in this category.

Migrate Sales Crediting Application - Oracle Database Upgrade

This legacy sales crediting application used to take more than 20 hours a day to process the daily volume of transactions. Given the growth rate of sales transaction volume, it was predicted that the system would soon be unable to complete its daily processing within 24 hours, becoming unsustainable.

I planned and executed the application's migration from Oracle Database 9i to 11g. Although it initially looked like a lift-and-shift, I had to review pieces of code and rewrite them to take advantage of the cost-based optimizer. After rigorous tuning of the code, the daily processing time was cut drastically to <3 hours, a tremendous gain for the business.

OneCat - Consolidation of Sales Credit Adjustment Tools

The organization was battling seven different credit adjustment tools, each in use to support regional nuances. These tools were developed using various technologies, some simple, some complex, and each was managed by a specific team somewhere across the globe.

I stepped in to understand the business benefit each tool provided and proposed a structure common to all seven: upload, validate, approve, process adjustments, and report, all in a consistent, transparent, repeatable manner.

I designed and built the data model and the database APIs that supported the OneCat tool. The organization quickly realized its goal of managing all the different types of adjustments in one place, enabling it to enforce the capture of mandatory information and the registration of adjustment approvals. This also enabled the decommissioning of the seven legacy tools, saving operational and support costs.
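The common upload-validate-approve-process-report lifecycle can be sketched as a small state machine. This is a hypothetical illustration: the state names, the transition table, and the audit-trail design are assumptions, not the actual OneCat schema.

```python
# Assumed lifecycle for a credit adjustment record.
TRANSITIONS = {
    "UPLOADED": ["VALIDATED", "REJECTED"],
    "VALIDATED": ["APPROVED", "REJECTED"],
    "APPROVED": ["PROCESSED"],
    "PROCESSED": ["REPORTED"],
}

class Adjustment:
    def __init__(self, adj_id):
        self.adj_id = adj_id
        self.state = "UPLOADED"
        self.history = ["UPLOADED"]  # audit trail for transparency

    def advance(self, new_state):
        """Move to new_state only if the transition table allows it."""
        if new_state not in TRANSITIONS.get(self.state, []):
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state
        self.history.append(new_state)

adj = Adjustment("ADJ-100")
for step in ["VALIDATED", "APPROVED", "PROCESSED", "REPORTED"]:
    adj.advance(step)
print(adj.history)  # full audit trail from upload to reporting
```

Encoding the allowed transitions in one table is what lets a single tool enforce the same mandatory steps, including approval, for every regional adjustment type.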

Identify Duplicate Customer Names and Address

The enterprise customer master repository had been evolving for over 20 years. Users of the customer data had begun noticing, and were annoyed by, numerous variations of the same customer names and addresses. Various initiatives to prevent the creation of duplicate customers drastically reduced new duplicate entries over time, yet users continued to be frustrated by the duplicates already in the repository.
I designed a pipeline to move customer data from a relational database into a Hadoop Hive warehouse using a Sqoop script and built a Python script to transform customer names and addresses and generate the attributes needed by a machine learning model. Using scikit-learn's random forest library, I built a supervised binary classifier to predict whether a given pair of customer names and addresses is a duplicate.
This helped the organization manage the workload of its data stewards and made them highly productive in improving data quality: they could now focus on the high-confidence potential duplicates identified by the machine learning model.
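The feature-generation step that feeds such a duplicate classifier can be sketched as follows. Only normalization and a token-overlap (Jaccard) similarity are shown, and the feature names are illustrative assumptions; the actual model used many more engineered attributes.

```python
import re

def normalize(text):
    """Lowercase, strip punctuation, and collapse whitespace."""
    text = re.sub(r"[^a-z0-9 ]", " ", text.lower())
    return re.sub(r"\s+", " ", text).strip()

def jaccard(a, b):
    """Token-set overlap between two normalized strings, in [0, 1]."""
    sa, sb = set(normalize(a).split()), set(normalize(b).split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def pair_features(name_a, addr_a, name_b, addr_b):
    """Features for one candidate pair, ready for a binary classifier."""
    return {
        "name_sim": jaccard(name_a, name_b),
        "addr_sim": jaccard(addr_a, addr_b),
        "exact_name": int(normalize(name_a) == normalize(name_b)),
    }

feats = pair_features("Acme Corp.", "12 High St, Swindon",
                      "ACME Corporation", "12 High Street, Swindon")
print(feats)  # similarity scores for the candidate pair
```

A random forest trained on labeled pairs of such feature vectors then outputs a duplicate/non-duplicate prediction with a confidence score, which is what lets stewards prioritize the high-confidence matches.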

Languages

SQL, Python

Tools

Toad, PyCharm, Shell, Apache Sqoop, Bitbucket, Apache Solr, Tableau, Microsoft Power BI

Paradigms

Database Design, ETL

Storage

PL/SQL, Oracle SQL Developer, Apache Hive, Master Data Management (MDM), MongoDB, Neo4j

Other

Data Modeling, Data Migration, Database Schema Design, Relational Database Design, Data Mining, SSH, Star Schema, Shell Scripting, Modeling, Machine Learning

Frameworks

Hadoop

Libraries/APIs

Scikit-learn, PySpark, NumPy

Platforms

Oracle, Apache Kafka, Jupyter Notebook, Unix, Linux

1993 - 1995

Master of Science Degree in Electronics Science

Berhampur University - Berhampur, Odisha, India

1990 - 1993

Bachelor of Science Degree in Electronics

P.B.Siddhartha Arts and Science College - Vijayawada, Andhra Pradesh, India
