Seetha Venkatadri, Developer in Menlo Park, CA, United States

Seetha Venkatadri

Verified Expert in Engineering

Database Engineer and Developer

Menlo Park, CA, United States
Toptal Member Since
June 18, 2020

Seetha is a SQL and ETL expert with over six years of industry experience working with data. She has successfully implemented and delivered several database projects for multiple clients, on time and exceeding expectations. Seetha completed her data engineering fellowship with Insight Data Science in 2018 and subsequently worked at an insurance tech startup as the sole data engineer, setting up a data warehousing and analytics platform for the team.






Preferred Environment

Amazon Web Services (AWS), Linux, Oracle, GitHub, MySQL

The most amazing...

...thing I've done is learning to work with big data engineering tools in seven weeks and building a portfolio project at Insight Data Science.

Work Experience

Data Engineer

2018 - 2019
Go Car Insurance
  • Integrated data from different sources such as AWS RDS, Zendesk, and Facebook Analytics.
  • Migrated from BigQuery to AWS Redshift.
  • Collaborated with a data analyst and a data scientist to provide quality data for explorations.
  • Set up Looker and created dashboards that provided insights and encouraged other team members to explore data on their own.
Technologies: Zendesk API, PostGIS, PostgreSQL, Looker, Go, Python, Redshift

Data Engineering Fellow

2018 - 2018
Insight Data Science
  • Built a full stack application supported by a real-time streaming data pipeline that tracks flight status and provides weather updates in parallel to ensure safe flying conditions.
  • Effectively implemented Stream-Table joins using Kafka Streams and geospatial search on streaming data using PostGIS.
  • Learned cutting-edge technologies in a fast-paced environment while receiving constructive feedback from mentors and peers.
Technologies: Kafka Streams, Flask, Python, PostgreSQL, Apache Kafka

Lead Database Developer

2013 - 2016
UBI Banca, Italy (via Accenture)
  • Led the database team, handling multiple work requests and contributing to the development and enhancement of six key bank applications: HR-Balance, Anagrafe-Prospect, IWB Migration, CRM Contact History, CRM In Action, and GeMo Migration.
  • Performed overall workflow management for the database team using Rational ClearQuest.
  • Suggested code review and version control tools to the client team to improve code quality.
  • Trained entry-level engineers as part of Accenture's development initiatives.
Technologies: MySQL, Shell Scripting, Oracle PL/SQL

Senior Database Developer

2012 - 2013
Centrica, UK (via Cognizant Technology Solutions)
  • Worked on a WMIS database upgrade; ported Pro*C, Perl, shell, and SQL scripts from Solaris to Linux and upgraded the database from Oracle 9i to Oracle 11g.
  • Led a team of engineers in identifying migration issues and debugging the defects.
  • Performed estimation, planning, and task allocation for the project.
  • Supported user acceptance testing.
Technologies: Shell Scripting, Oracle PL/SQL

Database Developer

2011 - 2012
Avon, Brazil (via Cognizant Technology Solutions)
  • Worked on the ASLCP Pre-Rezoning project, which involved porting pre-rezoning logic from Java to PL/SQL, and implemented changes to the physical data model using Erwin.
  • Refreshed the ARRM hardware, which involved the migration of scripts and unit testing; delivered with zero defects.
Technologies: Linux, Oracle 10g

Programmer Analyst

2011 - 2011
Comcast, USA (via Cognizant Technology Solutions)
  • Worked on commercial workbench reports which involved converting FoxPro code to Oracle SQL queries for reporting order cancellations and pending orders.
  • Conducted query tuning using explain plans.
Technologies: Oracle 10g
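Plan-driven tuning like this follows the same loop everywhere: read the plan, spot the full scan, add or fix an index, and re-check. A minimal sketch of that loop, using sqlite3 so it is self-contained (the work described above used Oracle explain plans, and the schema here is hypothetical):

```python
import sqlite3

# Sketch of plan-driven query tuning. The actual work used Oracle explain
# plans (EXPLAIN PLAN FOR ... / DBMS_XPLAN); sqlite3 stands in here so the
# example runs anywhere, and the table and index names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT, created TEXT)")
conn.executemany(
    "INSERT INTO orders (status, created) VALUES (?, ?)",
    [("CANCELLED", "2011-01-01"), ("PENDING", "2011-01-02")] * 100,
)

query = "SELECT id, created FROM orders WHERE status = ?"

# Before tuning: the plan reports a full table scan.
before = conn.execute("EXPLAIN QUERY PLAN " + query, ("PENDING",)).fetchall()
print(before[0][3])  # detail column mentions a SCAN

# Index the filtered column, then confirm the plan switches to an index search.
conn.execute("CREATE INDEX idx_orders_status ON orders (status)")
after = conn.execute("EXPLAIN QUERY PLAN " + query, ("PENDING",)).fetchall()
print(after[0][3])  # detail column now mentions the index
```

The same read-plan/index/re-check cycle is what turns a 20-minute report query into a sub-minute one; only the plan syntax differs between engines.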

Programmer Analyst Trainee

2010 - 2011
Emdeon Inc., USA (via Cognizant Technology Solutions)
  • Worked on advanced claiming and a real-time payment gateway, contributing to data loading, benefit building for 4010, Minnesota, and 5010 payers, the FER service, and the data enhancement service.
  • Tuned data enhancement queries for optimal performance, reducing the response time from 20 minutes to seven seconds.
  • Prepared and tested deployment scripts for a testing environment.
Technologies: Linux, Oracle 10g

SafeJourney | Real-time Data-streaming Pipeline
I developed a full-stack application supported by a real-time streaming data pipeline that tracks flight status and provides weather updates in parallel to ensure safe flying conditions.
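The core of such a pipeline is a stream-table join: each streaming event is enriched against the latest state held in a table. A minimal in-memory sketch of that pattern (the project itself used Kafka Streams; the record shapes and names below are hypothetical):

```python
# Minimal in-memory sketch of a stream-table join, the pattern the
# SafeJourney pipeline implemented with Kafka Streams. All names and
# record shapes here are hypothetical.
weather_table = {}  # the "table" side: latest weather per airport

def update_weather(airport, conditions):
    """Table side: upsert the latest weather record for an airport."""
    weather_table[airport] = conditions

def enrich_flight(event):
    """Stream side: join each flight event with the current weather."""
    return {**event, "weather": weather_table.get(event["dest"], "unknown")}

update_weather("SFO", "clear")
update_weather("JFK", "storm")

stream = [
    {"flight": "UA100", "dest": "SFO"},
    {"flight": "DL200", "dest": "JFK"},
]
enriched = [enrich_flight(e) for e in stream]
print(enriched)
```

In Kafka Streams the dict is replaced by a changelog-backed KTable, which gives the same lookup semantics with fault tolerance.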

HR Balance for UBI Banca

I developed the database and back-end logic for this employee assessment tool for the bank. I also developed the search engine for finding the best-fit employee for an open position based on their assessment history.

GeMo Migration for UBI Banca

As the lead database developer, I was responsible for the estimation, implementation, and overall delivery of this critical and complex migration project involving the migration of stored procedures, functions, views, materialized views, shell scripts, and stand-alone application queries. I managed a team of four database developers—guiding them so that they understood the requirements and accomplished their tasks under strict deadlines.

Anagrafe Prospect for UBI Banca

I implemented the prospect customer creation process (census). I also developed data-loading procedures, contact validation (Recapiti Validazione), and address normalization involving web service calls (UTL_HTTP), file handling, and XML extraction. I created extensive error handling and logging methods using exception handling and autonomous transactions for the entire application. This project also involved a geolocalization feature for the bank’s website, for which I developed a stored procedure invoking Google web services.
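The autonomous-transaction logging mentioned above solves a specific problem: error log rows must commit even when the failing business transaction rolls back. In Oracle PL/SQL this is PRAGMA AUTONOMOUS_TRANSACTION; the sketch below approximates the idea with a second sqlite3 connection (the schema is hypothetical, and because sqlite allows only one writer at a time, the rollback happens before the log write, a restriction Oracle does not have):

```python
import os
import sqlite3
import tempfile

# Sketch of autonomous-transaction error logging (in Oracle PL/SQL:
# PRAGMA AUTONOMOUS_TRANSACTION). A second connection plays the role of
# the autonomous transaction: its commit is independent of the main one.
# Schema and names are hypothetical; sqlite3 stands in for Oracle.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
main = sqlite3.connect(path)
logger = sqlite3.connect(path)

main.execute("CREATE TABLE prospects (id INTEGER PRIMARY KEY, name TEXT)")
main.execute("CREATE TABLE error_log (msg TEXT)")
main.commit()

def log_error(msg):
    # Separate connection == separate transaction: this commit is not
    # undone by anything the main transaction does.
    logger.execute("INSERT INTO error_log (msg) VALUES (?)", (msg,))
    logger.commit()

try:
    main.execute("INSERT INTO prospects (name) VALUES (?)", ("ACME",))
    raise ValueError("address normalization failed")  # simulated failure
except ValueError as exc:
    main.rollback()      # the prospect insert is undone...
    log_error(str(exc))  # ...but the error row still commits

print(main.execute("SELECT COUNT(*) FROM prospects").fetchone()[0])  # 0
print(main.execute("SELECT msg FROM error_log").fetchall())
```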

CRM Contact History

I worked on creating database objects for the customer contact history management involving procedures, functions, and sequences. I created batch jobs to establish the communication between the contact history database and the data warehouse of the bank (InSight) and the transaction database (InAction)—using external tables, XML parsing, and MERGE statements.
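The MERGE-based sync described above boils down to an upsert: update the warehouse row when the key already exists, insert it when it does not. A minimal stand-in for Oracle's MERGE INTO ... USING, written against sqlite3 (which supports ON CONFLICT upserts in 3.24+) with a hypothetical schema:

```python
import sqlite3

# Sketch of the MERGE (upsert) pattern used to sync contact history into
# the warehouse. Oracle's MERGE INTO ... USING is emulated with SQLite's
# INSERT ... ON CONFLICT (3.24+). The schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE contact_history (customer_id INTEGER PRIMARY KEY, last_contact TEXT)"
)
conn.execute("INSERT INTO contact_history VALUES (1, '2015-01-10')")

# A staged batch, as an external table would expose it in Oracle.
incoming = [(1, "2015-02-01"), (2, "2015-02-03")]
conn.executemany(
    """
    INSERT INTO contact_history (customer_id, last_contact)
    VALUES (?, ?)
    ON CONFLICT (customer_id) DO UPDATE SET last_contact = excluded.last_contact
    """,
    incoming,
)
rows = conn.execute("SELECT * FROM contact_history ORDER BY customer_id").fetchall()
print(rows)  # customer 1 updated, customer 2 inserted
```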

IWBank Migration

IWBank is one of the banks in the UBI banking group. I implemented a data replication solution using triggers and Oracle Advanced Queuing, for which I created DML triggers, source queues, and queue tables along with the queuing procedure.
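The trigger-plus-queue replication described above can be sketched end to end: a DML trigger enqueues each change, and a dequeue step applies the backlog to the replica. The sketch uses sqlite3 triggers and a plain queue table in place of Oracle Advanced Queuing; all table names are hypothetical:

```python
import sqlite3

# Sketch of trigger-based replication: a DML trigger captures each change
# into a queue table, and a dequeue step applies the backlog to the
# replica. The real implementation used Oracle DML triggers with Advanced
# Queuing; sqlite3 and these table names are stand-ins.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER);
CREATE TABLE replica  (id INTEGER PRIMARY KEY, balance INTEGER);
CREATE TABLE change_queue (id INTEGER, balance INTEGER);

-- The "enqueue": every insert on the source lands in the queue table.
CREATE TRIGGER accounts_ai AFTER INSERT ON accounts
BEGIN
    INSERT INTO change_queue VALUES (NEW.id, NEW.balance);
END;
""")

conn.execute("INSERT INTO accounts VALUES (1, 500)")
conn.execute("INSERT INTO accounts VALUES (2, 750)")

# The "dequeue" step: drain the queue into the replica.
conn.execute("INSERT INTO replica SELECT id, balance FROM change_queue")
conn.execute("DELETE FROM change_queue")

print(conn.execute("SELECT * FROM replica ORDER BY id").fetchall())
```

Decoupling capture (trigger) from apply (dequeue) is what lets the replica lag safely instead of blocking source DML, the same design motivation behind using Advanced Queuing.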

Education

2005 - 2009

Bachelor of Technology Degree in Information Technology

Anna University - Chennai, India


Certifications

Deep Learning Specialization



Machine Learning



Oracle SQL Expert and PL/SQL Certified Developer



Libraries/APIs

Zendesk API

Tools

MySQL Workbench, PuTTY, Toad, Erwin, Oracle SQL Data Modeler, Looker, Kafka Streams, Jira, GitHub, TortoiseSVN, Git, Stitch Data

Languages

SQL, Python, C, Go

Platforms

Linux, Amazon EC2, Apache Kafka, Amazon Web Services (AWS), Oracle, Unix, Windows

Storage

SQL Performance, Oracle PL/SQL, MySQL, Amazon S3 (AWS S3), PostgreSQL, Oracle 10g, Oracle 11g, PostGIS, Oracle 12c, Redshift

Other

Unix Shell Scripting, Shell Scripting, Data Modeling, Estimations
