Seetha Venkatadri

Database Engineer and Developer in Menlo Park, CA, United States

Member since March 3, 2017
Seetha is a SQL and ETL expert with over six years of industry experience working with data. She has implemented and delivered several database projects for multiple clients, on time and exceeding expectations. Seetha completed her data engineering fellowship with Insight Data Science in 2018 and subsequently worked at an insurance tech startup as the sole data engineer, setting up the data warehousing and analytics platform for the team.



  • SQL 6 years
  • Shell Scripting 5 years
  • Linux 5 years
  • Data Modeling 2 years
  • Python 2 years
  • PostGIS 1 year
  • AWS 1 year
  • Apache Kafka 1 year


Menlo Park, CA, United States



Preferred Environment

Amazon Web Services (AWS), Linux, Oracle, GitHub, MySQL

The most amazing...

...thing I've done is learn to work with big data engineering tools in seven weeks and build a portfolio project at Insight Data Science.


  • Data Engineer

    2018 - 2019
    Go Car Insurance
    • Integrated data from different sources such as AWS RDS, Zendesk, and Facebook Analytics.
    • Migrated from BigQuery to AWS Redshift.
    • Collaborated with a data analyst and a data scientist to provide quality data for explorations.
    • Set up Looker and created dashboards that provided insights and encouraged other team members to explore data on their own.
    Technologies: Zendesk API, PostGIS, PostgreSQL, Looker, Go, Python, Redshift
  • Data Engineering Fellow

    2018 - 2018
    Insight Data Science
    • Built a full stack application supported by a real-time streaming data pipeline that tracks flight status and provides weather updates in parallel to ensure safe flying conditions.
    • Effectively implemented Stream-Table joins using Kafka Streams and geospatial search on streaming data using PostGIS.
    • Learned cutting-edge technologies in a fast-paced environment, incorporating constructive feedback from mentors and peers.
    Technologies: Kafka Streams, Flask, Python, PostgreSQL, Apache Kafka
  • Lead Database Developer

    2013 - 2016
    UBI Banca, Italy (via Accenture)
    • Led the database team handling multiple work requests, contributing to the development and enhancement of six key bank applications: HR-Balance, Anagrafe-Prospect, IWB Migration, CRM Contact History, CRM In Action, and GeMo Migration.
    • Performed overall workflow management for the database team using Rational ClearQuest.
    • Suggested code review and version control tools to the client team to improve code quality.
    • Trained entry-level engineers as part of Accenture's development initiatives.
    Technologies: MySQL, Shell Scripting, Oracle PL/SQL
  • Senior Database Developer

    2012 - 2013
    Centrica, UK (via Cognizant Technology Solutions)
    • Worked on a WMIS database upgrade: ported Pro*C, Perl, shell, and SQL scripts from Solaris to Linux, and upgraded the database from Oracle 9i to Oracle 11g.
    • Led a team of engineers in identifying migration issues and debugging the defects.
    • Performed estimation, planning, and task allocation for the project.
    • Supported user acceptance testing.
    Technologies: Shell Scripting, Oracle PL/SQL
  • Database Developer

    2011 - 2012
    Avon, Brazil (via Cognizant Technology Solutions)
    • Worked on the ASLCP Pre-Rezoning project, reimplementing pre-rezoning logic from Java in PL/SQL, and updated the physical data model using Erwin.
    • Refreshed the ARRM hardware, migrating scripts and performing unit testing; delivered with zero defects.
    Technologies: Linux, Oracle 10g
  • Programmer Analyst

    2011 - 2011
    Comcast, USA (via Cognizant Technology Solutions)
    • Worked on commercial workbench reports which involved converting FoxPro code to Oracle SQL queries for reporting order cancellations and pending orders.
    • Conducted query tuning using explain plans.
    Technologies: Oracle 10g
  • Programmer Analyst Trainee

    2010 - 2011
    Emdeon Inc., USA (via Cognizant Technology Solutions)
    • Worked on advanced claiming and a real-time payment gateway, contributing to data loading, benefit building for 4010, Minnesota, and 5010 payers, the FER service, and the data enhancement service.
    • Tuned data enhancement queries for optimal performance, bringing the response time down from 20 minutes to seven seconds.
    • Prepared and tested deployment scripts for a testing environment.
    Technologies: Linux, Oracle 10g


  • SafeJourney | Real-time Data-streaming Pipeline (Development)

    I developed a full-stack application supported by a real-time streaming data pipeline that tracks flight status and provides weather updates in parallel to ensure safe flying conditions.
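    The geospatial side of such a pipeline can be sketched as a single PostGIS query. This is an illustrative example only; the table names, columns, and flight ID are hypothetical, not taken from the actual project:

    ```sql
    -- Illustrative: find weather stations within 50 km of a flight's
    -- current position (locations stored as geography points, so
    -- ST_DWithin's distance argument is in meters).
    SELECT s.station_id,
           s.latest_conditions,
           ST_Distance(s.location, f.position) AS distance_m
    FROM   weather_stations s
    JOIN   flight_positions f ON f.flight_id = 'UA123'  -- hypothetical ID
    WHERE  ST_DWithin(s.location, f.position, 50000)
    ORDER  BY distance_m;
    ```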

  • HR Balance for UBI Banca (Development)

    I developed the database and back-end logic for this employee assessment tool for the bank. I also developed the search engine for finding the best-fit employee for an open position based on their assessment history.

  • GeMo Migration for UBI Banca (Development)

    As the lead database developer, I was responsible for the estimation, implementation, and overall delivery of this critical and complex migration project involving the migration of stored procedures, functions, views, materialized views, shell scripts, and stand-alone application queries. I managed a team of four database developers—guiding them so that they understood the requirements and accomplished their tasks under strict deadlines.

  • Anagrafe Prospect for UBI Banca (Development)

    I implemented the prospect customer creation process (census). I also developed data-loading procedures, contact validation (Recapiti Validazione), and address normalization involving web service calls (UTL_HTTP), file handling, and XML extraction. I created extensive error handling and logging methods using exception handling and autonomous transactions for the entire application. This project also involved a geolocalization feature for the bank’s website, for which I developed a stored procedure invoking Google web services.
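    The autonomous-transaction logging pattern mentioned above can be sketched as follows; the procedure and table names are hypothetical, not from the actual application:

    ```sql
    -- Illustrative sketch: PRAGMA AUTONOMOUS_TRANSACTION lets the error
    -- row commit even if the caller's transaction later rolls back.
    CREATE OR REPLACE PROCEDURE log_error (
        p_module  IN VARCHAR2,
        p_message IN VARCHAR2
    ) AS
        PRAGMA AUTONOMOUS_TRANSACTION;
    BEGIN
        INSERT INTO error_log (logged_at, module, message)  -- assumed table
        VALUES (SYSTIMESTAMP, p_module, p_message);
        COMMIT;  -- commits only this procedure's insert
    END log_error;
    /
    ```

    A typical call site would invoke `log_error` from a `WHEN OTHERS` exception handler before re-raising, so the failure is recorded regardless of the outcome of the enclosing transaction.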

  • CRM Contact History (Development)

    I worked on creating database objects for the customer contact history management involving procedures, functions, and sequences. I created batch jobs to establish the communication between the contact history database and the data warehouse of the bank (InSight) and the transaction database (InAction)—using external tables, XML parsing, and MERGE statements.
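    The external-table-plus-MERGE pattern described above can be sketched like this; table and column names are illustrative assumptions, not the project's actual schema:

    ```sql
    -- Illustrative upsert: rows staged in an external table (a flat file
    -- exposed as a read-only table) are merged into the target table.
    MERGE INTO contact_history tgt
    USING contact_history_ext src          -- assumed external table
       ON (tgt.contact_id = src.contact_id)
    WHEN MATCHED THEN
        UPDATE SET tgt.channel      = src.channel,
                   tgt.contact_date = src.contact_date
    WHEN NOT MATCHED THEN
        INSERT (contact_id, channel, contact_date)
        VALUES (src.contact_id, src.channel, src.contact_date);
    ```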

  • IWBank Migration (Development)

    IWBank is one of the banks in the UBI banking group. I implemented a data replication solution using triggers and Oracle Advanced Queuing (AQ), creating DML triggers, source queues, and queue tables along with the enqueue procedure.
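    A minimal sketch of the trigger-to-queue half of such a replication setup, assuming a hypothetical `accounts` table, queue name, and payload object type (none of these are from the actual project):

    ```sql
    -- Illustrative: a DML trigger enqueues each change onto an Oracle AQ
    -- queue, where a downstream consumer applies it to the replica.
    CREATE OR REPLACE TRIGGER trg_accounts_replicate
    AFTER INSERT OR UPDATE ON accounts
    FOR EACH ROW
    DECLARE
        enq_opts  DBMS_AQ.ENQUEUE_OPTIONS_T;
        msg_props DBMS_AQ.MESSAGE_PROPERTIES_T;
        msg_id    RAW(16);
        payload   account_change_t;  -- assumed object type for the payload
    BEGIN
        payload := account_change_t(:NEW.account_id, :NEW.balance);
        DBMS_AQ.ENQUEUE(
            queue_name         => 'accounts_change_q',  -- assumed queue
            enqueue_options    => enq_opts,
            message_properties => msg_props,
            payload            => payload,
            msgid              => msg_id);
    END;
    /
    ```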


  • Languages

    SQL, Python, C, Go
  • Tools

    MySQL Workbench, PuTTY, Toad, Erwin, Oracle SQL Data Modeler, Looker, Kafka Streams, Jira, GitHub, TortoiseSVN, Git, Stitch Data
  • Storage

    SQL Performance, SQL Developer, Oracle PL/SQL, MySQL, AWS S3, PostgreSQL, Oracle 10g, Oracle 11g, PostGIS, Oracle 12c, Redshift
  • Other

    Unix Shell Scripting, Shell Scripting, Data Modeling, AWS, Estimations
  • Platforms

    Linux, AWS EC2, Apache Kafka, Amazon Web Services (AWS), Oracle, Unix, Windows
  • Libraries/APIs

    Zendesk API


  • Bachelor of Technology degree in Information Technology
    2005 - 2009
    Anna University - Chennai, India


  • Deep Learning Specialization
  • Machine Learning
    MARCH 2018 - PRESENT
  • Oracle SQL Expert and PL/SQL Certified Developer
