Syed Akbar Naqvi

Data Analysis Developer in Patna, Bihar, India

Member since June 18, 2020
Syed has more than 15 years of experience working in IT for the banking, insurance, retail, and telecom industries. He has designed and developed solutions for high-performing, multi-terabyte databases on platforms including Oracle and AWS, and has integrated marketing data sources such as Google Analytics and IBM Silverpop. Syed is always looking for exciting challenges to work on.
Preferred Environment

Amazon Web Services (AWS), Toad, Windows, Amazon EC2, Linux, Unix

The most amazing...

...thing I've built is a real-time DWH and labor scheduler spanning multiple databases and environments, with code that integrates data from different sources.


  • Senior Technical Architect

    2011 - 2019
    Nexgen Technology Services Pvt Ltd
    • Designed and developed ETL customizations for US retailers' retail merchandising systems, supporting daily business analytics on an Oracle database using ORDM, OWB, PL/SQL, and Oracle Scheduler. Coded the complex business logic in SQL and PL/SQL.
    • Designed and developed the ETL for an AWS-cloud-based DWH using AWS Redshift. Integrated data from multiple sources, such as flat files on SFTP, AWS S3, Google Analytics extracts, and IBM Silverpop, into a single source of truth.
    • Rigorously used open-source technologies such as Python, TOS DI, and the SOS scheduler to minimize operating costs.
    • Created a well-maintained, end-to-end architecture for data flows from different sources that can execute independently with minimal or no user interaction.
    • Performed day-to-day maintenance and recommendation tasks on multiple platforms, including Unix, Linux, Windows, AWS, Redshift, and Oracle, along with other database administration activities.
    • Implemented performance tuning of queries and code as needed.
    • Led the team to sort out the issues on all technical aspects of the database and ETL-related tasks.
    • Designed and developed the data model for a large project related to the labor management in retail.
    Technologies: Amazon Web Services (AWS), ETL, SOS Berlin Scheduler, Database Administration (DBA), Business Intelligence (BI), Data Warehouse Design, Data Warehousing, Data Modeling, Talend, Python, PostgreSQL, Redshift, Oracle SQL, Data Engineering, Oracle PL/SQL, SQL, PL/pgSQL, Amazon S3 (AWS S3), Technical Architecture, ETL Implementation & Design
  • Technical Writer

    2016 - 2018
    IAmOnDemand (via Toptal)
    • Worked on approximately 15 articles for technology audiences such as CIOs, database administrators, developers, and cloud architects.
    • Wrote several in-depth articles (5-10 pages long, including tables of contents), all of which were published and read by thousands of people. Topics included cloud skill sets, AWS Redshift, RDS vs. on-premises DBaaS, and Aurora vs. RDS.
    • Fact-checked the article content and ensured it was not plagiarized.
    Technologies: Amazon Web Services (AWS), DevOps, Database as a Service (DBaaS), Redshift, Cloud
  • Senior Consultant

    2005 - 2009
    Capgemini Consulting India Pvt Ltd
    • Supported a large Java development team of 50 or more people by writing Oracle database queries and creating views, procedures, and functions. Worked as part of the core database team to deliver different use cases.
    • Created hundreds of Oracle procedures and packages for all DML operations for one of the top public-sector (PSU) clients in the Netherlands, using dynamic SQL to speed up development.
    • Designed and worked on the deliverables for one of the top public-sector companies in the Netherlands, which later resulted in greater monetary gains for the organization.
    • Worked as the only DBA to support all the instances of the Oracle databases used by the projects. Tasks involved setting up the database, loading data, and tuning the performance for the development team.
    • Worked on site with a client in the Netherlands for requirement gathering and deployment of projects.
    • Worked as a moderator to deliver a complex and challenging project involving a near real-time DWH. This used Oracle Streams and custom programming to load data from the OLTP environment to the OLAP environment without putting any load on the source system.
    Technologies: Shell Scripting, Database Administration (DBA), Erwin, Oracle Streams, Toad, Linux, AIX, PL/SQL, Oracle SQL


  • RMS Connector

    RMS Connector works as an interface between a retail merchandising system (RMS) and Oracle retail data warehouse (ORDM).

    It easily integrates the RMS data feeds into ORDM for all levels of sales and inventory reporting and is used by the top retailers in the US and Central America.

    Work Done:
    • Built all the ETL and data flow from Flat files to the Oracle database using the Oracle Warehouse Builder.
    • Developed PLSQL packages and procedures for all new files from RMS to ORDM.
    • Made recommendations and was involved in the planning and setup of the database architecture for production.
    • Tuned the database and report performance.
    • Set up ETL automation using Oracle Scheduler Chains.
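The scheduler-chain automation above follows a dependency-chain pattern that can be sketched in Python (an illustration of the pattern only, not Oracle Scheduler's API; step names and dependencies are hypothetical):

```python
# Illustrative sketch: running ETL steps as a dependency chain,
# mirroring the pattern Oracle Scheduler Chains automates.
# Step names and dependencies are hypothetical examples.

def run_chain(steps, deps):
    """Execute steps in an order that respects dependencies.

    steps: dict mapping step name -> zero-arg callable
    deps:  dict mapping step name -> list of prerequisite step names
    Returns the list of step names in execution order.
    """
    done, order = set(), []

    def run(name):
        if name in done:
            return
        for prereq in deps.get(name, []):
            run(prereq)          # run prerequisites first
        steps[name]()            # then run the step itself
        done.add(name)
        order.append(name)

    for name in steps:
        run(name)
    return order

log = []
steps = {
    "load_flat_files": lambda: log.append("load"),
    "transform_sales": lambda: log.append("transform"),
    "refresh_reports": lambda: log.append("refresh"),
}
deps = {
    "transform_sales": ["load_flat_files"],
    "refresh_reports": ["transform_sales"],
}
order = run_chain(steps, deps)
# order -> ["load_flat_files", "transform_sales", "refresh_reports"]
```

In Oracle Scheduler, the equivalent would be a chain with rules so each step starts only after its predecessor succeeds.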

  • Customer Segmentation

    This project is a SaaS application running on AWS.

    The architecture includes:
    • Redshift: for the storage and reporting of data
    • Python: for data processing based on user input
    • SOS Berlin Scheduler: a scheduler for executing Python scripts asynchronously based on user input
    • UI: for user input
    • Tableau: for reporting on segments created
    • Environment: EC2, Redshift

    My job was to design and develop the end-to-end flow of data and develop the required APIs for data processing.

    1) The user creates a segment model by selecting different customer-related KPIs and then submits the job.
    2) The UI submits the job by making a call to the SOS REST API and the customer segmentation Python library.
    3) The Python library examines the input provided in the REST call and, based on it, decides the next processing step.
    4) The data is then processed and is ready to be picked up by Tableau.

    An initial data load is required for customer transactions and KPI preparation.
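Step 3 of the flow above can be sketched as a small decision function over the job payload (a minimal illustration; the payload shape and return values are assumptions, not the project's actual API):

```python
# Sketch of the decision step: inspect the segmentation job payload
# (shape is hypothetical) and decide the next processing action.

def decide_next_step(payload):
    """Return the processing action for a segmentation job request."""
    kpis = payload.get("kpis", [])
    if not kpis:
        return "reject: no KPIs selected"
    if payload.get("initial_load"):
        return "run initial transaction load and KPI preparation"
    return "build segment from KPIs: " + ", ".join(kpis)

print(decide_next_step({"kpis": ["recency", "frequency"]}))
# -> build segment from KPIs: recency, frequency
```

A dispatcher like this keeps the REST layer thin: validation and routing happen in one place before any heavy Redshift processing starts.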

  • Store Operations

    This is a mobile application that gives store managers a near real-time view of store performance in terms of sales, traffic, conversion, and associate contributions.

    Work Done:
    • Developed the data model for layering including the dimensions, facts, and aggregates.
    • Built ETL procedures using Talend, PLINK, Python, and the SOS Berlin Scheduler.
    • Wrote shell scripts to manage the data feeds, which invoked Python scripts to process the files from S3 to a Redshift database.
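The S3-to-Redshift step can be sketched as composing a Redshift COPY statement for a newly landed file (bucket, table, and IAM role names are hypothetical; a real script would execute the statement over a database connection, e.g. with psycopg2):

```python
# Sketch of the S3-to-Redshift load: build a Redshift COPY statement
# for a file that has landed in S3. All names below are illustrative.

def build_copy_sql(table, bucket, key, iam_role, delimiter="|"):
    """Compose a COPY statement loading a gzipped delimited file."""
    return (
        f"COPY {table} "
        f"FROM 's3://{bucket}/{key}' "
        f"IAM_ROLE '{iam_role}' "
        f"DELIMITER '{delimiter}' GZIP TIMEFORMAT 'auto';"
    )

sql = build_copy_sql(
    table="staging.sales_feed",
    bucket="retail-feeds",
    key="2019/06/sales.gz",
    iam_role="arn:aws:iam::123456789012:role/redshift-load",
)
```

Generating the statement from the feed's metadata keeps one code path for every file type, which is what makes a shell-plus-Python pipeline like this maintainable.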

  • Labor Scheduler

    Labor Scheduler helps the store manager to connect predicted traffic demand with staff productivity and availability to optimize conversion.

    Work Done:
    • Created the end-to-end data model.
    • Developed the procedures and APIs for the data operation from the REST APIs.
    • Implemented version control for the database rows.
    • Set up AWS RDS PostgreSQL to keep costs within budget while achieving high throughput.
    • Set up data interactions between multiple databases using PostgreSQL database links to a Redshift database for extracting analytical data.
    • Rigorously used PL/pgSQL, Python, PostgreSQL, and Redshift for managing data.
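The row-level version control mentioned above can be illustrated in memory (a sketch of the pattern only; field names are assumptions, and in PostgreSQL the same idea is typically a version column plus an is_current flag):

```python
# Sketch of row versioning: instead of updating in place, each change
# inserts a new row version and closes the previous one.
# Table shape and field names are illustrative.
import itertools

_version_seq = itertools.count(1)

def save_version(history, row_id, data):
    """Append a new version of a row, marking the prior one inactive."""
    for row in history:
        if row["row_id"] == row_id and row["is_current"]:
            row["is_current"] = False      # close the old version
    history.append({
        "row_id": row_id,
        "version": next(_version_seq),
        "data": data,
        "is_current": True,
    })

history = []
save_version(history, row_id=42, data={"shift": "morning"})
save_version(history, row_id=42, data={"shift": "evening"})
current = [r for r in history if r["is_current"]]
# current -> one row: version 2, shift "evening"
```

Keeping superseded versions makes schedule changes auditable and lets analytical queries reconstruct the state at any point in time.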

  • IX Marketing

    The IX Marketing module helps retailers understand their customers and make the right decisions at the right time to increase sales, both online and in store.

    Work Done:
    • Created the data model and set up the environment using the Redshift database.
    • Extracted and loaded data from multiple sources like SFTP, S3, Google Analytics API, and IBM Silverpop.
    • Created Python scripts to automate the data loading.
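The multi-source load automation can be sketched as a normalization step that maps each feed's records onto a common schema before loading (the per-source field names below are assumptions for illustration; real Google Analytics and Silverpop extracts differ in detail):

```python
# Sketch of multi-source normalization: map records from different
# marketing feeds onto one schema before loading. Field names per
# source are hypothetical examples.

def normalize(record, source):
    """Map a source-specific record onto a common schema."""
    if source == "google_analytics":
        return {"customer_id": record["clientId"],
                "event": record["eventAction"],
                "source": source}
    if source == "silverpop":
        return {"customer_id": record["RECIPIENT_ID"],
                "event": record["EVENT_TYPE"],
                "source": source}
    raise ValueError(f"unknown source: {source}")

rows = [
    normalize({"clientId": "c1", "eventAction": "purchase"}, "google_analytics"),
    normalize({"RECIPIENT_ID": "r9", "EVENT_TYPE": "open"}, "silverpop"),
]
```

Converging all feeds to one schema at the edge means the downstream Redshift model and reports never need per-source logic.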

  • Ministerie van Defensie

    The MoD is the biggest employer in the Netherlands, with nearly 90,000 employees all over the world.

    Work Done:
    • Implemented the new design recommended during the POC.
    • Installed Oracle streams between Oracle 10g and Oracle 9i databases for real-time extractions.
    • Developed the new logic for ETL processes for near real-time transformation in ODM.
    • Set up batch jobs for the periodical load of transformed data into CDM from ODM for Cognos reporting.
    • Configured selected PeopleSoft HRMS tables for Streams configuration.
    • Performance-tuned their current system.

  • Eneco Energies

    Eneco Energies is one of the largest energy companies in the Netherlands.

    Work Done:
    • Successfully designed and implemented an MVS system within the SOA-enabled architecture.
    • Tuned the physical model for performance improvement in ETL processes.
    • Successfully segregated all the objects pertaining to one functional area into a separate database, amounting to 300 GB out of 2 TB.
    • Worked on data modeling, physical design, and database administration.
    • Performance-tuned the largest tables with up to 150 partitions, amounting to 400 GB alone.
    • Developed a mechanism to automate the setup of a testing environment.

  • Policy Administration System of the Netherlands

    The policy administration system interfaces primarily with the tax and administration department and is integral to the reintegration of work processes and employee tax/income information processing.

    Role: Database Team Member | DBA
    Work Done:
    • Worked on all activities of an Oracle DBA and developer ranging from the logical and physical design, administration, PL/SQL, scripting, to communication with the front office about the CRs and use cases.
    • Created and modified the DSS and OLTP physical data model.
    • Performed database administration—sizing, backup recovery strategy planning, and implementation.
    • Handled database maintenance and release activities.
    • Performed performance tuning.


  • Languages

    SQL, PL/pgSQL, Python
  • Paradigms

    ETL Implementation & Design, Business Intelligence (BI), ETL, DevOps
  • Storage

    PostgreSQL, Oracle PL/SQL, Redshift, Oracle Rdb, PL/SQL, Amazon S3 (AWS S3), Oracle SQL, Database Administration (DBA)
  • Other

    Technical Architecture, Data Analysis, Writing & Editing, Data Modeling, Data Engineering, Performance Tuning, Oracle Streams, Shell Scripting, Data Warehousing, Database as a Service (DBaaS), Data Warehouse Design, Data Build Tool (dbt)
  • Tools

    AWS Deployment, SOS Berlin Scheduler, Toad, Erwin, Talend ETL, Apache Airflow
  • Platforms

    Linux, Amazon Web Services (AWS), Unix, Amazon EC2, Windows, Talend, AIX, Databricks


  • Master's Degree in Computer Science
    2005 - 2007
    Vinayaka Missions University - Patna, India


  • 1Z0-052 Oracle Database 11g Admin - 1
    Oracle University
