Priyank Singh

Data Warehouse Design Developer in Toronto, ON, Canada

Member since January 30, 2017
Priyank is a database expert specializing in database design, big data pipelines, and real-time big data ingestion. He migrated a legacy client to Azure SQL, increasing revenue by $10 million per year, and streamlined the entire DW process across three countries to enable real-time tracking of financial transactions. Priyank excels at Databricks, enterprise data lake (EDL) integration, migrations, and cloud projects, helping clients deliver complex data projects smoothly.


  • Confidential
    Microsoft Power BI, Oracle, Unix, Azure, Databricks, SQL, EDL, Python, Spark
  • Fusion Software Solution
    Spark, SAP Business Object Processing Framework (BOPF), Informatica ETL...
  • Accenture
    Apache Hive, Hadoop, Business Objects, DataStage, Teradata, Big Data...



Toronto, ON, Canada



Preferred Environment

Big Data, Hadoop, SQL, Microsoft Power BI, Tableau, Databricks, Teradata, ETL, Azure, Windows

The most amazing... pipelines I built migrated data from the mainframe to Oracle, which enabled the automation of manual processes and empowered analytics.


  • Senior Data Engineer

    2019 - PRESENT
    Confidential
    • Built an ETL process to migrate data from the mainframe to Oracle, which enabled the client to automate manual processes, eventually saving ten million dollars per year.
    • Completed data integration of customer information from the mainframe, SQL sources, blob containers, and flat files into a Hadoop cluster (Hive data lake) using ETL jobs, then transformed the data and ingested it into Oracle.
    • Analyzed the performance of Hadoop components (Hive queries and ingestion scripts) and optimized data pipelines. Resolved production defects after root cause analysis and deployed code fixes to production.
    • Performed on-call production support (L3) for Hadoop and production deployment activities.
    • Served as the lead, performed requirement analysis for epics and data profiling, created user stories, and participated in sprint planning activities with the client's scrum master and product manager.
    • Created visualization reports with tools such as Tableau and Power BI.
    • Built data models around the financial, insurance, and transportation domains from disparate data sources.
    • Converted complex reports to work fluidly in Power BI.
    Technologies: Microsoft Power BI, Oracle, Unix, Azure, Databricks, SQL, EDL, Python, Spark
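Mainframe-to-Oracle pipelines like the one above typically begin by parsing fixed-width mainframe extracts into structured records before staging them for the relational load. A minimal Python sketch, assuming a hypothetical record layout (real layouts come from the mainframe team's copybooks):

```python
# Sketch: parse a fixed-width mainframe extract into structured records
# before staging them for a relational load. The field layout below is
# hypothetical -- actual layouts are defined by COBOL copybooks.
LAYOUT = [                 # (field name, start offset, length)
    ("account_id", 0, 10),
    ("txn_date", 10, 8),   # YYYYMMDD
    ("amount", 18, 12),    # exported digits with two implied decimal places
]

def parse_record(line: str) -> dict:
    rec = {name: line[start:start + length].strip()
           for name, start, length in LAYOUT}
    # Apply the implied decimal point common in mainframe extracts.
    rec["amount"] = int(rec["amount"]) / 100
    return rec

rows = [parse_record(l) for l in ["000000004220230115000000012345"]]
print(rows[0])  # {'account_id': '0000000042', 'txn_date': '20230115', 'amount': 123.45}
```

Once parsed, records of this shape can be bulk-loaded into staging tables with the database's native loader.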
  • Senior Software Engineer

    2015 - 2020
    Fusion Software Solution
    • Re-engineered systems to comply with the GDPR for the European market. Developed a Teradata batch process to transform staged data and load dimension and fact tables. Created Unix/SQL/PL/SQL scripts to offload data back to Hive tables.
    • Designed an ETL flow in the Control-M scheduler to trigger batch processes and Informatica jobs. Set up dependencies to prevent data deadlocks and created alerts to notify stakeholders of errors and warnings.
    • Integrated Digital Insight's (an acquired company) data into NCR's data warehouse so that new GL and inventory values would flow into the EDW for reporting.
    • Created P&L metrics, user dashboards reporting the most and least profitable customers, and dashboards with YTD revenue and cost metrics by line of business. Reconciled dashboard revenue numbers against reported revenue figures.
    • Built, maintained, and tuned Tableau and Power BI dashboards for a broad variety of internal clients.
    Technologies: Spark, SAP Business Object Processing Framework (BOPF), Informatica ETL, Tableau, Unix, Teradata, PL/SQL
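The staged-data-to-dimension/fact load described above follows a standard two-step pattern: populate the dimension with new members first, then load facts by resolving natural keys to surrogate keys. A minimal sketch using in-memory SQLite as a stand-in for Teradata; the table and column names are hypothetical:

```python
import sqlite3

# Sketch of a staged-data -> dimension/fact batch load, using SQLite as a
# stand-in for Teradata. Schema and names are hypothetical.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE stg_sales(customer TEXT, sale_dt TEXT, amount REAL);
    CREATE TABLE dim_customer(customer_sk INTEGER PRIMARY KEY, customer TEXT UNIQUE);
    CREATE TABLE fct_sales(customer_sk INTEGER, sale_dt TEXT, amount REAL);
""")
con.executemany("INSERT INTO stg_sales VALUES (?,?,?)",
                [("acme", "2023-01-01", 100.0), ("acme", "2023-01-02", 50.0)])

# Step 1: add new dimension members; surrogate keys are assigned automatically.
con.execute("""
    INSERT OR IGNORE INTO dim_customer(customer)
    SELECT DISTINCT customer FROM stg_sales
""")
# Step 2: load facts, resolving each natural key to its surrogate key.
con.execute("""
    INSERT INTO fct_sales(customer_sk, sale_dt, amount)
    SELECT d.customer_sk, s.sale_dt, s.amount
    FROM stg_sales s JOIN dim_customer d ON d.customer = s.customer
""")
total = con.execute("SELECT SUM(amount) FROM fct_sales").fetchone()[0]
print(total)  # 150.0
```

In Teradata the same pattern would run as BTEQ SQL inside the batch process, with the scheduler handling step dependencies.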
  • Senior Software Engineer

    2014 - 2015
    Accenture
    • Gathered and defined business requirements while managing risks to improve business processes, contributing to enterprise architecture development from business analysis through process mapping.
    • Managed ETL (Teradata, Informatica, DataStage), SQL, and database performance tuning, troubleshooting, support, and capacity estimation to ensure the highest data quality standards.
    • Developed Informatica ETL mappings; Teradata BTEQ, FastExport, FastLoad, MultiLoad, and TPT scripts; Oracle PL/SQL scripts; and Unix shell scripts. Optimized SQL queries and ETL mappings to efficiently handle huge data volumes and complex transformations.
    Technologies: Apache Hive, Hadoop, Business Objects, DataStage, Teradata, Big Data, Data Warehouse Design
  • Consultant

    2012 - 2014
    • Created dashboards in Power BI and Tableau, built to capture a 360-degree view of customer information for a leading European bank.
    • Designed and created data models and built batch processes to populate them.
    • Manipulated data using Power Query on top of views to provide security and improve performance.
    Technologies: Azure, Apache Hive, Teradata, Microsoft Power BI, SQL, Spark
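Serving Power Query from a view, as in the last bullet above, keeps sensitive columns out of the reporting layer and pushes filtering into the database. A minimal sketch with a hypothetical schema, again using SQLite for illustration:

```python
import sqlite3

# Sketch: expose only non-sensitive, pre-filtered columns through a view so
# the reporting tool (e.g., Power Query) never sees the raw table.
# The schema and column names are hypothetical.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers(id INTEGER, name TEXT, ssn TEXT, region TEXT, active INTEGER);
    INSERT INTO customers VALUES
        (1, 'Ann', '111-11-1111', 'EU', 1),
        (2, 'Bob', '222-22-2222', 'EU', 0);
    -- The view hides the SSN column and pre-filters inactive rows,
    -- so that work happens in the database, not in the report.
    CREATE VIEW v_customers AS
        SELECT id, name, region FROM customers WHERE active = 1;
""")
cols = [c[0] for c in con.execute("SELECT * FROM v_customers").description]
print(cols)  # ['id', 'name', 'region']
```

The reporting tool then connects to `v_customers` only, and grants on the base table can be revoked.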
  • Software Engineer

    2010 - 2012
    Tata Consultancy Services
    • Built ETL processes in PostgreSQL to process huge volumes of data.
    • Created metadata tables to make bottlenecks easy to identify and built dashboards to highlight them.
    • Managed professional services and implemented general ledger reports in Power BI. Performed advanced calculations in the database by offloading some processing from Power BI to the database, which improved performance.
    Technologies: Spark, Big Data, Tableau, Python, Scala, Databricks
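Metadata tables like those mentioned above usually record, per pipeline step, a row count and elapsed time so a dashboard can rank steps and surface the slowest ones. A minimal sketch (the table layout is hypothetical):

```python
import sqlite3
import time

# Sketch: a run-metadata table that records per-step row counts and elapsed
# time, so a dashboard can rank steps by duration and highlight bottlenecks.
# The schema is hypothetical.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE etl_step_log(
    step TEXT, rows_processed INTEGER, seconds REAL)""")

def run_step(name, fn):
    """Run one ETL step and log its row count and duration."""
    start = time.perf_counter()
    rows = fn()
    con.execute("INSERT INTO etl_step_log VALUES (?,?,?)",
                (name, rows, time.perf_counter() - start))

run_step("extract", lambda: 1_000)   # stand-ins for real ETL steps that
run_step("transform", lambda: 950)   # return the number of rows processed

# A dashboard query would rank steps by duration to spot the bottleneck.
slowest = con.execute(
    "SELECT step FROM etl_step_log ORDER BY seconds DESC LIMIT 1").fetchone()[0]
```

The same pattern works in PostgreSQL with the log table living alongside the warehouse tables.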


  • Hercules Funnel

    This project was designed to identify potential opportunities in the funnel and provide 360-degree reporting based on sizing and estimation, determining which opportunities are closest to a win.

    I managed a team of four, analyzed business requirements, created user stories, and finalized sprint requirements with the project owner. I then created technical specifications, data models, ETL jobs, Control-M jobs, and supported pre-deployment and post-deployment validations.


    This data warehousing (DW) project was developed to streamline the entire process for three countries (Saudi Arabia, Egypt, and Nigeria), consolidate country-specific data, and enable real-time tracking of financial transactions. Transactions are tracked on a daily, weekly, and monthly basis through an extract, load, and transform process built on Informatica, Teradata, and SAP BusinessObjects. GL, sub-ledger, and HFM (Hyperion Financial Management) were additional modules used.

    Created detailed-level design documents and developed Teradata SQL/BTEQ, Oracle PL/SQL, FastExport, and MultiLoad scripts to process EDW data into multidimensional data extracts. Developed workflows and batch processes to transform the data and load it into dimension and fact tables.

    Performed root-cause analysis and preemptive diagnosis to prevent issues related to financial data consolidation, and resolved data issues.
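Tracking the same financial transactions at daily, weekly, and monthly grains, as described above, is usually validated with a reconciliation check: the rollups must tie back to the daily grain. A minimal sketch with hypothetical figures:

```python
# Sketch: verify that a monthly rollup reconciles with the daily grain --
# the kind of consistency check used when financial transactions are
# tracked at multiple frequencies. The data values are hypothetical.
daily = {"2023-01-01": 100.0, "2023-01-02": 250.0, "2023-02-01": 75.0}

monthly = {}
for day, amount in daily.items():
    month = day[:7]                    # "YYYY-MM" prefix of the date
    monthly[month] = monthly.get(month, 0.0) + amount

# Reconciliation: the monthly totals must sum to the daily totals.
assert sum(monthly.values()) == sum(daily.values())
print(monthly)  # {'2023-01': 350.0, '2023-02': 75.0}
```

In the warehouse this becomes a GROUP BY comparison between the daily fact table and the monthly aggregate, alerting when the two diverge.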


  • Languages

    SQL, Python, Scala, Snowflake
  • Frameworks

    Hadoop, Spark
  • Tools

    Informatica ETL, Teradata SQL Assistant, Tableau, Microsoft Power BI, IBM InfoSphere (DataStage)
  • Paradigms

    ETL, Database Development
  • Platforms

    Windows, Azure, Unix, Oracle, Databricks
  • Storage

    Teradata, Teradata Databases, DataStage, Apache Hive, MySQL, Data Pipelines, PostgreSQL, PL/SQL Developer, SQL Server Integration Services (SSIS), Databases, PL/SQL
  • Other

    Cloud Architecture, EDL, Big Data, Business Objects, Data Warehouse Design, Data Visualization, Reporting, Reports, Data Engineering, Data Analysis, Shell Scripting, Informatica, CSV File Processing, SAP Business Object Processing Framework (BOPF)


  • Microsoft Azure Architect Technologies Certified
  • Teradata 12 Certified Technical Specialist
  • Microsoft Certified Solution Developer
    JUNE 2012 - PRESENT
