Priyank Singh

Software Developer in Toronto, Canada

Member since December 15, 2020
Priyank is a Python and database expert specializing in database design, big data pipelines, data modeling, and real-time data ingestion. He migrated a legacy client to Azure SQL, increasing revenue by $10 million per year, and streamlined the data warehouse processes of three countries to provide real-time tracking of financial transactions. Priyank excels at Databricks, enterprise data lake (EDL) integration, migrations, and cloud projects, helping clients deliver complex data projects smoothly.


    Microsoft Power BI, Azure, Databricks, SQL, EDL, Python, Spark, MariaDB, AWS...
  • Fusion Software Solution
    Spark, SAP Business Object Processing Framework (BOPF), Informatica ETL...
  • Accenture
    Apache Hive, Hadoop, Business Objects, Datastage, Teradata, Big Data...



Toronto, Canada



Preferred Environment

Big Data, Hadoop, SQL, Microsoft Power BI, Tableau, Databricks, ETL, Azure, Python

The most amazing... pipelines I built migrated data from the mainframe to Oracle, enabling the automation of manual processes and empowering analytics.


  • Senior Data Engineer

    2019 - PRESENT
    • Built a Python ETL process to migrate data from the mainframe to Oracle, which enabled the client to automate manual processes, eventually saving ten million dollars per year. KPIs are reported in Power BI.
    • Integrated customer information from the mainframe, SQL sources, blob containers, and flat files into a Hadoop cluster and Hive data lake using ETL jobs, then transformed the data and ingested it into Oracle.
    • Generated advanced Tableau dashboards with quick/context/global filters, parameters, and calculated fields that allowed customer units to track and improve their KPIs by 12% within a month.
    • Deployed over ten analytics projects for all departments using Tableau 8.x.
    • Created visualization reports with tools such as Tableau and Power BI.
    • Built data models around the financial, insurance, and transportation domains from disparate data sources.
    • Converted complex reports to work fluidly in Power BI. Built the universe or data model, creating all fact and dimension tables in Power BI to prepare a 360-degree view in dashboards.
    • Automated social media data gathering using various APIs and scraping methods through Python.
    • Developed and maintained various survey and social data transformation pipelines used to prepare data for Power BI reports, primarily using Pandas, and orchestrated them through Airflow.
    • Used SQL modeling tools, wrote advanced SQL, and drew on strong knowledge of data warehouses such as Teradata, Snowflake, BigQuery, and Redshift.
    Technologies: Microsoft Power BI, Azure, Databricks, SQL, EDL, Python, Spark, MariaDB, AWS, Tableau
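A Pandas transformation step of the kind described above can be sketched as a single function that an Airflow task would call. This is a minimal sketch, not the actual project code: the column names, the 1-5 score scale, and the cleaning rules are all hypothetical.

```python
# Hypothetical survey-cleaning step: drop incomplete responses and
# normalize a 1-5 satisfaction score to a 0-100 scale for Power BI.
import pandas as pd

def transform_survey(raw: pd.DataFrame) -> pd.DataFrame:
    """Normalize raw survey rows for downstream reporting."""
    # Keep only rows with a respondent and a score (illustrative rule).
    df = raw.dropna(subset=["respondent_id", "score"]).copy()
    # Clamp to the expected 1-5 range, then rescale to 0-100.
    df["score"] = (df["score"].clip(1, 5) - 1) / 4 * 100
    # Standardize free-text country values.
    df["country"] = df["country"].str.strip().str.title()
    return df

if __name__ == "__main__":
    raw = pd.DataFrame({
        "respondent_id": [1, 2, None],
        "score": [5, 3, 4],
        "country": [" canada", "INDIA ", "uk"],
    })
    print(transform_survey(raw).to_dict("records"))
```

In an Airflow deployment, each such function would typically be wrapped in its own task (e.g., a PythonOperator) so that failures and retries are visible per step.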
  • Senior Software Engineer

    2015 - 2020
    Fusion Software Solution
    • Re-engineered systems to comply with the GDPR for the European market. Developed a Teradata batch process to transform the staged data and load dimension and fact tables. Created Unix/SQL/PL/SQL scripts to offload data back to the Hive tables.
    • Designed an ETL flow in the Control-M scheduler to trigger batch processes and Informatica jobs. Set up dependencies to prevent data deadlocks and created alerts to notify stakeholders of errors and warnings.
    • Created infrastructure and later built templates to automate infrastructure deployment.
    • Built P&L metrics, user dashboards reporting the most and least profitable customers, and dashboards with YTD revenue and cost metrics by line of business (LOB). Reconciled dashboard revenue numbers against reported revenue figures.
    • Built, maintained, and tuned Tableau and Power BI dashboards for a broad variety of internal clients.
    • Constructed Jenkins (DevOps) pipelines to trigger and deploy code in various environments.
    Technologies: Spark, SAP Business Object Processing Framework (BOPF), Informatica ETL, Tableau, Unix, Teradata, PL/SQL
  • Senior Software Engineer

    2014 - 2015
    Accenture
    • Gathered and defined business requirements while managing risks to improve business processes, contributing to enterprise architecture development through business analysis and process mapping.
    • Managed ETL (Teradata, Informatica, Datastage), SQL and database performance tuning, troubleshooting, support, and capacity estimation to ensure the highest data quality standards.
    • Developed Informatica ETL mappings, Teradata BTEQ, FastExport, FastLoad, MultiLoad, TPT scripts, Oracle PL/SQL scripts, Unix shell scripts, and optimized SQL queries/ETL mappings to efficiently handle huge volumes of data and complex transformations.
    Technologies: Apache Hive, Hadoop, Business Objects, Datastage, Teradata, Big Data, Data Warehouse Design
  • Consultant

    2012 - 2014
    • Created dashboards in Power BI and Tableau to capture a 360-degree view of customer information for a leading European bank.
    • Designed and created data models and built batch processes to populate them.
    • Manipulated data using Power Query on top of database views to provide security and improve performance.
    Technologies: Azure, Apache Hive, Teradata, Microsoft Power BI, SQL, Spark
  • Software Engineer

    2010 - 2012
    Tata Consultancy Services
    • Built ETL processes in PostgreSQL to process huge volumes of data.
    • Created metadata tables to surface bottlenecks and built dashboards to highlight them.
    • Managed professional services and implemented general ledger reports in Power BI. Offloaded advanced calculations from Power BI to the database, which improved performance.
    Technologies: Spark, Big Data, Tableau, Python, Scala, Databricks
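The metadata-table idea above (logging each ETL step's runtime so bottlenecks can be spotted and dashboarded) can be sketched as follows. The original work targeted PostgreSQL; SQLite is used here only to keep the example self-contained, and all table, column, and step names are illustrative.

```python
# Sketch: log each ETL step's duration to a metadata table, then query
# the table to find the slowest step (the bottleneck to investigate).
import sqlite3
import time

def run_step(conn, name, fn):
    """Run one ETL step and record its runtime in etl_step_log."""
    start = time.perf_counter()
    fn(conn)
    conn.execute(
        "INSERT INTO etl_step_log (step_name, seconds) VALUES (?, ?)",
        (name, time.perf_counter() - start),
    )
    conn.commit()

def main():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE etl_step_log (step_name TEXT, seconds REAL)")
    conn.execute("CREATE TABLE staging (id INTEGER)")
    # Two illustrative steps wrapped in the timing helper.
    run_step(conn, "extract", lambda c: c.executemany(
        "INSERT INTO staging VALUES (?)", [(i,) for i in range(1000)]))
    run_step(conn, "transform", lambda c: c.execute(
        "UPDATE staging SET id = id * 2"))
    slowest = conn.execute(
        "SELECT step_name FROM etl_step_log ORDER BY seconds DESC LIMIT 1"
    ).fetchone()[0]
    return conn, slowest

if __name__ == "__main__":
    conn, slowest = main()
    print("bottleneck:", slowest)
```

A dashboard pointed at `etl_step_log` then shows step durations over time, which is how a slow step stands out.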


  • Hercules Funnel

    This project was designed to identify potential opportunities in the funnel and provide 360-degree reporting based on sizing and estimation, determining which opportunities are closest to a win.

    I managed a team of four, analyzed business requirements, created user stories, and finalized sprint requirements with the project owner. I then created technical specifications, data models, ETL jobs, Control-M jobs, and supported pre-deployment and post-deployment validations.


    This data warehousing (DW) project was developed to streamline the entire process for three countries (Saudi Arabia, Egypt, and Nigeria) and provide real-time tracking of financial transactions on a daily, weekly, and monthly basis. It is managed through an Informatica extract, load, and transform process with Teradata and Business Objects. GL, sub-ledger, and HFM (Hyperion Financial Management) were additional modules used.

    Created detailed design documents and developed Teradata SQL/BTEQ, Oracle PL/SQL, FastExport, and MultiLoad scripts to process EDW data into multidimensional data extracts. Developed workflows and batch processes to transform that data and load it into dimension and fact tables.
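    As a rough illustration of the dimension/fact load pattern described above, the sketch below resolves surrogate keys in a dimension table before loading the fact table. SQLite stands in for Teradata so the example runs anywhere, and the table and column names are invented, not from the project.

```python
# Hypothetical dimension/fact load: new account codes get surrogate keys
# in dim_account, then transactions are loaded into fact_txn by key.
import sqlite3

def load_dim_account(conn, account_codes):
    """Insert new accounts into the dimension; return code -> surrogate key."""
    for code in account_codes:
        conn.execute(
            "INSERT OR IGNORE INTO dim_account (account_code) VALUES (?)",
            (code,))
    return dict(conn.execute(
        "SELECT account_code, account_key FROM dim_account"))

def load_fact_txn(conn, rows):
    """Load transactions into the fact table, resolving surrogate keys."""
    keys = load_dim_account(conn, {r["account"] for r in rows})
    conn.executemany(
        "INSERT INTO fact_txn (account_key, amount) VALUES (?, ?)",
        [(keys[r["account"]], r["amount"]) for r in rows])
    conn.commit()

def main():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE dim_account ("
                 "account_key INTEGER PRIMARY KEY AUTOINCREMENT,"
                 "account_code TEXT UNIQUE)")
    conn.execute("CREATE TABLE fact_txn (account_key INTEGER, amount REAL)")
    load_fact_txn(conn, [
        {"account": "GL-100", "amount": 250.0},
        {"account": "GL-200", "amount": 75.5},
        {"account": "GL-100", "amount": -30.0},
    ])
    return conn

if __name__ == "__main__":
    conn = main()
    print(conn.execute("SELECT COUNT(*) FROM fact_txn").fetchone()[0])
```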

    Performed root-cause analysis and preemptive diagnosis to prevent issues related to financial data consolidation, and resolved data issues.


  • Languages

    SQL, Python 3, Scala, Snowflake
  • Frameworks

    Hadoop, Spark
  • Tools

    Informatica ETL, Teradata SQL Assistant, Tableau, Microsoft Power BI, IBM InfoSphere (DataStage)
  • Paradigms

    ETL, Database Development
  • Platforms

    Windows, Azure, Unix, Oracle, Databricks
  • Storage

    Teradata, Teradata Databases, Datastage, Apache Hive, MySQL, Data Pipelines, PostgreSQL, PL/SQL Developer, SQL Server Integration Services (SSIS), Databases, PL/SQL, MariaDB
  • Other

    Cloud Architecture, EDL, Big Data, Business Objects, Data Warehouse Design, Data Visualization, Reporting, Reports, Data Engineering, Data Analysis, Shell Scripting, Informatica, CSV File Processing, Tableau Server, SAP Business Object Processing Framework (BOPF), AWS


  • Microsoft Azure Architect Technologies Certified
  • Teradata 12 Certified Technical Specialist
  • Microsoft Certified Solution Developer
    JUNE 2012 - PRESENT
