
Priyank Singh

Verified Expert in Engineering

Software Developer

Location
Toronto, Canada
Toptal Member Since
December 15, 2020

Priyank is a Python and database expert with deep experience in database design, big data pipelines, data modeling, and real-time data ingestion. He migrated a legacy client to Azure SQL, increasing revenue by $10 million per year, and streamlined the entire data warehouse (DW) process across three countries to enable real-time tracking of financial transactions. Priyank excels with Databricks, enterprise data lake (EDL) integration, migrations, and cloud projects, enabling clients to deliver complex data projects smoothly.

Portfolio

accessnowgroup.com
Microsoft Power BI, Azure, Databricks, SQL, EDL, Python, Spark, MariaDB...
Fusion Software Solution
Spark, SAP Business Object Processing Framework (BOPF), Informatica ETL...
Accenture
Apache Hive, Hadoop, Business Objects, Datastage, Teradata, Big Data...

Experience

Availability

Part-time

Preferred Environment

Big Data, Hadoop, SQL, Microsoft Power BI, Tableau, Databricks, ETL, Azure, Python, Data Warehouse Design

The most amazing...

...data pipelines I built migrated data from the mainframe to Oracle, automating manual processes and empowering analytics.

Work Experience

Senior Data Engineer

2019 - PRESENT
accessnowgroup.com
  • Built a Python ETL process to migrate data from the mainframe to Oracle, enabling the client to automate manual processes and eventually save $10 million per year; the resulting KPIs are reported in Power BI (a sketch of this pattern follows the technology list below).
  • Integrated customer information from the mainframe, SQL sources, blob containers, and flat files into a Hadoop cluster and Hive data lake using ETL jobs, then transformed the data and ingested it into Oracle.
  • Generated advanced Domo dashboards with filters (quick, context, and global), parameters, and calculated fields to track and improve customer units' KPIs by 12% within a month.
  • Deployed multiple analytics projects for all departments using Domo, including a solution that measures the engagement of different channels for media campaigns on Facebook, Twitter, and other platforms.
  • Created visualization reports with tools such as Domo, Tableau, Looker, and Power BI. Built data models around the financial, insurance, and transportation domains from disparate data sources.
  • Built and maintained Domo dashboards that give a 360-degree overview of the customer journey and surface metrics such as P&L and the general ledger.
  • Converted complex reports to work fluidly in Domo and Power BI. Built the universe (data model) in Power BI, creating all fact and dimension tables to support 360-degree dashboards.
  • Automated social media data gathering in Python using various APIs and scraping methods.
  • Developed and maintained survey and social data transformation pipelines that prepare data for Power BI reports, primarily using Pandas, and orchestrated them with Airflow (see the Airflow sketch after the technology list below).
  • Championed SQL modeling tools and wrote advanced SQL. Demonstrated strong knowledge of data warehouses such as Teradata, Snowflake, BigQuery, and Redshift.
Technologies: Microsoft Power BI, Azure, Databricks, SQL, EDL, Python, Spark, MariaDB, Tableau, Data Warehouse Design, ETL, Terraform, Amazon Web Services (AWS), CI/CD Pipelines, Apache Spark, Data Engineering, SAP, Microsoft Data Flows, Data Warehousing, Event-driven Architecture, Query Optimization, Data Wrangling, Azure Databricks, Salesforce, Business Intelligence (BI), Finance, Domo
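
A minimal sketch of the mainframe-to-Oracle ETL pattern from the first bullet above, assuming the mainframe extract lands as a fixed-width flat file and the python-oracledb driver is available; the file layout, table, and column names are all hypothetical.

```python
# Hypothetical mainframe-to-Oracle ETL step: parse a fixed-width
# extract, clean it with pandas, and bulk-insert it into Oracle.
import pandas as pd
import oracledb  # python-oracledb driver

# Assumed column layout of the fixed-width mainframe extract.
COLSPECS = [(0, 10), (10, 40), (40, 50), (50, 62)]
NAMES = ["account_id", "customer_name", "open_date", "balance"]

def extract(path: str) -> pd.DataFrame:
    df = pd.read_fwf(path, colspecs=COLSPECS, names=NAMES, dtype=str)
    # Basic cleanup: trim padding, parse dates and amounts.
    df["customer_name"] = df["customer_name"].str.strip()
    df["open_date"] = pd.to_datetime(df["open_date"], format="%Y-%m-%d")
    df["balance"] = pd.to_numeric(df["balance"])
    return df

def load(df: pd.DataFrame) -> None:
    # Hypothetical credentials and staging table name.
    with oracledb.connect(user="etl_user", password="...",
                          dsn="dbhost/ORCLPDB1") as conn:
        cur = conn.cursor()
        cur.executemany(
            "INSERT INTO stg_accounts "
            "(account_id, customer_name, open_date, balance) "
            "VALUES (:1, :2, :3, :4)",
            list(df.itertuples(index=False, name=None)),
        )
        conn.commit()

if __name__ == "__main__":
    load(extract("mainframe_extract.dat"))
```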
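
And a minimal sketch of the survey/social Airflow pipeline described above, assuming Airflow 2.x with the PythonOperator; the API endpoint, file paths, and schema are placeholders.

```python
# Hypothetical Airflow DAG: pull survey/social data from an API,
# reshape it with pandas, and publish a CSV that Power BI reads.
from datetime import datetime

import pandas as pd
import requests
from airflow import DAG
from airflow.operators.python import PythonOperator

RAW_PATH = "/tmp/survey_raw.json"      # placeholder landing path
CURATED_PATH = "/tmp/survey_curated.csv"

def fetch_responses() -> None:
    # Placeholder endpoint; a real pipeline would page through results.
    resp = requests.get("https://api.example.com/v1/responses", timeout=30)
    resp.raise_for_status()
    pd.DataFrame(resp.json()["results"]).to_json(RAW_PATH, orient="records")

def transform_responses() -> None:
    df = pd.read_json(RAW_PATH, orient="records")
    # Example reshaping: one row per (respondent, question) pair.
    tidy = df.melt(id_vars=["respondent_id"], var_name="question",
                   value_name="answer").dropna(subset=["answer"])
    tidy.to_csv(CURATED_PATH, index=False)

with DAG(
    dag_id="survey_to_power_bi",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    fetch = PythonOperator(task_id="fetch_responses",
                           python_callable=fetch_responses)
    transform = PythonOperator(task_id="transform_responses",
                               python_callable=transform_responses)
    fetch >> transform
```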

Senior Software Engineer

2015 - 2020
Fusion Software Solution
  • Re-engineered systems to comply with the GDPR for the European market. Developed a Teradata batch process to transform staged data and load dimension and fact tables (a sketch of this load pattern follows the technology list below). Created Unix, SQL, and PL/SQL scripts to offload data back to Hive tables.
  • Designed an ETL flow in the Control-M scheduler to trigger batch processes and Informatica jobs. Set up dependencies to prevent data deadlocks and created alerts to notify stakeholders of errors and warnings.
  • Created infrastructure and later built templates to automate infrastructure deployment.
  • Built P&L metrics, user dashboards reporting the most and least profitable customers, and dashboards with YTD revenue and cost metrics by line of business (LOB). Reconciled dashboard revenue numbers against reported revenue figures.
  • Built, maintained, and tuned Tableau and Power BI dashboards for a broad variety of internal clients.
  • Constructed Jenkins (DevOps) pipelines to trigger and deploy code in various environments.
Technologies: Spark, SAP Business Object Processing Framework (BOPF), Informatica ETL, Tableau, Unix, Teradata, PL/SQL, Data Warehouse Design, Azure, ETL, Databricks, Microsoft Power BI, SQL, Python, Terraform, Amazon Web Services (AWS), CI/CD Pipelines, Apache Spark, Data Engineering, SAP, Microsoft Data Flows, Data Warehousing, Event-driven Architecture, Query Optimization, Data Wrangling, Azure Databricks, Salesforce, Business Intelligence (BI), Finance
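
A minimal sketch of the staged-data-to-dimension/fact load pattern from the first bullet above, driven from Python with the teradatasql driver rather than BTEQ; all database, table, and column names are hypothetical.

```python
# Hypothetical Teradata batch step: upsert a customer dimension from
# a staging table, then append the day's transactions to a fact table.
import teradatasql

UPSERT_DIM = """
MERGE INTO dw.dim_customer AS tgt
USING stage.customers AS src
  ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN UPDATE SET
  customer_name = src.customer_name,
  country_code  = src.country_code
WHEN NOT MATCHED THEN INSERT
  (customer_id, customer_name, country_code)
  VALUES (src.customer_id, src.customer_name, src.country_code)
"""

LOAD_FACTS = """
INSERT INTO dw.fact_transactions
  (txn_id, customer_id, txn_date, amount)
SELECT txn_id, customer_id, txn_date, amount
FROM stage.transactions
WHERE txn_date = CURRENT_DATE
"""

# Placeholder host and credentials.
with teradatasql.connect(host="tdhost", user="etl_user",
                         password="...") as conn:
    with conn.cursor() as cur:
        cur.execute(UPSERT_DIM)   # dimension first, so facts can join
        cur.execute(LOAD_FACTS)
```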

Senior Software Engineer

2014 - 2015
Accenture
  • Gathered and defined business requirements while managing risk, improving business processes and contributing to enterprise architecture development through business analysis and process mapping.
  • Managed ETL (Teradata, Informatica, Datastage), SQL and database performance tuning, troubleshooting, support, and capacity estimation to ensure the highest data quality standards.
  • Developed Informatica ETL mappings, Teradata BTEQ, FastExport, FastLoad, MultiLoad, TPT scripts, Oracle PL/SQL scripts, Unix shell scripts, and optimized SQL queries/ETL mappings to efficiently handle huge volumes of data and complex transformations.
Technologies: Apache Hive, Hadoop, Business Objects, Datastage, Teradata, Big Data, Data Warehouse Design, Azure, ETL, Databricks, Microsoft Power BI, SQL, Python, Terraform, Amazon Web Services (AWS), CI/CD Pipelines, Apache Spark, Data Engineering, SAP, Microsoft Data Flows, Data Warehousing, Event-driven Architecture, Query Optimization, Data Wrangling, Azure Databricks, Salesforce, Business Intelligence (BI), Finance

Consultant

2012 - 2014
Capgemini
  • Created dashboards in Power BI and Tableau to capture a 360-degree view of customer information for a leading bank in Europe.
  • Designed and created data models and built batch processes to populate them.
  • Manipulated data using Power Query on top of database views to provide security and improve performance.
Technologies: Azure, Apache Hive, Teradata, Microsoft Power BI, SQL, Spark, Data Warehouse Design, ETL, Databricks, Python, Terraform, Amazon Web Services (AWS), CI/CD Pipelines, Apache Spark, Data Engineering, SAP, Microsoft Data Flows, Data Warehousing, Event-driven Architecture, Query Optimization, Data Wrangling, Azure Databricks, Salesforce, Business Intelligence (BI), Finance

Software Engineer

2010 - 2012
Tata Consultancy Services
  • Built ETL processes in PostgreSQL to process huge volumes of data (a sketch follows the technology list below).
  • Created metadata tables that make bottlenecks easy to identify and built dashboards to highlight them.
  • Managed professional services and implemented general ledger reports in Power BI. Improved performance by offloading advanced calculations from Power BI to the database.
Technologies: Spark, Big Data, Tableau, Python, Scala, Databricks, Data Warehouse Design, Azure, ETL, Microsoft Power BI, SQL, Terraform, Amazon Web Services (AWS), CI/CD Pipelines, Apache Spark, Data Engineering, SAP, Microsoft Data Flows, Data Warehousing, Event-driven Architecture, Query Optimization, Data Wrangling, Azure Databricks, Salesforce, Business Intelligence (BI), Finance
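
A minimal sketch of the high-volume PostgreSQL load with metadata-driven bottleneck tracking described in the bullets above, assuming the psycopg2 driver; the table and file names are hypothetical.

```python
# Hypothetical PostgreSQL ETL step: bulk-load a CSV with COPY and
# record row counts and durations in a metadata table so slow steps
# (bottlenecks) are easy to spot on a dashboard.
import time
import psycopg2

def load_with_metrics(csv_path: str) -> None:
    conn = psycopg2.connect("dbname=dw user=etl_user password=...")
    try:
        # "with conn" wraps the COPY and the log insert in one transaction.
        with conn, conn.cursor() as cur:
            start = time.monotonic()
            with open(csv_path) as f:
                cur.copy_expert(
                    "COPY stg_events FROM STDIN WITH (FORMAT csv, HEADER true)",
                    f,
                )
            rows = cur.rowcount  # rows reported by the COPY command
            elapsed = time.monotonic() - start
            # Metadata table consumed by the bottleneck dashboards.
            cur.execute(
                "INSERT INTO etl_run_log (step_name, row_count, seconds) "
                "VALUES (%s, %s, %s)",
                ("load_stg_events", rows, elapsed),
            )
    finally:
        conn.close()

if __name__ == "__main__":
    load_with_metrics("events.csv")
```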

Hercules Funnel

This project identified potential opportunities in the sales funnel and provided 360-degree reporting on sizing and estimation to determine which opportunities were closest to a win.

I managed a team of four, analyzed business requirements, created user stories, and finalized sprint requirements with the project owner. I then created technical specifications, data models, ETL jobs, Control-M jobs, and supported pre-deployment and post-deployment validations.

SIOP MEA WAVE 2

This data warehousing (DW) project streamlined the entire process for three countries (Saudi Arabia, Egypt, and Nigeria) and kept real-time tracking of financial transactions on a daily, weekly, and monthly basis. The process was managed through an Informatica extract, transform, and load (ETL) pipeline with Teradata and Business Objects; GL, subledger, and HFM (Hyperion Financial Management) were additional modules.

Created detailed-level design documents and developed Teradata SQL/BTEQ, Oracle PL/SQL, FastExport, and MultiLoad (MLoad) scripts to process EDW data into multidimensional data extracts (a sketch follows below). Developed workflows and batch processes to transform that data and load it into dimension and fact tables.

Performed root-cause analysis and preemptive diagnosis to prevent issues related to financial data consolidation, and resolved data issues.
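
A minimal sketch of the EDW-to-extract step described above, using pandas over the teradatasql driver in place of FastExport; the query, schema, and output path are placeholders.

```python
# Hypothetical EDW extract: pull a multidimensional slice from
# Teradata and write it out as a flat extract for downstream loads.
import pandas as pd
import teradatasql

EXTRACT_SQL = """
SELECT gl.account_code, gl.country_code, gl.period_id,
       SUM(gl.amount) AS amount
FROM edw.gl_balances AS gl
GROUP BY gl.account_code, gl.country_code, gl.period_id
"""

# Placeholder host and credentials.
with teradatasql.connect(host="tdhost", user="etl_user",
                         password="...") as conn:
    frame = pd.read_sql(EXTRACT_SQL, conn)

frame.to_csv("gl_extract.csv", index=False)
```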

Languages

SQL, Python, Scala, Snowflake, Python 3

Frameworks

Spark, Hadoop, Apache Spark, Windows PowerShell

Libraries/APIs

PySpark

Tools

Informatica ETL, Teradata SQL Assistant, Tableau, Microsoft Power BI, IBM InfoSphere (DataStage), Terraform, Domo

Paradigms

ETL, Database Development, Event-driven Architecture, Business Intelligence (BI), DevOps

Platforms

Windows, Azure, Databricks, Unix, Oracle, Amazon Web Services (AWS), Docker, Apache Kafka, Google Cloud Platform (GCP), Salesforce

Storage

Teradata, Teradata Databases, Datastage, Apache Hive, MySQL, Data Pipelines, PostgreSQL, PL/SQL Developer, SQL Server Integration Services (SSIS), Databases, Azure SQL, Microsoft SQL Server, Oracle Cloud, PL/SQL, MariaDB

Other

Cloud Architecture, EDL, SAP Business Object Processing Framework (BOPF), Big Data, Business Objects, Data Warehouse Design, Data Visualization, Reporting, Reports, Data Engineering, Data Analysis, Shell Scripting, Informatica, CSV File Processing, Tableau Server, Azure Databricks, Architecture, Big Data Architecture, Data Architecture, Roadmaps, Data Migration, Database Optimization, SAP, Microsoft Data Flows, Data Warehousing, Query Optimization, Data Wrangling, Finance, CI/CD Pipelines

OCTOBER 2020 - PRESENT

Microsoft Azure Architect Technologies Certified

Microsoft

SEPTEMBER 2012 - PRESENT

Teradata 12 Certified Technical Specialist

Teradata

JUNE 2012 - PRESENT

Microsoft Certified Solution Developer

IBM
