
Ty Nijjar

Verified Expert in Engineering

Azure Data Engineer and Developer

Naperville, IL, United States

Toptal member since June 18, 2020

Bio

Ty is a seasoned IT professional with over 20 years of experience implementing data warehousing and business intelligence solutions around the globe as a data engineer and data architect. His expertise includes Azure Data Factory, Azure Databricks, Azure Synapse, ETL, and data warehousing.

Portfolio

Rabobank
Azure, Databricks, Azure Data Lake, Azure Data Factory (ADF), DataFrames...
United Healthcare
Azure Synapse, Azure Data Lake, GitHub, Python, Cloud, Data Engineering...
BP
Azure Databricks, Azure Synapse, Azure Data Lake, PySpark, Python, Cloud...

Experience

  • SQL - 20 years
  • Agile - 10 years
  • SQL Server Integration Services (SSIS) - 8 years
  • Azure Data Lake - 5 years
  • Netezza - 5 years
  • Azure Data Factory (ADF) - 5 years
  • Azure SQL - 5 years
  • Unix - 2 years

Availability

Part-time

Preferred Environment

Data Warehouse Design, Data Warehousing, ETL, Netezza, Azure, Azure Data Factory (ADF), Azure Synapse, Azure Databricks, Azure Data Lake, Azure SQL, Unix, Data Analysis

The most amazing...

...solution I've created was a healthcare process that takes data from multiple EMR systems and loads it into a standardized enterprise data warehouse (EDW) design used across hospital systems.

Work Experience

Azure Data Architect Engineer

2024 - 2024
Rabobank
  • Designed a process implementing a medallion architecture data design pattern (bronze to silver to gold) for ingestion so that real-time data can be loaded using Azure Databricks.
  • Used both SQL and Python modules within Azure Databricks to extract data from API systems and load it into Delta tables and Delta Live Tables (DLT), as sketched below.
  • Designed new dev, test, and prod environments, since DevOps had not been implemented in the existing setup, and documented how to implement the solution using Azure DevOps.
Technologies: Azure, Databricks, Azure Data Lake, Azure Data Factory (ADF), DataFrames, Azure SQL Databases
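
As a rough illustration of the bronze-to-silver step described above, a Delta Live Tables pipeline can land an API payload in a bronze table and refine it into silver. This is a minimal sketch; the endpoint, table, and column names are hypothetical, not details from the engagement.

```python
# Hypothetical medallion bronze -> silver flow in Delta Live Tables.
# `spark` is the ambient Databricks session; all names are illustrative.
import dlt
import requests
from pyspark.sql import functions as F

@dlt.table(comment="Bronze: raw JSON payloads landed from the source API")
def bronze_rates():
    payload = requests.get("https://api.example.com/rates").json()  # placeholder endpoint
    return spark.createDataFrame(payload["results"])

@dlt.table(comment="Silver: typed, deduplicated records")
def silver_rates():
    return (
        dlt.read("bronze_rates")
        .withColumn("rate", F.col("rate").cast("double"))
        .withColumn("as_of", F.to_date("as_of"))
        .dropDuplicates(["rate_id", "as_of"])
    )
```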

Azure Data Engineer

2023 - 2024
United Healthcare
  • Built pipelines using Azure Synapse Analytics to load data into multiple databases.
  • Helped implement a medallion architecture data design pattern (bronze to silver to gold).
  • Created a validation process that automatically confirms each daily load ran correctly (see the sketch below).
  • Designed a new GitHub branching structure to track code changes across dev, test, and prod environments.
Technologies: Azure Synapse, Azure Data Lake, GitHub, Python, Cloud, Data Engineering, Data Pipelines, Azure, Architecture, Databases, ETL, Snowflake
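
A daily load-validation check like the one described above might look roughly like this: compare actual row counts against the expected counts recorded in a control table and flag mismatches. The connection string and the etl.load_control/etl.load_audit tables are assumptions for illustration.

```python
# Hedged sketch of a daily load validation against a Synapse SQL pool.
# The control and audit tables are hypothetical names.
import pyodbc

CONN = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myworkspace.sql.azuresynapse.net;"  # placeholder workspace
    "DATABASE=edw;UID=loader;PWD=..."
)

CHECK_SQL = """
SELECT c.table_name, c.expected_rows, t.actual_rows
FROM etl.load_control AS c
JOIN (SELECT table_name, COUNT_BIG(*) AS actual_rows
      FROM etl.load_audit
      WHERE load_date = CAST(GETDATE() AS date)
      GROUP BY table_name) AS t
  ON t.table_name = c.table_name
WHERE t.actual_rows <> c.expected_rows;
"""

def validate_daily_load() -> bool:
    """Return True when every table loaded the expected number of rows."""
    with pyodbc.connect(CONN) as cn:
        mismatches = cn.cursor().execute(CHECK_SQL).fetchall()
    for table, expected, actual in mismatches:
        print(f"FAILED {table}: expected {expected} rows, got {actual}")
    return not mismatches
```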

Azure Data Engineer

2021 - 2023
BP
  • Created notebooks using PySpark SQL and DataFrames that read raw data from a Databricks database, transform it, and write it to a cleansed Databricks database (sketched below).
  • Developed Azure Data Factory (ADF) pipelines and dbt models to move data from Databricks tables into internal tables in Azure Synapse and a Snowflake database.
  • Used ADF pipelines to orchestrate and schedule the ETL process flow.
  • Contributed to the migration effort, moving code from a legacy cloud architecture to a new standardized ETL architecture.
Technologies: Azure Databricks, Azure Synapse, Azure Data Lake, PySpark, Python, Cloud, Data Engineering, Data Pipelines, Azure, Architecture, Databases, ETL
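
The raw-to-clean notebook pattern from the first bullet can be sketched as a single PySpark cell; the `raw`/`clean` databases and column names below are hypothetical stand-ins.

```python
# Illustrative raw -> clean transform in a Databricks notebook,
# where `spark` is the ambient session. All table names are placeholders.
from pyspark.sql import functions as F

cleaned = (
    spark.table("raw.claims")                                    # raw source table
    .where(F.col("claim_id").isNotNull())                        # drop malformed rows
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .withColumn("service_date", F.to_date("service_date", "yyyy-MM-dd"))
    .dropDuplicates(["claim_id"])
)

(cleaned.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("clean.claims"))                                # cleansed target table
```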

Azure Data Engineer

2020 - 2021
BCBS Kansas City
  • Designed an ETL/ELT solution to move data from Azure Data Lake into an Azure SQL data warehouse.
  • Created a dynamic, metadata-driven process to load data from multiple files into an Azure SQL staging area using operations tables and a few parameterized Azure Data Factory pipelines (see the sketch below).
  • Developed SQL views and stored procedures that are called from an Azure Data Factory pipeline to upsert records into the data model.
  • Managed all code tasks developed by the offshore development team.
Technologies: Azure Data Lake, Azure SQL, Azure Data Factory (ADF), Cloud, Data Engineering, Data Pipelines, Azure, Architecture, Databases, ETL
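
The metadata-driven pattern in the second and third bullets, a lookup over an operations table feeding a parameterized copy and then an upsert procedure, can be approximated in plain Python. Every object name here is hypothetical, and 'BlobLanding' assumes an external data source already pointing at the landing container.

```python
# Sketch of the ADF metadata-driven load: read the ops table (Lookup),
# bulk-load each file into staging (Copy), then call the upsert proc.
import pyodbc

CONN_STR = "DRIVER={ODBC Driver 18 for SQL Server};SERVER=...;DATABASE=dw;..."  # placeholder

with pyodbc.connect(CONN_STR) as cn:
    cur = cn.cursor()
    # 1. Control metadata: one row per file to load (ADF's Lookup activity).
    ops = cur.execute(
        "SELECT source_file, staging_table, upsert_proc "
        "FROM ops.file_loads WHERE enabled = 1"
    ).fetchall()
    for source_file, staging_table, upsert_proc in ops:
        # 2. Land the file in staging (the parameterized Copy activity).
        cur.execute(
            f"BULK INSERT {staging_table} FROM '{source_file}' "
            "WITH (DATA_SOURCE = 'BlobLanding', FORMAT = 'CSV', FIRSTROW = 2)"
        )
        # 3. Upsert staged rows into the model (the view + stored procedure step).
        cur.execute(f"EXEC {upsert_proc}")
        cn.commit()
```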

Data Engineer

2019 - 2020
Ensono
  • Built solutions using Azure Functions, Python, Azure Data Factory, Azure Synapse Analytics, and Power BI (a small Azure Functions sketch follows below).
  • Created source-to-target mappings and architecture documents for multiple projects.
  • Created a cost-estimation model for a modern big data architecture.
  • Created migration plans for moving VMs from on-premises to the cloud.
Technologies: Azure, Azure Data Factory (ADF), T-SQL (Transact-SQL), Azure Data Lake, SQL Stored Procedures, Azure Databricks, ETL Development, ETL Implementation & Design, ETL Testing, Data Engineering, Data Pipelines, Architecture, Databases
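
One way these pieces fit together, assuming the Azure Functions Python v2 programming model and the Data Factory management SDK, is a timer-triggered function that starts an ADF pipeline run; the resource names below are placeholders.

```python
# Hypothetical timer-triggered Azure Function (Python v2 model) that
# starts an ADF pipeline run each night at 02:00. Names are illustrative.
import logging
import azure.functions as func
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

app = func.FunctionApp()

@app.timer_trigger(schedule="0 0 2 * * *", arg_name="timer")
def start_nightly_load(timer: func.TimerRequest) -> None:
    client = DataFactoryManagementClient(
        DefaultAzureCredential(), subscription_id="<subscription-id>"
    )
    run = client.pipelines.create_run(
        resource_group_name="rg-data",    # placeholder resource group
        factory_name="adf-enterprise",    # placeholder data factory
        pipeline_name="pl_nightly_load",  # placeholder pipeline
    )
    logging.info("Started pipeline run %s", run.run_id)
```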

ETL Consultant

2019 - 2019
Tier 1
  • Created source-to-target mappings to load data from multiple sources into a new SQL Server database.
  • Created the operational framework used to track data load processes.
  • Used SSIS as the ETL tool to load data into the data warehouse.
  • Set up SQL Server Agent jobs to run the ETL process automatically (see the sketch below).
Technologies: SQL, Azure Analysis Services, Azure DevOps, Azure Data Factory (ADF), ETL Development, ETL Implementation & Design, ETL Testing, Data Engineering, Data Pipelines, Databases, ETL
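
Scheduling via SQL Server Agent, as in the last bullet, boils down to a handful of msdb stored procedure calls; here they are wrapped in Python, with job, package, and schedule names that are purely illustrative.

```python
# Sketch: register a SQL Server Agent job that runs the nightly SSIS load.
# The package path and all names are placeholders for the real deployment.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=.;DATABASE=msdb;Trusted_Connection=yes;"  # placeholder server
)

AGENT_SETUP = """
EXEC msdb.dbo.sp_add_job @job_name = N'Nightly DW Load';
EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'Nightly DW Load',
     @step_name = N'Run SSIS package',
     @subsystem = N'SSIS',
     @command   = N'/ISSERVER "\\SSISDB\\ETL\\DW\\LoadDW.dtsx" /SERVER "."';
EXEC msdb.dbo.sp_add_schedule
     @schedule_name = N'Daily 1 AM',
     @freq_type = 4, @freq_interval = 1,   -- every day
     @active_start_time = 10000;           -- 01:00:00
EXEC msdb.dbo.sp_attach_schedule
     @job_name = N'Nightly DW Load', @schedule_name = N'Daily 1 AM';
EXEC msdb.dbo.sp_add_jobserver @job_name = N'Nightly DW Load';
"""

with pyodbc.connect(CONN_STR, autocommit=True) as cn:  # Agent procs dislike transactions
    cn.cursor().execute(AGENT_SETUP)
```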

Azure Data Architect

2018 - 2019
Barilla
  • Created a new enterprise data warehouse solution using Azure SQL as the database.
  • Set up an integration runtime environment to load data from on-premises sources into the Azure cloud environment.
  • Used ADF to land data in Azure Blob Storage for use by data scientists (sketched below).
  • Used Azure Data Factory to load data from an SAP BW source system into the new data warehouse.
Technologies: SAP Business Warehouse (BW), Azure Analysis Services, Azure SQL, Azure Data Factory (ADF), SQL Stored Procedures, ETL Development, ETL Implementation & Design, ETL Testing, Data Pipelines, Architecture, Databases, ETL
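
For the Blob Storage hand-off in the third bullet, the ADF copy activity is the production path; the equivalent with the Python storage SDK, shown only for orientation, is a few lines. The container and file paths are made up.

```python
# Illustrative upload of a warehouse extract to Blob Storage for
# data-science consumption; container and paths are placeholders.
from azure.storage.blob import BlobServiceClient

STORAGE_CONN_STR = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..."  # placeholder

service = BlobServiceClient.from_connection_string(STORAGE_CONN_STR)
container = service.get_container_client("ds-extracts")

with open("daily_sales.parquet", "rb") as fh:
    container.upload_blob(name="2019/06/daily_sales.parquet", data=fh, overwrite=True)
```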

Senior Data Architect Consultant

2008 - 2018
Perficient Inc
  • Created a completely new ETL architecture to load data from multiple medical record systems (Epic, QuadraMed, and Envision) into the IBM Atomic Warehouse Model, known as the Gateway.
  • Ported the Gateway architecture to the Azure cloud platform using Azure Data Factory and stored procedures.
  • Led teams of five to ten developers in implementing the above solutions.
  • Created ETL architecture documents for multiple projects.
  • Developed hundreds of SSIS packages to load data warehouses in SQL Server.
  • Developed SSAS cubes as well as reports using SSRS.
Technologies: User Stories, Healthcare, SQL, Subversion (SVN), Shell Scripting, Service, Analysis, Microsoft SQL Server, SQL Server Integration Services (SSIS), Netezza, Datastage, SQL Stored Procedures, Azure Databricks, Azure Synapse, ETL Development, ETL Implementation & Design, ETL Testing, Architecture, Data Modeling, Databases, ETL

Experience

HCL

Azure Data Factory, Azure SQL, Azure Analysis Services

Education

1993 - 1998

Bachelor of Science Degree in Computer Engineering

University of Illinois - Chicago, IL

Skills

Libraries/APIs

PySpark

Tools

IBM InfoSphere (DataStage), IBM Information Management System, Subversion (SVN), SSAS, Microsoft Power BI, GitHub

Languages

SQL, T-SQL (Transact-SQL), Perl, Python, Snowflake

Paradigms

ETL, ETL Implementation & Design, Agile, Azure DevOps

Platforms

Azure, Azure Synapse, Unix, Databricks

Storage

Microsoft SQL Server, Azure SQL, SQL Stored Procedures, Data Pipelines, Databases, SQL Server Integration Services (SSIS), Netezza, SQL Server 2016, Data Lakes, Azure SQL Databases, Datastage

Industry Expertise

Healthcare

Other

Data Warehousing, Data Warehouse Design, Azure Data Factory (ADF), Azure Data Lake, Data Engineering, ETL Development, ETL Testing, Unix Shell Scripting, Shell Scripting, Data Analysis, Data Modeling, Azure Analysis Services, SAP Business Warehouse (BW), Analysis, Service, User Stories, Cloud, Architecture, Azure Databricks, DataFrames
