
Ty Nijjar
Verified Expert in Engineering
Azure Data Engineer and Developer
Naperville, IL, United States
Toptal member since June 18, 2020
Ty is a seasoned IT professional with over 20 years of experience implementing data warehousing and business intelligence solutions around the globe as a data engineer and data architect. His expertise includes Azure Data Factory, Azure Databricks, Azure Synapse, ETL, and data warehousing.
Experience
- SQL - 20 years
- Agile - 10 years
- SQL Server Integration Services (SSIS) - 8 years
- Azure Data Lake - 5 years
- Netezza - 5 years
- Azure Data Factory (ADF) - 5 years
- Azure SQL - 5 years
- Unix - 2 years
Preferred Environment
Data Warehouse Design, Data Warehousing, ETL, Netezza, Azure, Azure Data Factory (ADF), Azure Synapse, Azure Databricks, Azure Data Lake, Azure SQL, Unix, Data Analysis
The most amazing...
...solution I've created was a healthcare process to take data from multiple EMR systems and load it into a standard-designed EDW used in hospital systems.
Work Experience
Azure Data Architect Engineer
Rabobank
- Designed a process implementing the medallion architecture data design pattern (bronze to silver to gold) for ingesting data, enabling real-time loads with Azure Databricks.
- Used both SQL and Python modules within Azure Databricks to extract data from API systems and load them into Delta and Delta Live Tables (DLT).
- Designed new dev, test, and prod environments, as DevOps had not been implemented in the existing environment, and documented how to implement the solution using Azure DevOps.
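The bronze-to-silver hop of a medallion pipeline like the one described above can be sketched in plain Python. This is only an illustration of the pattern: a real Databricks pipeline would use PySpark DataFrames and Delta tables, and all record shapes and field names here are hypothetical.

```python
# Minimal sketch of a medallion bronze -> silver hop.
# Plain dicts stand in for Delta tables; field names are illustrative.
import json
from datetime import datetime, timezone

def to_bronze(raw_api_payload: str) -> list[dict]:
    """Land raw API records untouched, adding only ingestion metadata."""
    records = json.loads(raw_api_payload)
    ingested_at = datetime.now(timezone.utc).isoformat()
    return [{**r, "_ingested_at": ingested_at} for r in records]

def to_silver(bronze_rows: list[dict]) -> list[dict]:
    """Clean and conform: drop malformed rows, normalize types and names."""
    silver = []
    for row in bronze_rows:
        if not row.get("patient_id"):  # reject rows missing the business key
            continue
        silver.append({
            "patient_id": str(row["patient_id"]).strip(),
            "visit_date": row.get("visitDate"),  # rename to warehouse style
            "amount": float(row.get("amount", 0) or 0),
        })
    return silver

payload = ('[{"patient_id": " 101 ", "visitDate": "2024-01-05", "amount": "12.5"},'
           ' {"patient_id": null, "visitDate": "2024-01-06"}]')
bronze = to_bronze(payload)
silver = to_silver(bronze)
# The malformed row is rejected; the surviving row is typed and trimmed.
```

The key design point the pattern captures: bronze preserves the raw payload for replay, while silver applies typing and rejection rules, so cleansing logic can change without re-pulling from the source APIs.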
Azure Data Engineer
United Healthcare
- Built pipelines using Azure Synapse Analytics to load data into multiple databases.
- Helped implement a medallion architecture data design pattern (bronze to silver to gold).
- Created a new validation process to automatically confirm each day that the load process completed correctly.
- Designed a new GitHub architecture to keep track of code changes between dev, test, and prod environments.
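A daily load-validation check like the one above often reduces to comparing per-table row counts between source and target. A minimal sketch, with hypothetical table names and counts standing in for what would, in practice, come from SQL queries against Synapse:

```python
# Sketch of an automated post-load validation: compare expected vs. loaded
# row counts per table and report any mismatches.
def validate_load(source_counts: dict[str, int],
                  target_counts: dict[str, int]) -> list[str]:
    """Return a list of human-readable failures; empty means the load passed."""
    failures = []
    for table, expected in source_counts.items():
        loaded = target_counts.get(table)
        if loaded is None:
            failures.append(f"{table}: missing from target")
        elif loaded != expected:
            failures.append(f"{table}: expected {expected}, loaded {loaded}")
    return failures

source = {"claims": 1200, "members": 450, "providers": 80}  # illustrative
target = {"claims": 1200, "members": 449, "providers": 80}
issues = validate_load(source, target)
# -> ["members: expected 450, loaded 449"]
```

In a scheduled pipeline, a non-empty failure list would typically fail the run and trigger an alert rather than silently passing bad data downstream.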
Azure Data Engineer
BP
- Created notebooks using PySpark SQL and DataFrames that take raw data from a Databricks database, transform it, and load it into a cleansed Databricks database.
- Developed Azure Data Factory (ADF) pipelines and DBT models to move data from Databricks tables into internal tables in Azure Synapse and the Snowflake database.
- Used ADF pipelines to control and schedule the ETL process flow.
- Contributed to the migration process, moving code from an old cloud architecture to a new standardized ETL architecture.
Azure Data Engineer
BCBS Kansas City
- Designed an ETL/ELT solution to move data from Azure Data Lake to an Azure SQL data warehouse.
- Created a dynamic process to load data from multiple files into a staging area in Azure SQL using operations tables and a few dynamic Azure Data Factory pipelines.
- Developed SQL views and stored procedures called from an Azure Data Factory pipeline to upsert records into the data model.
- Handled all code tasks developed by the offshore development team.
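The dynamic, metadata-driven load described above typically hinges on an operations table that maps incoming files to staging tables, with one parameterized pipeline looping over it. A rough sketch of the idea; the table, columns, and generated SQL are assumptions, and in ADF this loop would be a ForEach activity over a Lookup:

```python
# Sketch of a metadata-driven staging load: an "operations" table lists each
# source file and its staging target; one generic routine handles them all.
# All names are hypothetical stand-ins for the real operations table.
OPERATIONS = [
    {"file_pattern": "claims_*.csv",  "staging_table": "stg.Claims",  "enabled": True},
    {"file_pattern": "members_*.csv", "staging_table": "stg.Members", "enabled": True},
    {"file_pattern": "legacy_*.csv",  "staging_table": "stg.Legacy",  "enabled": False},
]

def build_load_statements(operations: list[dict]) -> list[str]:
    """Generate one load command per enabled entry in the operations table."""
    statements = []
    for op in operations:
        if not op["enabled"]:
            continue  # disabled feeds are skipped without code changes
        statements.append(
            f"BULK INSERT {op['staging_table']} FROM '{op['file_pattern']}'"
        )
    return statements

stmts = build_load_statements(OPERATIONS)
# Two enabled entries -> two load statements; the disabled feed is skipped.
```

The payoff of this design is that onboarding a new file requires only a new row in the operations table, not a new pipeline.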
Data Engineer
Ensono
- Built solutions using Azure Functions, Python, Azure Data Factory, Azure Synapse Analytics, and Power BI.
- Created source-to-target mappings and architecture documents for multiple projects.
- Created a cost-estimation model for a modern big data architecture.
- Created migration plans for moving VMs from on-premises to the cloud.
ETL Consultant
Tier 1
- Created source-to-target mappings to load data from multiple sources into a new SQL Server database.
- Created the operational framework used to track data load processes.
- Used SSIS as the ETL tool to load data into the data warehouse.
- Set up SQL Server Agent jobs to run the ETL process automatically.
Azure Data Architect
Barilla
- Created a new enterprise data warehouse solution using Azure SQL Server as the database.
- Set up the integration runtime environment to load data from on-premises sources into the Azure cloud environment.
- Used ADF to load data into Azure Blob Storage to be used by data scientists.
- Used Azure Data Factory to load data from an SAP BW source system into the new data warehouse.
Senior Data Architect Consultant
Perficient Inc
- Created a completely new ETL architecture to load data from multiple medical record systems (Epic, QuadraMed, and Envision) into the IBM Atomic Warehouse Model, called the Gateway.
- Ported the Gateway architecture over to the Azure cloud platform using Azure Data Factory and stored procedures.
- Led teams of five to ten resources in implementing the above solutions.
- Created ETL architecture documents for multiple projects.
- Developed hundreds of SSIS packages to load SQL Server data warehouses.
- Developed SSAS cubes as well as reports using SSRS.
Experience
HCL
Education
Bachelor of Science Degree in Computer Engineering
University of Illinois - Chicago, IL
Skills
Libraries/APIs
PySpark
Tools
IBM InfoSphere (DataStage), IBM Information Management System, Subversion (SVN), SSAS, Microsoft Power BI, GitHub
Languages
SQL, T-SQL (Transact-SQL), Perl, Python, Snowflake
Paradigms
ETL, ETL Implementation & Design, Agile, Azure DevOps
Platforms
Azure, Azure Synapse, Unix, Databricks
Storage
Microsoft SQL Server, Azure SQL, SQL Stored Procedures, Data Pipelines, Databases, SQL Server Integration Services (SSIS), Netezza, SQL Server 2016, Data Lakes, Azure SQL Databases, DataStage
Industry Expertise
Healthcare
Other
Data Warehousing, Data Warehouse Design, Azure Data Factory (ADF), Azure Data Lake, Data Engineering, ETL Development, ETL Testing, Unix Shell Scripting, Shell Scripting, Data Analysis, Data Modeling, Azure Analysis Services, SAP Business Warehouse (BW), Analysis, Service, User Stories, Cloud, Architecture, Azure Databricks, DataFrames