
Ted Tasker

Verified Expert in Engineering

Data Engineer and Software Developer

Location
Columbus, OH, United States
Toptal Member Since
November 2, 2022

Ted has over 25 years of experience with Microsoft SQL Server and data products, with a focus on data warehousing. He has worked primarily in Azure for the last six years, and his current focus is the Synapse Analytics workspace. Often in the role of lead architect, Ted is well versed in the supporting Azure technologies needed to deliver a modern data platform in a secure, performant, and cost-efficient way. He is a certified Azure Data Engineer and Azure Database Administrator.

Portfolio

Insight
Azure Synapse, Azure SQL, Azure Data Lake, Apache Spark, Azure Data Factory...
York Risk
SQL, Google Cloud, Informatica, IBM InfoSphere, Axon Framework...
PriorAuthNow (rebranded Rhyme)
Azure SQL, Microsoft Power BI, SQL Server Integration Services (SSIS)...

Experience

Availability

Part-time

Preferred Environment

Azure Synapse, Azure SQL, Azure Data Factory, Azure Data Lake, Microsoft SQL Server, Azure SQL Data Warehouse, Dedicated SQL Pool (formerly SQL DW)

The most amazing...

...code I've written is a utility to tune an MPP database, which led to an approved patent for systems and methods to distribute data and reduce computing load.

Work Experience

Senior Data Architect | Columbus Data and AI Practice Manager

2019 - PRESENT
Insight
  • Developed a utility to monitor, test, recommend, and apply changes to optimize performance and lower compute costs on a Synapse dedicated pool. Insight filed for a patent, and I was listed as the sole inventor.
  • Acted as architect and development lead for the migration of a complex on-premises, UI-driven ETL/DW system to an Azure-hosted PaaS modern data platform built on the Synapse workspace. Reduced costs by matching the right compute to each workload.
  • Designed and loaded a corporate data lake and relational database as the single source of truth, with a metadata-driven subscription service. Business units can subscribe to the data they need and have it auto-published into their own Synapse workspace.
Technologies: Azure Synapse, Azure SQL, Azure Data Lake, Apache Spark, Azure Data Factory, Dedicated SQL Pool (formerly SQL DW), Azure SQL Data Warehouse, Azure Key Vault, Azure Analysis Services, Database Design, Architecture, Microsoft SQL Server, SQL, Query Optimization, Data Warehouse Design, Data Engineering, ETL, Data Lakes, Data Architecture, Big Data, Snowflake, Azure, Serverless, Synapse

Senior Data Architect

2018 - 2019
York Risk
  • Acted as the technical lead for a team formed to implement a suite of Informatica products as a data platform for all of York. The team oversaw strategy and initial implementation for MDM, EDW, data ingestion, and data governance.
  • Built automated SQL Server setup processes covering encryption, auditing, scheduled backups, and security reporting, which allowed me to act as the primary DBA for 12 SQL Server instances with minimal oversight time.
  • Led the MDM implementation, handling numerous entities such as insurers, carriers, customers, and adjusters, sourced from multiple claim systems plus HR, accounting, and CRM systems. Created rules and extended the base Informatica MDM model.
Technologies: SQL, Google Cloud, Informatica, IBM InfoSphere, Axon Framework, Informatica Data Quality, Informatica PowerCenter, Microsoft SQL Server, Data Engineering, ETL, Data Lakes, Data Architecture, Data Warehousing

SVP Data Analytics

2016 - 2018
PriorAuthNow (rebranded Rhyme)
  • Created a database system to integrate hospital EMR JSON, leveraging customer-specific mapping and translation metadata to programmatically generate stored procedures that consumed 20,000 messages per hour on an inexpensive Azure SQL database.
  • Built a rules engine that used transactional and third-party data to determine whether prior authorization was required. Nightly centralized processing pushed rules to the OLTP SQL system, with real-time rule adjustments based on transaction details.
  • Developed a comprehensive set of dashboards, reports, and self-service analytics, covering all aspects of ETL, data model design, and reporting. Azure SQL databases and App Insights, via a data lake, were the primary sources for the DW.
Technologies: Azure SQL, Microsoft Power BI, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), Azure Analysis Services, Azure SQL Data Warehouse, Dedicated SQL Pool (formerly SQL DW), Microsoft SQL Server, SQL, Query Optimization, Data Engineering, ETL, Data Lakes, Data Architecture, Data Warehousing, Data Modeling, Azure

Chief Technology Officer

2012 - 2016
BI Voyage Consulting
  • Acted as CTO of a small Microsoft Gold partner specializing in DW and BI. My duties included technical oversight of all projects, responsibility for technical staffing, and, very often, acting as lead architect on services engagements.
  • Served as an APS/PDW industry trainer, contracted multiple times by Microsoft and HP to deliver APS training in Canada, the US, and Central and South America. The Synapse dedicated pool is based on APS/PDW, so this experience remains very relevant.
  • Designed and developed an Inmon-approach DW with the centralized repository residing on Hadoop, data marts hosted on PDW, and BI based on SharePoint and SQL 2012. The client was the Digital Crimes Unit, an internal unit of Microsoft.
Technologies: Microsoft Parallel Data Warehouse (PDW), Microsoft Analytics Platform System (APS), SQL Server Reporting Services (SSRS), SQL, Microsoft SQL Server, Query Optimization, Data Warehouse Design, Data Engineering, ETL, Data Architecture, Big Data, Data Warehousing

Technology Solution Professional

2008 - 2011
Microsoft
  • Ran a proof of concept that led to the first sale of Microsoft PDW worldwide. I was responsible for all aspects: migrating the schema, loading and testing data, concurrency, and performance tuning. The POC was performed on a 68TB data mart.
  • Created a T-SQL tool that automated the migration of an SMP SQL Server database to PDW. The tool generated an optimized PDW schema, BCP'd out the data, and produced PDW load scripts. Microsoft raised its migration POC limit from ten tables to 100.
  • Assisted a customer with a FastTrack DW (SQL/HP) appliance that replaced an existing 5TB DB2 data warehouse. Focused on performance tuning and on partitioning and partition-switching strategies to optimize loads.
Technologies: Microsoft Parallel Data Warehouse (PDW), SQL Server Reporting Services (SSRS), SharePoint, SQL Server Integration Services (SSIS), Microsoft SQL Server, SQL, Query Optimization, Data Warehouse Design, Data Engineering, ETL, Big Data, Data Warehousing

FVP Business Intelligence

2007 - 2008
Countrywide Home Mortgage
  • Architected and developed a solution for HR strategic and operational scorecards. The solution leveraged Microsoft PerformancePoint, SharePoint, Analysis Services, and SSIS.
  • Designed a generic KPI mart approach that provided a framework for managing and tracking KPIs, which led to simple maintainability and detailed reporting.
  • Led the approval process to get Microsoft PerformancePoint and SharePoint approved by the CTO and architecture review board as corporate solutions. This included C-level presentations, a POC, and significant research and documentation.
Technologies: SQL Server BI, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), SharePoint, Microsoft SQL Server, SQL, Query Optimization, Data Warehouse Design, Data Engineering, ETL, Data Architecture, Data Warehousing, Data Modeling

FVP Data Warehouse

2000 - 2007
IndyMac Bank
  • Established and managed all aspects of database architecture for the first two releases of a nine-release, $20-million program to implement a new loan origination system, covering data modeling and mapping, SQL development, BizTalk, design processes, and procedures.
  • Led a department of 14 developers and analysts supporting internal business units with daily operational reporting and strategic decision-making analytics. Designed and deployed a digital dashboard available to 2,500 employees.
  • Architected and implemented the company data bus (BizTalk, SSIS, and SQL databases) to allow the exchange and consolidation of key OLTP data across internal and external systems, eliminating complexity and improving the consistency of corporate data operations.
Technologies: SQL Server BI, SharePoint, SQL Server Integration Services (SSIS), SSRS Reports, IBM Cognos, Microsoft SQL Server, SQL, Query Optimization, Data Warehouse Design, Data Engineering, ETL, Data Architecture, Data Warehousing, Data Modeling

Director of Business Systems Applications

1997 - 2000
Walt Disney Internet Group
  • Served as the chief architect and project manager of a Kimball approach data warehouse with data marts such as online marketing, eCommerce, site traffic, and ad serving. The system went live in production the same day SQL 7 was released.
  • Acted as the department lead with direct reports including the ETL, BusinessObjects, data modeling, and ad management team leads. Led the maintenance and expansion of the DW and business intelligence platforms.
  • Created a metadata reporting solution: developers created a stored procedure and added basic metadata, and the system rendered reports through a generic ASP page. This met the business's immediate needs while a mature DW was developed.
Technologies: SQL Server BI, Microsoft Data Transformation Services (now SSIS), SAP BusinessObjects (BO), Microsoft SQL Server, SQL, Query Optimization, Data Warehouse Design, Data Engineering, ETL, Data Architecture, Data Warehousing, Data Modeling

Netezza to Azure Synapse Migration

A Fortune 20 healthcare company migrated its on-premises database appliances to Azure, including 4,000 tables spanning corporate data warehouse sources of truth and business unit (BU) analytic sandboxes.

I led the architecture of a corporate data lake and a metadata-driven system that lets business units subscribe to data and compute for their own Synapse workspaces. Data could be imported or linked as external tables, and each dedicated SQL pool could be sized and scheduled to meet the BU's needs.

I designed the metadata repository that controlled all workflow and processing of data from source to the corporate data lake to the BU analytic marts. I also managed the repository, developed the stored procedures that notebooks called, and adjusted the design as needed to meet business needs.
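As a rough illustration, the subscription metadata could live in a table like the following (all names here are hypothetical, not the actual repository schema):

    -- Hypothetical subscription metadata table (illustrative names only).
    CREATE TABLE meta.Subscription (
        SubscriptionId  INT IDENTITY(1, 1) NOT NULL,
        BusinessUnit    NVARCHAR(50)  NOT NULL, -- subscribing BU
        SourceSchema    NVARCHAR(128) NOT NULL, -- object in the corporate lake
        SourceTable     NVARCHAR(128) NOT NULL,
        DeliveryMode    NVARCHAR(20)  NOT NULL, -- 'IMPORT' or 'EXTERNAL'
        TargetWorkspace NVARCHAR(128) NOT NULL, -- BU Synapse workspace
        IsActive        BIT           NOT NULL DEFAULT 1
    );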

I wrote the generic code that implemented DDL for a new or changed subscription. Databricks notebooks created incremental data sets for loading into the business units' Synapse databases, and I created the Synapse stored procedures that dynamically generated the required PolyBase commands for fast loads. The system tracked loads in detail, including storing the generated dynamic SQL for easy debugging.
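A condensed sketch of that load pattern, assuming a meta.Subscription row per feed, an external table over the notebook-produced incremental file, and a meta.LoadLog table (all hypothetical names):

    CREATE PROCEDURE meta.LoadSubscription @SubscriptionId INT
    AS
    BEGIN
        DECLARE @sql NVARCHAR(MAX), @tbl NVARCHAR(128);

        SELECT @tbl = SourceTable
        FROM   meta.Subscription
        WHERE  SubscriptionId = @SubscriptionId;

        -- PolyBase load: read the incremental data through an external
        -- table and insert into the BU-facing table.
        SET @sql = N'INSERT INTO bu.' + QUOTENAME(@tbl)
                 + N' SELECT * FROM ext.' + QUOTENAME(@tbl + N'_incr') + N';';

        -- Store the generated statement before running it so a failed
        -- load can be debugged by inspecting the exact SQL.
        INSERT INTO meta.LoadLog (SubscriptionId, GeneratedSql, LoggedAt)
        VALUES (@SubscriptionId, @sql, SYSUTCDATETIME());

        EXEC sp_executesql @sql;
    END;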

The new design allows easy tracking of costs and business value of data.

Synapse Dedicated Pool Performance Tuning

A manufacturing company was failing to meet its data warehouse SLA because of slow-running loads. I applied best practices for SQL on a massively parallel processing system. A stored procedure for loading data that had been taking over four hours was rewritten to run in 20 minutes.

I also reviewed the entire database and made over 40 changes to table distribution models and storage types to align with Microsoft's best-practice recommendations, and I created scheduled maintenance routines to improve execution times.
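For example, changing a table's distribution in a dedicated pool is typically a CTAS and rename swap, followed by routine maintenance (table and column names here are illustrative):

    -- Rebuild a round-robin table as hash-distributed on its join key.
    CREATE TABLE dbo.FactSales_new
    WITH (DISTRIBUTION = HASH(CustomerKey), CLUSTERED COLUMNSTORE INDEX)
    AS SELECT * FROM dbo.FactSales;

    RENAME OBJECT dbo.FactSales TO FactSales_old;
    RENAME OBJECT dbo.FactSales_new TO FactSales;
    DROP TABLE dbo.FactSales_old;

    -- Typical scheduled maintenance: rebuild the columnstore to compact
    -- open row groups and refresh statistics after large loads.
    ALTER INDEX ALL ON dbo.FactSales REBUILD;
    UPDATE STATISTICS dbo.FactSales;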

At the end of the project, the customer had reduced their compute cost by 33% while improving execution times.

SQL EMR JSON Pre-authorization Processing

The company had an existing UI and back-end system to manage clients' prior medical authorizations. Larger hospital systems did not want to create authorizations manually through the UI; instead, they submitted EMR JSON records through a REST API that inserted the JSON as a blob into a staging table.

Different hospitals used different formats, so I created an Excel spreadsheet for analysts to map a specific customer's JSON to the transactional system. Once completed, the spreadsheet was imported as metadata into the SQL system.

Initial testing showed the translation logic could be built correctly as dynamic SQL. However, building that logic dynamically at runtime was too slow to keep up with 20,000 EMR JSON blobs per hour.

I pivoted the code from building dynamic SQL within procedures to leveraging the metadata to generate hard-coded procedures with all the logic in place. Once an analyst uploaded a new or corrected mapping, they could kick off a process that generated a new stored procedure for that JSON originator.

A master stored procedure fired for every JSON blob that hit staging; with the originating hospital available, it dynamically executed the matching custom procedure, passing in the ID of the JSON to be processed.
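A condensed sketch of that dispatch pattern (procedure, table, and column names are assumptions for illustration):

    CREATE PROCEDURE dbo.ProcessStagedJson @StageId INT
    AS
    BEGIN
        DECLARE @hospital NVARCHAR(50), @sql NVARCHAR(MAX);

        SELECT @hospital = OriginatingHospital
        FROM   dbo.JsonStage
        WHERE  StageId = @StageId;

        -- Each hospital has a generated, hard-coded translation procedure;
        -- the master procedure only resolves the name and dispatches.
        SET @sql = N'EXEC dbo.'
                 + QUOTENAME(N'usp_Translate_' + @hospital)
                 + N' @StageId = @Id;';

        EXEC sp_executesql @sql, N'@Id INT', @Id = @StageId;
    END;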

Enterprise Management of SQL Auditing and Data Masking

A corporation wanted to apply and manage standardized auditing practices across its SQL Servers. Auditing and data masking are often configured through SQL Server UIs, but they can also be scripted with T-SQL.

I created a system built around a low-powered, centralized management SQL Server. A metadata repository holds the audit requirements for each table, the server-level auditing to enable, and the data masking to apply and where.

Each night, the central server ensured all other servers had the current requirements applied, leveraging linked servers back to the centralized management server. If a new server had been added, the nightly job provisioned every aspect of its auditing and data masking.
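The per-server application boils down to standard T-SQL like the following, driven by the metadata (audit, specification, table, and column names are illustrative):

    -- Server-level audit writing to a file target.
    CREATE SERVER AUDIT CorpAudit
        TO FILE (FILEPATH = N'\\auditshare\sqlaudit\');
    ALTER SERVER AUDIT CorpAudit WITH (STATE = ON);

    -- Database-level audit specification capturing schema changes.
    CREATE DATABASE AUDIT SPECIFICATION CorpDbAudit
        FOR SERVER AUDIT CorpAudit
        ADD (SCHEMA_OBJECT_CHANGE_GROUP)
        WITH (STATE = ON);

    -- Dynamic data masking applied per the metadata repository.
    ALTER TABLE dbo.Customer
        ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');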

The central server also collected all audit logs from all servers nightly and combined them into a single high-level report listing any audit infractions (for example, a database administrator altering their own permissions), any changes to users, and high-level information about the number of users and the tables queried the previous day.

This system allowed a single DBA to manage multiple servers easily while ensuring corporate compliance.

Languages

SQL, T-SQL (Transact-SQL), Snowflake

Tools

Synapse, Azure Key Vault, Informatica PowerCenter, Microsoft Power BI, SQL Server BI, IBM Cognos

Paradigms

ETL, Database Design

Platforms

Azure Synapse, Azure SQL Data Warehouse, Azure, Dedicated SQL Pool (formerly SQL DW), SharePoint

Storage

Azure SQL, Microsoft Parallel Data Warehouse (PDW), Microsoft SQL Server, Data Lakes, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), SQL Server Analysis Services (SSAS), SQL Server 2014, Google Cloud, JSON

Other

Azure Data Lake, Microsoft Analytics Platform System (APS), Data Warehouse Design, Data Engineering, Data Architecture, Data Warehousing, Data Modeling, Azure Data Factory, Architecture, Query Optimization, Big Data, Serverless, Azure Analysis Services, Informatica, IBM InfoSphere, Informatica Data Quality, Computer Programming, SSRS Reports, Microsoft Data Transformation Services (now SSIS), SAP BusinessObjects (BO), AWS Cloud Architecture

Frameworks

Apache Spark, Axon Framework

1990 - 1992

Associate Degree in Computer Programming

Algonquin College - Ottawa, Ontario, Canada

JANUARY 2023 - PRESENT

AWS Solutions Architect Associate

Amazon Web Services

OCTOBER 2020 - PRESENT

Azure Database Administrator

Microsoft

OCTOBER 2020 - PRESENT

Azure Data Engineer

Microsoft

NOVEMBER 2012 - PRESENT

MCSA: SQL Server 2012/2014

Microsoft
