Ted Tasker
Verified Expert in Engineering
Data Engineer and Software Developer
Columbus, OH, United States
Toptal member since November 2, 2022
Ted has over 25 years of experience working with Microsoft SQL and data products, with a focus on data warehousing. He has worked primarily in Azure for the last six years, and his current focus is the Synapse Analytics workspace. Often in the role of lead architect, Ted is very familiar with all the supporting Azure technologies needed to deliver a modern data platform in a secure, performant, cost-efficient way. He is a certified Azure Data Engineer and Azure Database Administrator.
Preferred Environment
Azure Synapse, Azure SQL, Azure Data Factory, Azure Data Lake, Microsoft SQL Server, Azure SQL Data Warehouse, Dedicated SQL Pool (formerly SQL DW)
The most amazing...
...code I've written is a utility to tune an MPP database, which led to an approved patent for systems and methods to distribute data and reduce computing load.
Work Experience
Senior Data Architect | Columbus Data and AI Practice Manager
Insight
- Developed a utility to monitor, test, recommend, and apply changes to optimize performance and lower compute cost on a Synapse dedicated pool. Insight filed for a patent, and I was listed as the sole inventor.
- Acted as architect and developer lead for the migration of an on-premises, complex, UI-driven ETL/DW system to an Azure-hosted PaaS modern data platform using the Synapse workspace. Reduced costs by matching the right compute to each workload.
- Designed and loaded a corporate data lake and relational database as the single source of truth with a metadata-driven subscription service. Business units can subscribe to data they need and have it auto-published into their own Synapse workspace.
Senior Data Architect
York Risk
- Acted as a technical lead for a team formed to implement a suite of Informatica products as a data platform for all of York. The team oversaw strategy and initial implementation related to MDM, EDW, data ingestion, and data governance.
- Implemented automated SQL Server setup processes to configure encryption, auditing, scheduled backups, and security reporting. This allowed me to act as the primary DBA over 12 instances of SQL Server with only limited oversight time required.
- Led MDM implementation, handling numerous entities like insurers, carriers, customers, and adjusters. Sourced from multiple claim systems and HR, accounting, and CRM systems. Conducted rule creation and extension of base Informatica MDM model.
SVP Data Analytics
PriorAuthNow (rebranded Rhyme)
- Created a DB system to integrate hospital EMR JSON by leveraging customer-specific mapping and translation metadata to programmatically generate customer-specific stored procedures to consume 20,000 messages per hour on an inexpensive Azure SQL DB.
- Built a rule engine using transactional data and third-party data to determine if prior authorization was required. Nightly centralized processing pushed rules to the OLTP SQL system; real-time adjustments to rules based on transaction details.
- Developed a comprehensive set of dashboards, reports, and self-service analytics. Developed all aspects of ETL, data model design, and reporting. Azure SQL DBs and App Insights via data lake were the primary sources of all data for the DW.
Chief Technology Officer
BI Voyage Consulting
- Acted as the CTO of a small Microsoft Gold partner specializing in DW and BI. My duties included technical oversight of all projects, responsibility for technical staffing, and very often acting as lead architect on services engagements.
- Served as an APS/PDW industry trainer. I was contracted multiple times by Microsoft and HP to deliver APS training in Canada, the US, and Central and South America. The Synapse dedicated pool is based on APS/PDW, so it's still very relevant.
- Designed and developed an Inmon approach DW with the centralized repository residing on Hadoop and the data marts being hosted on PDW with BI based on SharePoint and SQL 2012. The client was the Digital Crimes Unit, an internal unit of Microsoft.
Technology Solution Professional
Microsoft
- Led a proof of concept that resulted in the world's first sale of Microsoft PDW. I was responsible for all aspects: migrating the schema, loading and testing data, concurrency, and performance tuning. The POC was done on a 68TB data mart.
- Created a T-SQL tool that automates the migration of an SMP SQL Server database to PDW. The tool generated an optimized PDW schema, exported data via BCP, and generated load scripts for PDW. Microsoft raised the limit from ten tables for a migration POC to 100.
- Assisted the customer with a FastTrack DW (SQL/HP) appliance, which replaced an existing DB2 data warehouse that was 5TB in size. Focused on performance tuning and assistance with partitions and partition switching strategies to optimize load.
FVP Business Intelligence
Countrywide Home Mortgage
- Architected and developed a solution for HR strategic and operational scorecards. The solution leveraged Microsoft PerformancePoint, SharePoint, Analysis Services, and SSIS.
- Designed a generic KPI mart approach that provided a framework for managing and tracking KPIs that led to simple maintainability and detailed reporting.
- Led the approval process to get Microsoft PerformancePoint and SharePoint approved by CTO and architecture review board as corporate solutions. This included C-level presentations, POC, and significant research and documentation.
FVP Data Warehouse
IndyMac Bank
- Established and managed all aspects of database architecture for the first two releases of a 9-release, $20-million program to implement a new loan origination system with data modeling and mapping, SQL dev, BizTalk, design process, and procedures.
- Led a department of 14 developers and analysts to support internal business units by providing daily operational reporting and strategic decision-making analytics. Designed and deployed a digital dashboard available to 2,500 employees.
- Architected and implemented the company data bus (BizTalk, SSIS, and SQL DB) to allow the exchange and consolidation of key OLTP data across internal and external systems. Eliminated complexity and improved consistency of corporate data operations.
Director of Business Systems Applications
Walt Disney Internet Group
- Served as the chief architect and project manager of a Kimball approach data warehouse with data marts such as online marketing, eCommerce, site traffic, and ad serving. The system went live in production the same day SQL 7 was released.
- Acted as the department lead with the following direct reports—ETL team lead, BusinessObjects team lead, data modeling team lead, and ad management team lead. Led maintenance and expansion of DW and business intelligence platforms.
- Created a metadata reporting solution. Developers created a stored procedure and added basic metadata, and the system rendered reports with a generic ASP page. The solution met the business's immediate needs, allowing time to develop a mature DW.
Experience
Netezza to Azure Synapse Migration
I led the architecture approach of a corporate data lake and a metadata-driven system for business units to subscribe to data and compute for their own Synapse workspace. Data could be imported or linked as external, and the DB pool could be sized and scheduled to meet each BU's needs.
Designed the metadata repository that controlled all workflow and processing of data from source to corporate data lake to BU analytic marts. I also managed the repository, developed stored procedures that notebooks called, and adjusted the design as needed to meet business needs.
Wrote the generic code to implement DDL for a new or changed subscription. Databricks notebooks would create incremental data sets for loading into the business unit Synapse DBs. I created the Synapse stored procedures, which dynamically generated the required PolyBase commands for fast loads. The system had detailed load tracking, including storing the generated dynamic SQL for easy debugging.
The new design allows easy tracking of costs and business value of data.
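The dynamically generated load procedures described above can be sketched as a metadata-driven routine. This is a hedged sketch, not the actual implementation: the object names (etl.Subscription, etl.LoadLog) are hypothetical, and COPY INTO is used as the load command, assuming a Synapse dedicated pool target.

```sql
-- Hedged sketch: etl.Subscription and etl.LoadLog are hypothetical names.
CREATE PROCEDURE etl.LoadSubscriptionTable
    @SubscriptionId INT
AS
BEGIN
    DECLARE @TargetTable NVARCHAR(256),
            @SourcePath  NVARCHAR(1024),
            @Sql         NVARCHAR(MAX);

    -- Pull the target table and incremental-file path from the metadata repository.
    SELECT @TargetTable = TargetTable,
           @SourcePath  = IncrementalFilePath
    FROM   etl.Subscription
    WHERE  SubscriptionId = @SubscriptionId;

    -- Build the load statement dynamically from the metadata.
    SET @Sql = N'COPY INTO ' + @TargetTable +
               N' FROM ''' + @SourcePath + N'''' +
               N' WITH (FILE_TYPE = ''PARQUET'');';

    -- Store the generated SQL before executing so failed loads are easy to debug.
    INSERT INTO etl.LoadLog (SubscriptionId, GeneratedSql, StartedAt)
    VALUES (@SubscriptionId, @Sql, SYSUTCDATETIME());

    EXEC sp_executesql @Sql;
END;
```

Logging the generated statement before execution mirrors the debugging approach described above: when a load fails, the exact SQL that ran is already on record.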
Synapse Dedicated Pool Performance Tuning
I reviewed the entire database and made over 40 changes to table distribution models and storage types to align with best practice recommendations from Microsoft. I also created scheduled maintenance routines to improve execution times.
At the end of the project, the customer was able to reduce their compute cost by 33% while maintaining query performance.
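A review like this typically begins by finding tables whose distribution model diverges from Microsoft's guidance. A minimal sketch using the dedicated pool's catalog views (these views are real; the round-robin filter is just one example check):

```sql
-- Find round-robin tables that may be candidates for hash distribution
-- or replication, per Microsoft's distribution guidance.
SELECT  s.name AS schema_name,
        t.name AS table_name,
        dp.distribution_policy_desc
FROM    sys.tables t
JOIN    sys.schemas s ON s.schema_id = t.schema_id
JOIN    sys.pdw_table_distribution_properties dp
        ON dp.object_id = t.object_id
WHERE   dp.distribution_policy_desc = 'ROUND_ROBIN';
```

Each candidate would then be evaluated against its query patterns before changing the distribution key, since a bad hash column introduces data skew.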
SQL EMR JSON Pre-authorization Processing
Different hospitals had different formats, so I created an Excel spreadsheet for analysts to map a specific customer's JSON to the transactional system. Once completed, the spreadsheet was imported as metadata into the SQL system.
Initial testing showed the logic for translation could be built correctly as dynamic SQL. However, the performance of dynamic building logic was too slow to keep up with 20,000 EMR JSON blobs an hour.
I pivoted the code from building dynamic SQL within procedures to leveraging the metadata to generate hard-coded procedures with all the logic in place. Once an analyst uploaded a new mapping or corrected one, they could kick off a process that would generate a new stored procedure based on the originator of the JSON.
A master stored procedure fired for every JSON blob that hit the staging table, with the originating hospital recorded alongside it; the master procedure would dynamically execute the custom procedure, passing in the ID of the JSON to be processed.
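The master-dispatcher pattern described above can be sketched as follows. The table and procedure names (stage.EmrJson, dbo.Process_&lt;HospitalCode&gt;) are hypothetical; only the dispatch mechanism reflects the description.

```sql
-- Hedged sketch: stage.EmrJson and the per-hospital procedure naming
-- convention are hypothetical.
CREATE PROCEDURE dbo.ProcessStagedJson
    @JsonId BIGINT
AS
BEGIN
    DECLARE @Hospital SYSNAME,
            @Proc     NVARCHAR(300),
            @Sql      NVARCHAR(400);

    -- Each staged blob records which hospital it originated from.
    SELECT @Hospital = HospitalCode
    FROM   stage.EmrJson
    WHERE  JsonId = @JsonId;

    -- The mapping upload generated one hard-coded procedure per hospital,
    -- so no translation logic is built dynamically at message time.
    SET @Proc = QUOTENAME(N'dbo') + N'.' + QUOTENAME(N'Process_' + @Hospital);
    SET @Sql  = N'EXEC ' + @Proc + N' @JsonId = @JsonId;';

    -- Dispatch to the customer-specific procedure with the blob's ID.
    EXEC sp_executesql @Sql, N'@JsonId BIGINT', @JsonId = @JsonId;
END;
```

Because the per-hospital procedures are pre-generated, the only dynamic SQL at runtime is the one-line dispatch, which keeps per-message overhead low at high throughput.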
Enterprise Management of SQL Auditing and Data Masking
I created a system using a low-powered centralized management SQL Server. A metadata repository holds which audit requirements apply to which tables, what server-level auditing to enable, and what data masking to apply and where.
The central server would ensure all other servers had current requirements applied by leveraging linked servers to the centralized management server nightly. If a new server had been added, the nightly job would provision all aspects of setting up the auditing and data masking.
The central server also collected all audit logs from all servers nightly and combined them into a single high-level report listing any audit infractions (for example, a database administrator altering their own permissions), any changes to users, and high-level information about the number of users and tables queried the previous day.
The creation of this system allowed a single DBA to manage multiple servers simply and easily, ensuring corporate compliance.
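The nightly audit collection could be sketched like this, assuming server audits write to files. The linked-server name ([CentralMgmt]) and target table are hypothetical; sys.fn_get_audit_file is the standard function for reading SQL Server audit files.

```sql
-- Hedged sketch: [CentralMgmt] (linked server), AuditDb.dbo.AuditLog, and the
-- audit file path are hypothetical. Runs as a nightly job on each managed server.
INSERT INTO [CentralMgmt].AuditDb.dbo.AuditLog
        (SourceServer, EventTime, ActionId, PrincipalName, ObjectName)
SELECT  @@SERVERNAME,
        event_time,
        action_id,
        server_principal_name,
        object_name
FROM    sys.fn_get_audit_file(N'D:\Audits\*.sqlaudit', DEFAULT, DEFAULT)
WHERE   event_time >= DATEADD(DAY, -1, SYSUTCDATETIME());
```

Centralizing the raw records this way lets the management server run a single reporting pass over all servers instead of querying each one for infractions.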
Education
Associate Degree in Computer Programming
Algonquin College - Ottawa, Ontario, Canada
Certifications
AWS Solutions Architect Associate
Amazon Web Services
Azure Database Administrator
Microsoft
Azure Data Engineer
Microsoft
MCSA: SQL Server 2012/2014
Microsoft
Skills
Tools
Synapse, Azure Key Vault, Informatica PowerCenter, Microsoft Power BI, SQL Server BI, IBM Cognos
Languages
SQL, T-SQL (Transact-SQL), Snowflake
Paradigms
ETL, Database Design
Platforms
Azure Synapse, Azure SQL Data Warehouse, Azure, Dedicated SQL Pool (formerly SQL DW), SharePoint
Storage
Azure SQL, Microsoft Parallel Data Warehouse (PDW), Microsoft SQL Server, Data Lakes, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), SQL Server Analysis Services (SSAS), SQL Server 2014, Google Cloud, JSON
Frameworks
Apache Spark, Axon Framework
Other
Azure Data Lake, Microsoft Analytics Platform System (APS), Data Warehouse Design, Data Engineering, Data Architecture, Data Warehousing, Data Modeling, Azure Data Factory, Architecture, Query Optimization, Big Data, Serverless, Azure Analysis Services, Informatica, IBM InfoSphere, Informatica Data Quality, Programming, SSRS Reports, Microsoft Data Transformation Services (now SSIS), SAP BusinessObjects (BO), AWS Cloud Architecture, FastTrack