
Christopher Copeland-Gray

Verified Expert in Engineering

Database Developer

London, United Kingdom

Toptal member since March 9, 2021

Bio

Chris has 20 years of experience providing data services to companies in a range of sectors. He excels in SQL development and in data architecture, modeling, and governance. He has been involved in several large system developments, including a pan-European data warehouse, a system to manage the movement of hydrocarbons worldwide, and the from-scratch design and development of an Azure-based data platform, along with the team to support and maintain it.

Portfolio

NHS Property Services
Azure, Azure SQL Databases, Business Intelligence (BI), Database Design...
Shell
Microsoft SQL Server, Oracle, Database Modeling, Logical Database Design
Gazprom
SQL, Microsoft SQL Server, SQL Server Integration Services (SSIS)...

Experience

  • SQL - 20 years
  • Business Intelligence (BI) - 17 years
  • Database Design - 14 years
  • Microsoft SQL Server - 12 years
  • Database Modeling - 12 years
  • Azure SQL Databases - 5 years
  • Azure - 5 years
  • Data Governance - 4 years

Availability

Part-time

Preferred Environment

Windows

The most amazing...

...project I've developed is an Azure enterprise data platform, built from scratch for a 6,000-strong organization, that integrated with line-of-business systems and M365.

Work Experience

Enterprise Data Architect

2016 - 2021
NHS Property Services
  • Designed and largely built the enterprise data platform: justified the concept, hired the development team, integrated all the major line-of-business systems, and built reconciliations between the sources and the ODS.
  • Initiated and led the data quality, data governance, and M365 strategies.
  • Developed the low-code strategy, integrating Power Apps and Power Automate into the existing Azure technology stack and using Teams as a single pane of glass alongside SharePoint integration.
  • Founded and grew the architecture function to lead the solution design for all demand via the technical design authority, including the configuration of the enterprise metamodel in Abacus and architectural reporting.
  • Identified cost savings in the range of £2 million over three years by merging different solutions onto the same infrastructure, decommissioning applications that duplicated capabilities, and championing development of functionality that solved multiple needs.
  • Defined the method for creating a unified taxonomy to catalog data artifacts across structured, semi-structured, and unstructured data sets, including the infrastructure that allowed the business to search for terms across all sources.
  • Modeled the business to produce the enterprise logical model (over 250 entities), ensured traceability of this model to the ODS, and integrated it with the EA tool (Abacus).
  • Designed and ensured the implementation of data lineage between distributed cloud storage products (data lake files, SharePoint, blob storage, and relational data stores), full auditing of each operation, and proactive monitoring of each component (see the auditing sketch after this section).
  • Defined the data landscape, then specified and delivered the three-year data strategy that moved the organization from an Excel-based estate to one incorporating cloud databases, data lakes, and MDM capabilities.
Technologies: Azure, Azure SQL Databases, Business Intelligence (BI), Database Design, Database Modeling, Data Governance, Data Quality Analysis
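
As an illustration of the per-operation auditing described above, here is a minimal sketch of the kind of audit table such a platform might use. The schema, table, and column names (etl.OperationAudit, PipelineName, and so on) are hypothetical, not taken from the actual implementation.

    -- Hypothetical audit table: one row per ETL operation, enabling
    -- lineage queries from source object through to the ODS.
    CREATE TABLE etl.OperationAudit (
        AuditId       bigint IDENTITY(1,1) PRIMARY KEY,
        PipelineName  nvarchar(200) NOT NULL,  -- orchestrator pipeline name
        SourceSystem  nvarchar(100) NOT NULL,  -- originating LOB system
        SourceObject  nvarchar(400) NOT NULL,  -- file path or table name
        TargetObject  nvarchar(400) NOT NULL,  -- ODS table loaded
        RowsRead      bigint        NOT NULL,
        RowsWritten   bigint        NOT NULL,
        StartedAtUtc  datetime2     NOT NULL,
        FinishedAtUtc datetime2     NULL,
        Status        nvarchar(20)  NOT NULL   -- 'Succeeded' / 'Failed'
    );

    -- A lineage question then becomes a simple query, e.g., "which
    -- source objects fed this ODS table in the last 24 hours?"
    SELECT SourceSystem, SourceObject, RowsWritten, FinishedAtUtc
    FROM etl.OperationAudit
    WHERE TargetObject = N'ods.Property'
      AND FinishedAtUtc >= DATEADD(hour, -24, SYSUTCDATETIME());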

Data Architect

2012 - 2016
Shell
  • Migrated a dozen systems (SQL Server and Oracle) into a global system managing the assurance side of the shipping business for oil, chemical, and LNG products, including ownership of the logical and physical models.
  • Analyzed the architectural options for redesigning group working capital and cash utilization reporting, then developed the solution I recommended.
  • Redesigned the reporting layer for the budgeting warehouse, which allowed the development of accurate sub-second reports and a comprehensive auditing strategy.
  • Developed database deployment automation strategies and built the time-saving architecture to apply them at crunch points, allowing one person to deploy ten builds a day across seven environment layers with full master data consistency.
  • Provided rapid solutions to data problems; for example, when code in pre-production started to fail at 17:00 on a Friday with the production release due the next day, I resolved a timeout by tuning a query that had been taking 30 minutes so that it ran in one second (a sketch of this kind of tuning follows this section).
Technologies: Microsoft SQL Server, Oracle, Database Modeling, Logical Database Design
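
The details of that Friday-evening fix aren't given in the profile, but a common cause of this kind of 30-minutes-to-one-second improvement is a non-sargable predicate forcing a full scan. A minimal, hypothetical illustration (table and column names invented):

    -- Before (hypothetical): functions applied to the column stop the
    -- optimizer from seeking an index, so the whole table is scanned.
    SELECT TradeId, Amount
    FROM dbo.Trade
    WHERE YEAR(TradeTimestamp) = 2016
      AND MONTH(TradeTimestamp) = 3;

    -- After: an equivalent open-ended range on the bare column is
    -- sargable and can use an index on TradeTimestamp directly.
    SELECT TradeId, Amount
    FROM dbo.Trade
    WHERE TradeTimestamp >= '20160301'
      AND TradeTimestamp <  '20160401';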

Data Warehouse Architect and Business Intelligence Developer

2011 - 2011
Gazprom
  • Enhanced the existing data warehouse ETL and data mart design for the integration project that delivered automated transfer of profit/loss positions and AP/AR transactions into SAP.
  • Implemented changes to the Endur exposure and cash flow SSIS and SSAS OLAP solutions, which allowed the requirements to be delivered to aggressive timescales.
  • Developed, in a day, a proof-of-concept SQL-based ETL pipeline to load source system data; it was also used to identify reference data quality issues early in the project.
  • Designed and implemented the reconciliation subsystem via T-SQL and SSRS reports that analyzed the source systems, ODS, and SAP before reporting errors to the business (a sketch of the core comparison follows this section).
  • Refactored queries that had been executing in 2-3 hours to run in less than one minute by changing the query structures and using materialization to handle growing data volumes.
Technologies: SQL, Microsoft SQL Server, SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS)
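
The reconciliation subsystem itself isn't reproduced here, but its core pattern is standard: aggregate each side to a common grain, then full-outer-join the results so that missing and mismatched rows surface together. A minimal T-SQL sketch with invented table and column names:

    -- Hypothetical reconciliation between a source system and the ODS,
    -- compared at the (TradeDate, BookId) grain.
    WITH src_agg AS (
        SELECT TradeDate, BookId, SUM(Amount) AS SrcAmount
        FROM stg.SourcePosition
        GROUP BY TradeDate, BookId
    ),
    ods_agg AS (
        SELECT TradeDate, BookId, SUM(Amount) AS OdsAmount
        FROM ods.Position
        GROUP BY TradeDate, BookId
    )
    SELECT COALESCE(s.TradeDate, o.TradeDate) AS TradeDate,
           COALESCE(s.BookId, o.BookId)       AS BookId,
           s.SrcAmount,
           o.OdsAmount
    FROM src_agg s
    FULL OUTER JOIN ods_agg o
        ON  o.TradeDate = s.TradeDate
        AND o.BookId    = s.BookId
    WHERE s.SrcAmount IS NULL          -- in the ODS but not the source
       OR o.OdsAmount IS NULL          -- in the source but not the ODS
       OR s.SrcAmount <> o.OdsAmount;  -- present in both but different

An SSRS report over a result set like this is then enough to surface the errors to the business.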

Data Warehouse Lead Architect

2007 - 2010
E.ON Energy Trading
  • Held technical responsibility for all warehousing projects, which consisted of four streams and twenty resources across three countries. Led the design to transform the UK trading system into a pan-European, multi-terabyte solution.
  • Designed the ODS data structures and reporting data marts, using multiple fact tables and over a dozen dimension tables at multiple grains, holding millions of rows under continuous update (a simplified star-schema sketch follows this section).
  • Created a solution to demonstrate to the business that the transformed data was 100% accurate and that aggregated values, at multiple grains, matched the source values exactly.
  • Redesigned the data marts to accommodate major changes in requirements and provided 24/7 availability, handling reporting across multiple time zones.
  • Performed technical oversight for the creation of a data warehouse calculating trade P&L, exposure, and overall positions, incorporating asset and transportation costs from multiple data sources.
  • Created the historical data load mechanism, taking into consideration the multitude of data sources (including Oracle and SQL Server), achieving a repeatable load while handling hardware limitations (one node with two CPUs and limited disk space).
Technologies: SQL, Oracle, Oracle Warehouse Builder (OWB), C#, Microsoft SQL Server
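
To make the dimensional design concrete, here is a heavily simplified star-schema sketch in generic SQL, in the spirit of that warehouse. The names (fact_position, dim_book, and so on) and columns are invented for illustration; the real model had multiple fact tables and over a dozen dimensions.

    -- Hypothetical, minimal star schema: one fact table at the
    -- (date, book, commodity) grain, plus two of its dimensions.
    CREATE TABLE dim_date (
        date_key     int  NOT NULL PRIMARY KEY,  -- e.g., 20090315
        calendar_day date NOT NULL
    );

    CREATE TABLE dim_book (
        book_key     int         NOT NULL PRIMARY KEY,
        book_code    varchar(20) NOT NULL,
        trading_desk varchar(50) NOT NULL,
        region       varchar(30) NOT NULL
    );

    CREATE TABLE fact_position (
        date_key      int           NOT NULL REFERENCES dim_date (date_key),
        book_key      int           NOT NULL REFERENCES dim_book (book_key),
        commodity_key int           NOT NULL,
        pnl_amount    decimal(19,4) NOT NULL,
        exposure_mwh  decimal(19,4) NOT NULL,
        PRIMARY KEY (date_key, book_key, commodity_key)
    );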

Database Developer

2004 - 2007
GCHQ
  • Developed and administered two very large data warehouses running on several 32-CPU nodes in a RAC cluster. Managed the large-scale ETL design using parallel loading, external tables, and SQL*Loader (an external-table sketch follows this section).
  • Brought significant performance tuning experience to the data warehouses, e.g., writing project-critical queries joining large amounts of data across three databases and resolving an existing hierarchy of legacy views that had inefficient hints built into them.
  • Developed Java software, PL/SQL, and UNIX scripts to interact with these databases for business-specific OLAP processes.
  • Improved legacy systems; for example, I extended the lifetime of a multi-terabyte database by improving the ETL load time by a factor of six.
Technologies: SQL, Oracle, Microsoft SQL Server, SQL Performance
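
External tables let Oracle treat flat files as read-only tables, so bulk data can be loaded with plain, parallel SQL rather than row-by-row inserts. A minimal, hypothetical sketch of the pattern (directory, file, and table names invented):

    -- Hypothetical Oracle external table over a delimited flat file.
    CREATE TABLE events_ext (
        event_id   NUMBER,
        event_time VARCHAR2(19),
        payload    VARCHAR2(4000)
    )
    ORGANIZATION EXTERNAL (
        TYPE ORACLE_LOADER
        DEFAULT DIRECTORY etl_dir   -- a pre-created DIRECTORY object
        ACCESS PARAMETERS (
            RECORDS DELIMITED BY NEWLINE
            FIELDS TERMINATED BY '|'
            MISSING FIELD VALUES ARE NULL
        )
        LOCATION ('events_20060101.dat')
    )
    PARALLEL
    REJECT LIMIT UNLIMITED;

    -- Direct-path insert into the warehouse staging table, converting
    -- the raw timestamp text as it goes.
    INSERT /*+ APPEND */ INTO stg_events (event_id, event_time, payload)
    SELECT event_id,
           TO_DATE(event_time, 'YYYY-MM-DD HH24:MI:SS'),
           payload
    FROM events_ext;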

Experience

Azure Data Architecture

I created an enterprise data platform that integrated six major line-of-business systems into the operational data store and through to the dimensional layer, provided data governance and data quality analysis, and supported a Power BI reporting framework. The system supported full version history, replay capabilities, and data lineage analysis of each data item from the source through the data lake and data warehouse to the Power BI report(s). It also included the design of a reporting layer that provided management information and curated BI reports, as well as self-service BI capabilities. A sketch of one way to provide the version history follows below.
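
The profile doesn't say how the version history was implemented; one idiomatic way to get it on Azure SQL is a system-versioned temporal table, sketched below with invented names. Every update or delete automatically preserves the prior row version, which is what makes point-in-time replay queries possible.

    -- Hypothetical system-versioned (temporal) table on Azure SQL.
    CREATE TABLE ods.Supplier (
        SupplierId   int           NOT NULL PRIMARY KEY,
        SupplierName nvarchar(200) NOT NULL,
        Status       nvarchar(20)  NOT NULL,
        ValidFrom    datetime2 GENERATED ALWAYS AS ROW START NOT NULL,
        ValidTo      datetime2 GENERATED ALWAYS AS ROW END   NOT NULL,
        PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
    )
    WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = ods.SupplierHistory));

    -- Replay: the table exactly as it stood at a given instant.
    SELECT SupplierId, SupplierName, Status
    FROM ods.Supplier
    FOR SYSTEM_TIME AS OF '2020-06-01T00:00:00';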

Pan-European Data Warehouse

When combining several subsidiaries' trading operations into a single energy trading business, a large European energy provider asked me to design and build their new trading data warehouse. This extracted data from all the subsidiary systems spread across Europe and brought it together into a single dimensional data model (with 14 dimensions). This was performance-tuned so that the P&L position was refreshed every 15 minutes, including reconciliation between all sources and the data marts. The system collated position history so that data extracts could be replayed easily when errors were detected. A sketch of the kind of incremental extract that makes a 15-minute refresh feasible follows below.
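
Refreshing every 15 minutes generally rules out full reloads; the standard pattern is a high-watermark incremental extract. A minimal, hypothetical T-SQL sketch (table and column names invented, not from the actual system):

    -- Hypothetical watermark table: one row per source feed.
    CREATE TABLE etl.Watermark (
        SourceName    varchar(50) NOT NULL PRIMARY KEY,
        LastExtracted datetime2   NOT NULL
    );

    -- Each 15-minute cycle pulls only rows changed since the last run,
    -- then advances the watermark in the same transaction.
    BEGIN TRANSACTION;

    DECLARE @since datetime2 =
        (SELECT LastExtracted FROM etl.Watermark
         WHERE SourceName = 'UK_TRADES');
    DECLARE @now datetime2 = SYSUTCDATETIME();

    INSERT INTO stg.Trade (TradeId, BookId, Amount, ModifiedAt)
    SELECT TradeId, BookId, Amount, ModifiedAt
    FROM src.Trade
    WHERE ModifiedAt > @since
      AND ModifiedAt <= @now;

    UPDATE etl.Watermark
    SET LastExtracted = @now
    WHERE SourceName = 'UK_TRADES';

    COMMIT;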

Large Database Tuning Project

I was asked to analyze a legacy system that was scheduled to be decommissioned due to its inability to load data at modern transfer rates. After a forensic analysis, I was able to tweak the ETL process and the data structures to support faster load and reporting times. With the ETL running six times as fast as before, it was agreed that the lifetime of the system would be extended by two years.

Financial Working Capital Reporting

I provided desk-level reporting on working capital for a major FTSE constituent. This included business and data analysis (to understand the existing, undocumented data processing), SQL development (to create data stores for intermediate processed data), and new working capital reports. During this process, I uncovered $100 million that had been missed from desk-level reports due to an unrecognized quirk in the existing report filtering. A hypothetical illustration of how a filter can silently drop rows follows below.
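
The actual quirk isn't described in the profile; a classic example of this failure mode, though, is SQL's three-valued logic, where a negative filter silently excludes rows whose column is NULL. A hypothetical sketch:

    -- Hypothetical: intended to exclude only closed positions, but any
    -- row with Status = NULL is also dropped, because NULL <> 'CLOSED'
    -- evaluates to UNKNOWN rather than TRUE.
    SELECT DeskId, SUM(WorkingCapital) AS Total
    FROM dbo.Position
    WHERE Status <> 'CLOSED'
    GROUP BY DeskId;

    -- Fix: state the NULL case explicitly so those rows are counted.
    SELECT DeskId, SUM(WorkingCapital) AS Total
    FROM dbo.Position
    WHERE Status <> 'CLOSED' OR Status IS NULL
    GROUP BY DeskId;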

Data Modeling for Hydrocarbon Transportation

I designed two-thirds of a data model (over 200 entities) for a new system to automate the compliance processes for approving cargo movement from one port to another. This involved the verification of over 10,000 data points per trip, alerting on the risk implied by different types of cargo across each regulatory area, itinerary, vessel, port, and management organization. I was also responsible for designing the process for adding, updating, and moving master data through all seven environment stacks, and for the overall deployment process. A sketch of the kind of idempotent master-data promotion that supports this follows below.
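
Promoting master data through many environment stacks is usually done with idempotent upsert scripts, so the same script can run safely in every stack. A minimal, hypothetical T-SQL sketch (names and values invented):

    -- Hypothetical: promote the cargo-type reference list into the
    -- current environment. Running it twice leaves the target unchanged.
    MERGE ref.CargoType AS tgt
    USING (VALUES
        (1, 'CRUDE',    'High'),
        (2, 'LNG',      'High'),
        (3, 'CHEMICAL', 'Medium')
    ) AS src (CargoTypeId, Code, RiskBand)
        ON tgt.CargoTypeId = src.CargoTypeId
    WHEN MATCHED AND (tgt.Code <> src.Code OR tgt.RiskBand <> src.RiskBand)
        THEN UPDATE SET Code = src.Code, RiskBand = src.RiskBand
    WHEN NOT MATCHED BY TARGET
        THEN INSERT (CargoTypeId, Code, RiskBand)
             VALUES (src.CargoTypeId, src.Code, src.RiskBand)
    WHEN NOT MATCHED BY SOURCE
        THEN DELETE;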

Education

2003 - 2004

Master's Degree in Advanced Computer Science

University of Manchester - Manchester, UK

2000 - 2003

Bachelor of Science Degree (Honors) in Software Engineering

Birmingham City University - Birmingham, UK

Certifications

APRIL 2020 - APRIL 2022

Azure Solution Architect Expert

Microsoft

APRIL 2019 - PRESENT

TOGAF 9.2

The Open Group

APRIL 2012 - PRESENT

Microsoft Certified Professional

Microsoft

NOVEMBER 2010 - PRESENT

Microsoft Certified IT Professional: Database

Microsoft

NOVEMBER 2010 - PRESENT

Microsoft Certified Technology Specialist

Microsoft

APRIL 2007 - PRESENT

Oracle SQL Expert

Oracle

APRIL 2006 - PRESENT

Oracle Certified Professional

Oracle

Skills

Tools

Microsoft Power BI, Oracle Warehouse Builder (OWB), Azure Logic Apps

Languages

SQL, Java, C#

Paradigms

Business Intelligence (BI), Database Design, ETL

Storage

Microsoft SQL Server, Database Modeling, Azure SQL Databases, SQL Performance, Oracle SQL, SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), Oracle 12c, SQL Server 2014

Platforms

Azure, Oracle, Windows, Azure Functions, SAP HANA

Frameworks

TOGAF

Other

Solution Design, Logical Database Design, Solutioning, Software Design, Data Governance, Data Quality Analysis, Artificial Intelligence (AI)
