Hamish Gray, Developer in New Plymouth, Taranaki, New Zealand

Hamish Gray

Verified Expert in Engineering

Data Modelling and GCP Developer

Location
New Plymouth, Taranaki, New Zealand
Toptal Member Since
August 23, 2021

Hamish Gray is an experienced data professional and data geek. He enjoys working with people and their data and solving business problems. Hamish is a Google Cloud Platform specialist focused on data and analytics. He is a full-stack DataOps professional able to architect and build cloud data platforms, including full CI/CD delivery.

Portfolio

Powerco
GitLab CI/CD, Python, Tableau
Powerco
SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS)...
Zurich Financial Services Australia
IBM Db2, Informatica ETL, XML, erwin Data Modeler, MicroStrategy

Experience

Availability

Part-time

Preferred Environment

Google Cloud Platform (GCP), Google BigQuery, Terraform, SQL, Python, GitLab CI/CD, Data Modeling, Tableau

The most amazing...

...thing was starting with just a credit card and building a GCP platform that delivered over 300 builds to production in a year.

Work Experience

Senior Data Engineer

2017 - 2021
Powerco
  • Architected the Infrastructure as Code (IaC) in Terraform for the creation of Powerco's GCP environment and subsequent data lake platform, including IAM permissions, VPC host project, and subnet design.
  • Designed an on-premises-to-cloud data extraction tool in Python that runs scheduled extractions from ODBC databases, SharePoint, other web APIs, and file shares into GCP Cloud Storage for processing (a minimal sketch follows below).
  • Created and managed the CI/CD process for the platform, running more than 250 pushes to production in 2020.
  • Led the migration of GCP services and data from the USA to Australia, moving multiple buckets and datasets and re-establishing Cloud Composer services in the new region.
  • Implemented testing frameworks (pytest and Behave), applied at both build and deploy time of the platform.
  • Implemented Cloud Composer (Apache Airflow on GCP), including dynamic creation and disposal of dev/test environments for operational efficiency.
  • Led the cloud comparison assessing AWS vs. GCP for data and analytics, presenting the findings and recommending GCP.
  • Analyzed platform spend and implemented optimization strategies, leading to a 75% reduction in monthly cost.
Technologies: GitLab CI/CD, Python, Tableau
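
To illustrate the extraction tool described above, here is a minimal sketch of a config-driven ODBC-to-Cloud-Storage job. It assumes pyodbc, pandas, and google-cloud-storage are available; the DSN, query, bucket, and object names are hypothetical placeholders, not Powerco's actual configuration.

```python
"""Minimal sketch of a scheduled ODBC-to-GCS extraction (placeholder names only)."""
import pandas as pd
import pyodbc
from google.cloud import storage

# Hypothetical extraction config; the real tool read many of these on a schedule.
EXTRACTION = {
    "dsn": "DSN=onprem_warehouse",                 # ODBC connection string (placeholder)
    "query": "SELECT * FROM dbo.meter_reads",      # source query (placeholder)
    "bucket": "example-landing-bucket",            # GCS landing bucket (placeholder)
    "object_name": "meter_reads/extract.csv",      # landing object name (placeholder)
}

def run_extraction(cfg: dict) -> None:
    # Pull the source data over ODBC into a DataFrame.
    with pyodbc.connect(cfg["dsn"]) as conn:
        frame = pd.read_sql(cfg["query"], conn)

    # Write the result to Cloud Storage as CSV for downstream processing.
    client = storage.Client()
    blob = client.bucket(cfg["bucket"]).blob(cfg["object_name"])
    blob.upload_from_string(frame.to_csv(index=False), content_type="text/csv")

if __name__ == "__main__":
    run_extraction(EXTRACTION)
```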

BI Specialist

2015 - 2017
Powerco
  • Created and implemented an on-premises SQL Server and Python data integration engine loading flat files and API data into generic structures, the success of which led directly to the GCP platform.
  • Used the Python and SQL Server data integration platform to load and store files generically, requiring no code for initial loading and enabling analysts to load data without engineering support.
  • Used the Python and SQL platform to capture other API data, such as weather, smart home, and EV charging data, and supported research projects.
  • Led a number of hack days where business problems were tackled in a single day using the data integration engine and Jupyter notebooks.
  • Maintained and managed pre-existing data warehouse and reporting solutions within Powerco.
  • Led the upgrade of a SQL Server 2008 data warehouse to SQL Server 2016, including the migration and automated testing of over 1,000 SSIS packages.
  • Served as a key technical contact for the asset regulatory ledger reporting solution and asset modeling tools, used by Powerco to report key financial information, including the regulatory asset base and asset disclosure statements, to the authorities.
Technologies: SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), SQL Server Reporting Services (SSRS), SQL Server 2016, Tableau, Python

BI Consultant

2013 - 2015
Zurich Financial Services Australia
  • Was responsible for the data warehouse solution for the corporate portfolio management team, integrating a wide range of rich data from the underwriting application Z-streamXpress.
  • Produced strike rate reporting whose outputs directly contributed to a $1.5 million boost to BOP.
  • Worked on the ETL batch optimization program and delivery, making key business reports available before 8 am instead of the previous average of 4 pm.
  • Contributed to the overall architecture, ETL design, and analytics delivery; established the enterprise BI standards within Zurich Australia.
  • Designed and implemented the semantic layer used for actuarial analysis and monitoring, including complex business requirements for leakage and strike rate reporting.
  • Modeled and integrated industry datasets, including Redbook, Motor, and DUNS (Dun and Bradstreet).
Technologies: IBM Db2, Informatica ETL, XML, erwin Data Modeler, MicroStrategy

Technical Account Manager

2012 - 2013
Macquarie Bank
  • Was responsible for the service management of the MicroStrategy and TM1 applications in Macquarie Group.
  • Managed an onshore and offshore team, including recruitment and training.
  • Implemented a number of performance improvements for the heavily used 24x7 general ledger system.
  • Architected a new MicroStrategy project for regulatory reporting, leading a second team of developers.
Technologies: MicroStrategy, Sybase, SQL

Reporting Manager

2010 - 2012
Zurich Financial Services Australia
  • Was responsible for the management and leadership of core BI systems within Zurich Australia.
  • Worked as both the reporting manager and MicroStrategy administrator across general insurance, investments, and risk, and vertically across the sales, underwriting, actuarial, and claims functions.
  • Delivered key projects and BAU data warehouse and reporting activities.
Technologies: MicroStrategy, IBM Db2, Informatica ETL

Powerco Analytics Data Lake (PADL)

Delivered a Google Cloud Platform (GCP) data lake and cloud analytics solution for Powerco.

Using an Agile methodology to deliver product owner requirements, the GCP platform incrementally added functionality and unlocked value from previously unattainable data.

Engaging directly with a range of people across Powerco, from marketing to billing, operations, and asset management strategy, the team built a platform that is now widely relied upon across the business.

With a focus on config over code, the platform was able to onboard other "less experienced" developers who could design and implement their own data pipelines for their business area. This allowed the core team to focus on improving the platform, adding features that others could leverage.
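
To illustrate the config-over-code idea, here is a minimal sketch of how a business-area pipeline might be expressed as pure configuration and turned into a BigQuery load job. It assumes google-cloud-bigquery; the bucket, dataset, and table names are hypothetical, and the platform's actual configuration format is not shown on this profile.

```python
"""Illustrative config-over-code pipeline: configuration in, BigQuery load job out."""
from google.cloud import bigquery

# A pipeline contributed by a business area as pure configuration (placeholder values).
PIPELINE = {
    "source_uri": "gs://example-landing-bucket/billing/*.csv",
    "dataset": "billing_raw",
    "table": "invoices",
    "write_disposition": "WRITE_TRUNCATE",
}

def run_pipeline(cfg: dict) -> None:
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,  # infer the schema so no per-source code is needed
        write_disposition=cfg["write_disposition"],
    )
    table_id = f"{client.project}.{cfg['dataset']}.{cfg['table']}"
    # Load the landed files into the target table and wait for completion.
    client.load_table_from_uri(cfg["source_uri"], table_id, job_config=job_config).result()

if __name__ == "__main__":
    run_pipeline(PIPELINE)
```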

Python and SQL - Generic Data Ingestion Engine

The application loaded and parsed (via Python and Pandas) any text-based file into generic structures in a SQL Server database. Users would provide basic information, such as file name, type, and location, in the form of configuration, and the system would consume the file and parse it into a basic tabular format.
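
A minimal sketch of that config-driven approach, assuming pandas and SQLAlchemy with a SQL Server ODBC driver; the connection string, file path, and staging table name are hypothetical placeholders rather than the original engine's configuration:

```python
"""Minimal sketch: pandas parses a text file and writes it generically to SQL Server."""
import pandas as pd
from sqlalchemy import create_engine

# Configuration supplied by the user rather than code (placeholder values).
FILE_CONFIG = {
    "path": "incoming/sales_extract.csv",   # file location
    "file_type": "csv",                      # csv or tsv
    "target_table": "stg_sales_extract",     # generic staging table
}

# Placeholder connection string; the real engine pointed at the on-premises server.
ENGINE = create_engine(
    "mssql+pyodbc://user:password@server/db?driver=ODBC+Driver+17+for+SQL+Server"
)

def ingest(cfg: dict) -> None:
    sep = "\t" if cfg["file_type"] == "tsv" else ","
    # Read everything as text so any file lands without type errors.
    frame = pd.read_csv(cfg["path"], sep=sep, dtype=str)
    # Write to a generic staging table; pandas creates the table if needed.
    frame.to_sql(cfg["target_table"], ENGINE, if_exists="replace", index=False)

if __name__ == "__main__":
    ingest(FILE_CONFIG)
```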

This allowed the loading and analysis of multiple files relating to the same subject area without writing code. The application became the cornerstone of a number of "hack days" where users would bring their data and their required outcomes. Because data loading was automated, time could be spent on analysis and exploration.

The tool led to a number of business outcomes, including the discovery of a 450K error in pricing.

Pricing Inputs Model Solution

A Python and Flask app that combined billing and pricing information to produce immutable and auditable inputs for pricing modeling.

As part of an audit requirement, the key inputs used to calculate pricing information had to include lineage and logging. A Flask app was built in Python to accept and record user inputs and gather data from source systems. A number of business rules were then applied to create the datasets used as input to the pricing model.

This allowed the auditor to fully track how inputs were created, what data was used, and the logic applied.
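
A minimal sketch of the auditable-inputs idea, assuming Flask; the endpoint, field names, and append-only JSON log are illustrative placeholders, while the production app also gathered data from source systems and applied further business rules:

```python
"""Minimal sketch: a Flask endpoint records every submitted input for auditability."""
import json
from datetime import datetime, timezone

from flask import Flask, jsonify, request

app = Flask(__name__)
AUDIT_LOG = "pricing_inputs_audit.jsonl"  # placeholder append-only log

@app.route("/pricing-inputs", methods=["POST"])
def record_pricing_input():
    payload = request.get_json(force=True)
    record = {
        "received_at": datetime.now(timezone.utc).isoformat(),
        "submitted_by": payload.get("submitted_by"),  # placeholder field
        "inputs": payload.get("inputs", {}),
    }
    # Append-only log so every input remains immutable and traceable.
    with open(AUDIT_LOG, "a") as log:
        log.write(json.dumps(record) + "\n")
    return jsonify({"status": "recorded", "received_at": record["received_at"]}), 201

if __name__ == "__main__":
    app.run(debug=True)
```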

Z-Stream Express Data Warehouse

The project began with a batch optimization program, changing the delivery of key business reports from 4 pm to 8 am. This involved a 150-hour data migration process to redesign a key fact table.

In doing this, a number of inconsistencies were discovered in the data that required reloading. However, as the source system was part of a fast-flow underwriting system, access to the database was only available from 4 am to 6 am each day, requiring a careful recapture process.

New products and risks were constantly being added to the platform and needed to be modeled and delivered for actuarial analysis. To support better analysis, a semantic layer was created, including key business measures for strike rates and leakage (discounting).

Lastly, a sustainable support model was created, and operations were handled offshore.

Education

2011 - 2012

Master's Degree in Business Administration

University of New South Wales - Sydney, NSW, Australia

1995 - 1999

Bachelor's Degree in Business Administration

University of Waikato - Hamilton, New Zealand

Certifications

NOVEMBER 2019 - NOVEMBER 2021

GCP Professional Data Engineer

Google Cloud

SEPTEMBER 2017 - PRESENT

Introduction to Data Science in Python

Coursera

JANUARY 2017 - PRESENT

Big Data Specialization

Coursera

MAY 2016 - PRESENT

Machine Learning with Big Data

Coursera

Languages

SQL, Python, XML, GraphQL

Tools

Terraform, GitLab CI/CD, Informatica ETL, Cloud Dataflow, Tableau

Platforms

Google Cloud Platform (GCP), Firebase

Storage

SQL Server 2016, SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), SQL Server Reporting Services (SSRS), IBM Db2, JSON, Sybase

Other

Google BigQuery, MicroStrategy, Big Data, Data Modeling, Economics, Finance, People Management, Financial Management, Corporate Finance, erwin Data Modeler, Google Cloud Functions, Pub/Sub, Marketing Management, Machine Learning

Libraries/APIs

Pandas, NumPy

Frameworks

Flask
