
Patrick Cockwell

Verified Expert in Engineering

Full-stack Developer

Location: Chiang Mai, Thailand
Toptal Member Since: April 6, 2020

Patrick is a full-stack developer specializing in data engineering, ETL processes, database design and management, analytics, and infrastructure. He's worked with Apache Airflow and Kubernetes and is familiar with the GCP and AWS platforms. Patrick has strong attention to detail and prefers designing systems for automation and extensibility. He is a strong developer with knowledge of Python, Ruby, PHP, JavaScript, HTML, CSS, SQL, and Terraform.

Portfolio

OpenCraft
Amazon Web Services (AWS), SQL, Redux, React, AWS IAM, MongoDB Atlas, MongoDB...
Flyr Labs, Inc.
Data Engineering, Python, Kubernetes, Terraform, Automation Tools, DevOps, ETL...
Agari Data, Inc.
Terraform, DevOps, SQL, Datadog, Consul, CSS, HTML, JavaScript, PostgreSQL...

Experience

Availability

Part-time

Preferred Environment

Amazon Web Services (AWS), Python, Terraform, Google Cloud Platform (GCP), PostgreSQL, Apache Airflow, Ruby on Rails (RoR), Ruby, GitHub, Git, Slack, Sublime Text, MacOS

The most amazing...

...thing I've developed is a custom database schema used to normalize varying customer datasets into a common data model for use with data science applications.

Work Experience

Developer

2020 - 2020
OpenCraft
  • Developed Terraform modules to encapsulate infrastructure definitions and allow rapid infrastructure deployment for new clients.
  • Contributed to React and Redux web applications for edX front-end projects, and to Python and Django application logic for the core edX platform and other projects.
  • Assisted in implementing multiple monitoring and reporting mechanisms for OpenCraft systems to help track and improve client conversions and resource management.
Technologies: Amazon Web Services (AWS), SQL, Redux, React, AWS IAM, MongoDB Atlas, MongoDB, Terraform, Open Source, MySQL, Django REST Framework, Django, Python, GitHub, Automation Tools, Docker, DevOps, PostgreSQL

Data and DevOps Engineer

2018 - 2020
Flyr Labs, Inc.
  • Redesigned and rebuilt the Flyr infrastructure on GCP using Terraform to provide permissions-based security, data and application isolation, and ease of deployment.
  • Built an ETL platform from scratch using Python, Apache Airflow, and Kubernetes that runs dozens of jobs and transfers hundreds of GBs of data per day in a performant, idempotent manner (a simplified sketch of the idempotency pattern follows this entry).
  • Assisted in the design, development, and implementation of a custom data model to transform client data into a uniform format to be used by internal data science applications.
Technologies: Data Engineering, Python, Kubernetes, Terraform, Automation Tools, DevOps, ETL, Database Administration (DBA), Google BigQuery, SQL, Google Cloud Platform (GCP), Apache Airflow, GitHub, Cloud Architecture, Google Kubernetes Engine (GKE), Docker, Google Cloud Storage, PostgreSQL
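
The key to that platform's reliability was making every load idempotent. The sketch below is illustrative only (not the actual Flyr code): it shows a delete-then-write, partition-per-date pattern in a minimal Airflow DAG, with an in-memory dictionary standing in for the warehouse.

"""Illustrative sketch: an idempotent daily load. Re-running a date replaces
that date's partition, so retries and backfills leave the target in the same
state as a single successful run."""
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# In-memory stand-in for a date-partitioned warehouse table.
PARTITIONS = {}  # execution date (YYYY-MM-DD) -> list of normalized rows


def extract(ds):
    # Placeholder extract/transform; a real task would pull the client's
    # feed for `ds` and normalize the records here.
    return [{"date": ds, "bookings": 42}]


def load_partition(ds):
    # Delete-then-write keeps the load idempotent: the partition ends up
    # identical no matter how many times the task runs for this date.
    PARTITIONS[ds] = extract(ds)


with DAG(
    dag_id="client_feed_daily_load",
    start_date=datetime(2020, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="load_partition",
        python_callable=load_partition,  # Airflow injects `ds` from the task context
    )

Scoping each run to a single date partition is what lets the scheduler retry or backfill freely without duplicating data.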

Software/DevOps Developer

2016 - 2018
Agari Data, Inc.
  • Performed extensive IP address space research to ensure that the internal data stores were correct.
  • Guided the design, development, and execution of a disaster recovery plan for the AWS infrastructure to ensure the stability, redundancy, and recoverability of data, infrastructure, and applications.
  • Compared logging solutions for long-term log storage, parsing, and searching.
  • Developed multiple web interfaces and supported an application for an email cybersecurity tool.
Technologies: Terraform, DevOps, SQL, Datadog, Consul, CSS, HTML, JavaScript, PostgreSQL, Ruby on Rails (RoR), Disaster Recovery Plans (DRP), GitHub, Cloud Architecture, Logging, Amazon S3 (AWS S3), Amazon Web Services (AWS)

Software Engineer

2015 - 2016
Breeze
  • Developed multiple integrations with complex external APIs to gather data ranging from credit reports to vehicle locations and mileage.
  • Managed and enhanced the entirety of the data and analytics infrastructure.
  • Built tools to improve the accuracy and reliability of reports to external interests and parties.
Technologies: Python, DevOps, ETL, Database Administration (DBA), SQL, PostgreSQL, Heroku, GitHub, Automation Tools

Projects

Airflow ETL Platform and Canonical Data Model

I helped design, vet, and implement a robust data model for use with data from various clients in the airline industry.

Each client has its own internal representation of the data, which is transmitted to the organization's data stores on varying schedules and frequencies via an Airflow ETL platform. The data is then processed and stored in the canonical data model (CDM) so that the machine learning (ML) models built by the data science team can operate on it.

This CDM is shared across all clients and requires idempotent, client-specific transformations.
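
As a rough illustration of the approach (the field names and client mappings below are hypothetical, not the actual airline schema), each client contributes only a small mapping function, while every downstream consumer sees one canonical record type:

"""Illustrative sketch of per-client normalization into a canonical data model (CDM)."""
from dataclasses import dataclass
from datetime import date


@dataclass(frozen=True)
class Booking:
    """One canonical record shared by every downstream ML model."""
    flight_number: str
    departure_date: date
    fare_usd: float


def normalize_client_a(row):
    # Client A ships uppercase column names and ISO dates.
    return Booking(
        flight_number=row["FLT_NO"],
        departure_date=date.fromisoformat(row["DEP_DT"]),
        fare_usd=float(row["FARE"]),
    )


def normalize_client_b(row):
    # Client B splits carrier and flight number and reports fares in cents.
    return Booking(
        flight_number=f'{row["carrier"]}{row["flight"]}',
        departure_date=date.fromisoformat(row["departure"]),
        fare_usd=row["fare_cents"] / 100,
    )


NORMALIZERS = {"client_a": normalize_client_a, "client_b": normalize_client_b}


def to_cdm(client, rows):
    """Apply the client-specific mapping; the output schema is identical for all clients."""
    return [NORMALIZERS[client](row) for row in rows]

Because each mapping is a pure function of the source row, re-running it over the same input yields the same canonical records, which keeps the transformations idempotent.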

Complex Infrastructure Management System with Terraform

I remodeled the entirety of the company infrastructure on Google Cloud Platform to provide increased security, faster deployment of new infrastructure, data and infrastructure isolation, and fine-grained permissions. This was done with Terraform: multiple reusable modules were combined into reusable segments of infrastructure, which were in turn combined into full infrastructure deployments.

Disaster Recovery Plan

I helped model and modify an existing company infrastructure in Amazon Web Services (AWS) to follow best practices for highly available, redundant systems, so that the infrastructure, applications, and data could demonstrably be sustained through an infrastructure outage.

This involved replicating data across AWS availability zones (AZs) and changing the infrastructure to spread load across multiple AZs. Once the infrastructure changes and data replication were in place, a full availability-zone failure was simulated by terminating or stopping all resources in a single AZ. We demonstrated that the system remained resilient and that no data loss occurred.
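
A drill like that can be scripted; the sketch below is a minimal, hypothetical version (the zone name and dry-run default are placeholders, not the actual Agari tooling) that stops every running EC2 instance in one availability zone so the remaining AZs can be observed carrying the load:

"""Illustrative sketch: simulate a single-AZ failure with boto3."""
import boto3
from botocore.exceptions import ClientError

AZ_TO_FAIL = "us-east-1a"  # hypothetical zone under test


def instances_in_az(ec2, az):
    """Return IDs of all running instances in the given availability zone."""
    ids = []
    paginator = ec2.get_paginator("describe_instances")
    pages = paginator.paginate(
        Filters=[
            {"Name": "availability-zone", "Values": [az]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )
    for page in pages:
        for reservation in page["Reservations"]:
            ids.extend(i["InstanceId"] for i in reservation["Instances"])
    return ids


def simulate_az_failure(az, dry_run=True):
    ec2 = boto3.client("ec2")
    ids = instances_in_az(ec2, az)
    print(f"Stopping {len(ids)} running instances in {az} (dry_run={dry_run})")
    if not ids:
        return
    try:
        ec2.stop_instances(InstanceIds=ids, DryRun=dry_run)
    except ClientError as err:
        # With DryRun=True, AWS reports "would have succeeded" as a
        # DryRunOperation error instead of actually stopping anything.
        if err.response["Error"]["Code"] != "DryRunOperation":
            raise


if __name__ == "__main__":
    simulate_az_failure(AZ_TO_FAIL)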

Orion Analytics Platform

https://github.com/gree/Orion
I built a visualization tool using PHP, HTML, and JavaScript for time-series data stored in a Graphite data store.

The tool enabled users to select among multiple graphing capabilities, including data breakdowns across metric subcategories and period lookbacks (e.g., week-over-week comparisons).
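
The week-over-week view boils down to asking Graphite's render API for a metric twice: once as-is and once shifted back seven days. The original tool was PHP and JavaScript; the Python sketch below (with a placeholder host and metric name) only illustrates the shape of that request:

"""Illustrative sketch of a week-over-week lookback against Graphite's render API."""
import requests

GRAPHITE = "http://graphite.example.com"   # placeholder host
METRIC = "payments.appstore.success_rate"  # placeholder metric


def week_over_week(metric):
    """Fetch the last 24 hours of a metric alongside the same window a week earlier."""
    targets = [
        metric,
        # timeShift() overlays last week's values on today's time axis,
        # which is what makes the week-over-week comparison readable.
        f'timeShift({metric},"7d")',
    ]
    response = requests.get(
        f"{GRAPHITE}/render",
        params={"target": targets, "from": "-24h", "format": "json"},
        timeout=10,
    )
    response.raise_for_status()
    # Each series: {"target": ..., "datapoints": [[value, timestamp], ...]}
    return {series["target"]: series["datapoints"] for series in response.json()}


if __name__ == "__main__":
    for name, points in week_over_week(METRIC).items():
        print(name, points[:3])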

Users were authenticated using one of multiple methods (OAuth being the primary one). They could also create and save dashboards for use across the company, add external links, and drill down into an individual graph or dataset.

This tool was used to help diagnose an App Store payments outage in June 2012 and to notify Apple before they were aware of the outage. The Orion Analytics Platform was developed at Funzio, which was purchased by Gree International in May 2012, and was open sourced in August/September 2012.

Tools

GitHub, JSX, Terraform, Apache Airflow, Git, Sublime Text, Slack, MongoDB Atlas, Logging, Google Kubernetes Engine (GKE), AWS IAM

Languages

Python, SQL, HTML, CSS, Ruby, JavaScript, TypeScript, PHP

Libraries/APIs

jQuery, React, Amazon Rekognition, REST APIs

Paradigms

DevOps, ETL

Platforms

Google Cloud Platform (GCP), Amazon Web Services (AWS), Kubernetes, Docker, MacOS, Amazon Alexa, AWS Lambda, Heroku

Storage

PostgreSQL, JSON, Amazon S3 (AWS S3), Google Cloud Storage, Datadog, MongoDB, Database Administration (DBA), MySQL, Redis

Other

Data Engineering, Cloud Architecture, Google BigQuery, CSV, Automation Tools, Consul, Open Source, Disaster Recovery Plans (DRP), Analytics, APIs, Amazon API Gateway, Amazon Route 53, Google Pub/Sub

Frameworks

Django, Django REST Framework, Ruby on Rails (RoR), Redux

Education

2009 - 2014

Bachelor's Degree in Software Engineering

University of Waterloo - Waterloo, Ontario, Canada
