
Fernando Ferrer

Data Engineer and Developer

Location
Toronto, ON, Canada
Toptal Member Since
June 18, 2020

Fernando is a data engineer and SQL expert with domain knowledge in finance and healthcare. He has built companies and data systems for well-known organizations such as the US Olympic Committee and Queen's University. Fernando's work in medical research was pivotal in securing over $50 million in investment, and he co-founded an innovative DaaS healthcare company that was later acquired by a multibillion-dollar conglomerate.


Portfolio

Hendrick Autoguard
MongoDB, PyMongo, SQL
Pfizer - PGS Operations Insights
SQL, Data Engineering, Python, ETL, Amazon Web Services (AWS), Python 3
Continuum Health Ventures
Data Analytics, Big Data, Data Lakes, Amazon Web Services (AWS)


Availability

Full-time

Preferred Environment

Amazon Aurora, MySQL, PostgreSQL, Linux

The most amazing...

...thing I've created was a healthcare analytics company, built from scratch and later sold to a multibillion-dollar conglomerate.

Work Experience

2023 - 2023

MongoDB Architect

Hendrick Autoguard
  • Advised a team of engineers on designing and implementing a data lake to serve as the centralized data repository.
  • Created a NoSQL data model to integrate multiple data sources (a minimal merge pattern is sketched below).
  • Advised the client on data management best practices.
Technologies: MongoDB, PyMongo, SQL
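
A minimal sketch of that source-merge pattern, using PyMongo upserts keyed on a shared identifier. The connection string, database, and field names are illustrative assumptions, not the client's schema:

```python
# Fold records from several source systems into one canonical MongoDB
# document per vehicle, keyed on the VIN. All names are hypothetical.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
vehicles = client["warranty"]["vehicles"]

def upsert_source_record(source: str, record: dict) -> None:
    """Merge one source system's record into the canonical document."""
    vehicles.update_one(
        {"vin": record["vin"]},                   # shared natural key
        {"$set": {f"sources.{source}": record}},  # namespace fields per source
        upsert=True,                              # create the document if new
    )

upsert_source_record("dms", {"vin": "1HGCM82633A004352", "model": "Accord"})
```
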
2022 - 2023

Data Engineer

Pfizer - PGS Operations Insights
  • Developed a data warehouse and data pipelines to ingest logistics and pharmaceutical data.
  • Reduced ETL pipeline complexity by redesigning the data model and process.
  • Improved data quality by adding QA checks to the pipeline to ensure consistency and accuracy (see the sketch below).
Technologies: SQL, Data Engineering, Python, ETL, Amazon Web Services (AWS), Python 3
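
A minimal sketch of the kind of QA gate described above: validate each batch before it is loaded. The thresholds and column names are hypothetical, not Pfizer's schema:

```python
# Reject inconsistent or incomplete batches before the load step.
import pandas as pd

def qa_gate(batch: pd.DataFrame) -> pd.DataFrame:
    """Raise on a bad batch; return it unchanged if clean."""
    if batch.empty:
        raise ValueError("QA failed: empty batch")
    if batch["shipment_id"].duplicated().any():
        raise ValueError("QA failed: duplicate shipment IDs")
    null_rate = batch["quantity"].isna().mean()
    if null_rate > 0.01:  # tolerate at most 1% missing quantities
        raise ValueError(f"QA failed: {null_rate:.1%} null quantities")
    return batch
```
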
2021 - 2023

Data Advisor

Continuum Health Ventures
  • Automated due diligence processes to reduce cost, applying NLP to financial and pitch documents for context extraction (illustrated below).
  • Completed due diligence using data analytics tools, benchmarking targets against the universe of potential investments.
  • Integrated multiple data feeds into our data lake to benchmark potential investments.
Technologies: Data Analytics, Big Data, Data Lakes, Amazon Web Services (AWS)
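
One plausible shape for that context extraction, sketched with spaCy's off-the-shelf NER rather than the firm's actual models; it assumes the small English model is installed (python -m spacy download en_core_web_sm):

```python
# Group named entities (organizations, money, dates) found in a document.
import spacy

nlp = spacy.load("en_core_web_sm")

def extract_context(text: str) -> dict[str, list[str]]:
    """Return entities from a pitch or financial document, grouped by label."""
    doc = nlp(text)
    context: dict[str, list[str]] = {}
    for ent in doc.ents:
        context.setdefault(ent.label_, []).append(ent.text)
    return context

print(extract_context("Acme Corp raised $4 million in March 2021 to expand into telehealth."))
```
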
2021 - 2022

Senior Data Architect

Goodwater Capital
  • Led the migration of the entire environment to Google Cloud Platform (GCP) while managing a team of two data engineers and one data scientist.
  • Moved over 20 TB of data to BigQuery (the load pattern is sketched below), set up ETL pipelines using Airflow, tuned queries, and designed the BI architecture.
Technologies: Google Cloud Platform (GCP), BigQuery, Apache Airflow, Big Data, SQL, ETL, Python 3
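
The BigQuery bulk-load pattern behind a migration like this, via the official google-cloud-bigquery client. The bucket, project, dataset, and table names are placeholders:

```python
# Load Parquet exports from Cloud Storage into a BigQuery table.
from google.cloud import bigquery

client = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)
load_job = client.load_table_from_uri(
    "gs://example-bucket/exports/events-*.parquet",  # placeholder bucket
    "example-project.analytics.events",              # placeholder table
    job_config=job_config,
)
load_job.result()  # block until the load job finishes
print(client.get_table("example-project.analytics.events").num_rows, "rows loaded")
```
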
2017 - 2020

Corporate Training Director

NobleProg
  • Trained the Royal Canadian Navy in performing data analytics on AWS.
  • Coached a team of DBAs and database developers working for the federal government of Mexico in PostgreSQL administration.
  • Instructed a team at Logitech's Asia headquarters in testing and tuning PostgreSQL queries.
  • Developed the curriculum and syllabus for the following technical areas: PostgreSQL, database development and administration, AWS cloud computing, GCP cloud computing, and MariaDB administration.
  • Trained multiple clients in data visualization using Power BI.
Technologies: Amazon Web Services (AWS), PostgreSQL, Oracle, MongoDB, MySQL, PHP, SQL
2017 - 2020

Technical Director

Immutable Data, Inc.
  • Developed data warehouses for various clients, including ETL jobs, data lakes, and data pipelines. Designed and administered MySQL clusters as well as PostgreSQL instances.
  • Built a custom sports analytics platform for the US Olympic Committee using MySQL, PostgreSQL, Snowflake, and Python. Using AWS DMS, I created an ETL pipeline and used Snowpipe to move raw data dumps into a virtualized Snowflake data lake (the staged-load step is sketched after this role).
  • Set up a data warehouse for a professional social media platform, designing a pipeline that ingested data from MySQL and PostgreSQL and combined them into the warehouse.
  • Performed DBA tasks on MySQL and PostgreSQL instances to ensure reliability and availability, including troubleshooting slow queries, configuring buffers, and provisioning hardware and networking equipment.
  • Created ETL jobs using the Etlworks platform (etlworks.com), performing multiple transformations from raw PostgreSQL dumps into Redshift.
  • Hired a team of seasoned data scientists to support client operations.
  • Developed dashboards for multiple clients using Power BI, Metabase, Tableau, and Looker.
Technologies: Amazon Web Services (AWS), Snowflake, Relational Database Services (RDS), PostgreSQL, MySQL, Redshift, Python, SQL, Metabase, Data Visualization, Big Data, Google Cloud Platform (GCP), Google BigQuery, BigQuery, ETL
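
For the Snowflake pipeline above, Snowpipe continuously runs a COPY INTO against a stage. A hedged sketch of that staged-load statement, issued here through snowflake-connector-python; the credentials, stage, and table names are placeholders:

```python
# Execute the COPY INTO that Snowpipe automates for newly staged files.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account", user="example_user", password="...",
    warehouse="LOAD_WH", database="SPORTS", schema="RAW",  # placeholders
)
try:
    conn.cursor().execute("""
        COPY INTO raw_sensor_readings
        FROM @athlete_data_stage/daily/
        FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
    """)
finally:
    conn.close()
```
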
2017 - 2017

Senior Data Engineer

Androdon Capital, Inc.
  • Collected 20 years of currency trading data and inserted it into a PostgreSQL database.
  • Created a custom Python ETL process to aggregate the raw financial data into per-currency-pair datasets (sketched below).
  • Developed a high-frequency trading algorithm using C#.
Technologies: Python, MySQL, PostgreSQL, SQL, ETL
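
An illustrative version of that aggregation step: resampling raw currency ticks into one-minute OHLC bars with pandas. The input file and column names are hypothetical:

```python
# Turn tick-level prices into one-minute open/high/low/close bars.
import pandas as pd

ticks = pd.read_csv("eurusd_ticks.csv", parse_dates=["timestamp"])  # hypothetical file
ticks = ticks.set_index("timestamp").sort_index()

bars = ticks["price"].resample("1min").ohlc()            # OHLC per minute
bars["volume"] = ticks["volume"].resample("1min").sum()  # traded volume per bar
print(bars.head())
```
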
2013 - 2017

Co-founder

Q2 Metrics
  • Created a physician influence tracking tool using over 1.2 billion records from both government and private companies. The transactional nature of the application was powered by a cluster of five MySQL servers.
  • Optimized the MySQL cluster to ingest over 1 billion records and serve sub-second queries under more than 1,000 concurrent connections (one application-side technique is sketched below).
  • Managed a high-performing team of full-stack developers.
  • Negotiated the sale of the company to a multibillion-dollar conglomerate.
Technologies: PostgreSQL, Oracle, MongoDB, MySQL, PHP, SQL, ETL, Amazon Web Services (AWS)
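
One application-side piece of sustaining that concurrency is connection pooling, shown here with mysql-connector-python. The pool size, DSN values, and table names are illustrative, not the production configuration:

```python
# Reuse a fixed pool of connections instead of reconnecting per request.
from mysql.connector import pooling

pool = pooling.MySQLConnectionPool(
    pool_name="pi_pool",
    pool_size=32,  # reuse connections across many app workers
    host="db-cluster.example.internal", user="app",
    password="...", database="physician_influence",  # placeholders
)

def top_physicians_by_procedures(limit: int = 10) -> list[tuple]:
    conn = pool.get_connection()  # borrow from the pool
    try:
        cur = conn.cursor()
        cur.execute(
            "SELECT npi, procedure_count FROM physician_stats "
            "ORDER BY procedure_count DESC LIMIT %s",
            (limit,),
        )
        return cur.fetchall()
    finally:
        conn.close()  # returns the connection to the pool
```
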
2008 - 2013

Clinical Applications Developer

Queen's University
  • Built the first online patient randomization system, a revolutionary way to enroll patients into blinded randomized controlled trials. The system was powered by our on-premises three-node MySQL cluster.
  • Developed an open-source longitudinal in-patient electronic data capture system in conjunction with Vanderbilt University.
  • Constructed a custom electronic data collection system for the largest nutrition-based randomized in-patient clinical trial.
  • Performed sysadmin tasks on a cluster of Unix servers running MySQL, including patching, troubleshooting slow queries, and ensuring uptime.
Technologies: Unix, PostgreSQL, MySQL, PHP, SQL, ETL
2007 - 2008

Oracle Developer

National Cancer Institute of Canada
  • Developed Oracle Forms to collect and analyze clinical trial data.
  • Implemented a proprietary EDC system called Medidata Rave.
  • Migrated historical clinical trial data from a legacy system into the new system.
Technologies: C#, Oracle

Experience

Physician Influence Tool

http://www.q2metrics.com/pi
A custom online tool that tracked how each physician in the US was performing in their field compared to their peers.
The tool tracked:
• Number of procedures performed
• Amount of billables
• Published articles
• Shared patients
• Social media feeds
• Conference speakers
The goal of the tool was to show clients which physicians were influencers so that they could pitch their products to the right people.

Data Warehouse and Pipeline for the US Olympic Committee

The high-performance division of the US Olympic Committee asked my team and me to develop an internal tool to collect and analyze athletes' performance data and predict possible injuries.

We collected wearable and sensor data, questionnaire responses, and medical records and combined them into a data warehouse.
The orchestration was done using Airflow, and the warehouse was built on Redshift; a condensed sketch of the DAG pattern follows.
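
A condensed sketch of that orchestration pattern in modern Airflow: an extract → merge → load DAG. The task names and bodies are placeholders, not the committee's actual pipeline:

```python
# Daily DAG: pull sensor data, merge sources, load the Redshift warehouse.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_wearables(): ...  # placeholder task bodies
def merge_sources(): ...
def load_redshift(): ...

with DAG(
    dag_id="athlete_performance",
    start_date=datetime(2020, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_wearables", python_callable=extract_wearables)
    merge = PythonOperator(task_id="merge_sources", python_callable=merge_sources)
    load = PythonOperator(task_id="load_redshift", python_callable=load_redshift)
    extract >> merge >> load  # set task dependencies
```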

High-frequency Trading Algorithm

I developed a high-frequency trading algorithm for a boutique financial firm in Toronto. The algorithm monitored current pricing and trends across different currency pairs and executed trades based on a predictive model (a toy version of the signal logic is sketched below).
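
For illustration only, a toy stand-in for the signal logic: a moving-average crossover on a price series. The firm's actual predictive model is proprietary and not shown:

```python
# Emit +1 (long), -1 (short), or 0 per bar from two moving averages.
import pandas as pd

def crossover_signal(prices: pd.Series, fast: int = 5, slow: int = 20) -> pd.Series:
    """Return +1 (long), -1 (short), or 0 for each bar."""
    fast_ma = prices.rolling(fast).mean()
    slow_ma = prices.rolling(slow).mean()
    # NaN comparisons are False, so the warm-up window yields 0 (no signal).
    return (fast_ma > slow_ma).astype(int) - (fast_ma < slow_ma).astype(int)
```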

Central Randomization System

https://ceru.hpcvl.queensu.ca/CRS/
Conducting a randomized controlled trial across multiple centers in multiple geographic areas presents a unique set of challenges. Traditionally, sealed envelopes are used to randomize the next patient. When multiple centers are involved, that strategy can't guarantee a balanced set of arms.
That's where the central randomization system comes in.

This online tool guarantees patients are enrolled in a blind, double-blind, or balanced-arm study from anywhere in the world. It provides tiered access levels that allow a patient to be unblinded after a serious adverse event while preserving the blinding of the overall study. One standard balancing technique is sketched below.
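
A minimal sketch of permuted-block randomization, a standard technique for keeping arms balanced across sites. It illustrates the idea only and is not the CRS codebase:

```python
# Assign patients in shuffled fixed-size blocks so arms stay balanced.
import random

def permuted_blocks(arms=("treatment", "control"), block_size=4):
    """Yield assignments block by block; counts per arm equalize each block."""
    assert block_size % len(arms) == 0, "block must divide evenly among arms"
    while True:
        block = list(arms) * (block_size // len(arms))
        random.shuffle(block)
        yield from block

assignments = permuted_blocks()
print([next(assignments) for _ in range(8)])  # balanced within each block of 4
```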

Google Cloud Badges

https://google.qwiklabs.com/public_profiles/8a94e88a-e3bb-4578-9036-de3a9881d308
Here are the Google Qwiklabs badges and quests that I've obtained thus far.
• BigQuery Basics for Data Analysts (Earned Jun 15, 2020)
• Baseline: Infrastructure (Earned May 15, 2020)
• Google Cloud Essentials (Earned May 13, 2020)

Skills

Languages

SQL, Python, Python 3, PHP, Snowflake, C#, R

Tools

BigQuery, Apache Airflow, Microsoft Power BI, Sisense

Paradigms

ETL, Data Science, DevOps, Agile

Platforms

Amazon Web Services (AWS), Jupyter Notebook, Linux, Oracle, Unix, Google Cloud Platform (GCP)

Storage

Redshift, PostgreSQL, MySQL, Databases, NoSQL, Data Lakes, MongoDB, Google Cloud, Amazon Aurora, Data Lake Design, Google Cloud SQL

Industry Expertise

Healthcare

Other

Data Warehousing, Data Modeling, Data Engineering, Fintech, Finance, Education, Big Data, Data Visualization, Metabase, Data Warehouse Design, Data Analytics, Dashboards, Dashboard Development, Lecturing, Higher Education, Relational Database Services (RDS), Web Marketing, SEO Tools, Ads, Google BigQuery, Business, Business Models, Software Development, Software Deployment, Data, Data Architecture

Libraries/APIs

D3.js, PyMongo

Frameworks

Ruby on Rails (RoR), Flask, Laravel, Hadoop

Education

2021 - 2022

Postgraduate Certificate in Digital Marketing

Power Business School - Madrid, Spain

2020 - 2020

Master of Business Administration (MBA) Degree in Business

PowerMBA - Madrid, Spain

2015 - 2015

Continuing Education Diploma in Big Data Analytics

MIT xPRO - Online

2014 - 2014

Continuing Education Diploma in Big Data Analytics

Caltech via Coursera - Online

2006 - 2008

Advanced Diploma (Graduated with Distinction) in Computer Programmer Analyst

St. Lawrence College - Kingston, Ontario, Canada

2002 - 2006

Bachelor of Science Degree in Computer Engineering

URBE University - Maracaibo, Venezuela

Certifications

MARCH 2023 - PRESENT

Palantir Foundry Foundations

Palantir

NOVEMBER 2019 - NOVEMBER 2024

AWS Associate Engineer

AWS

JUNE 2015 - PRESENT

Big Data Fundamentals

IBM

FEBRUARY 2015 - PRESENT

MongoDB Certified DBA Associate

MongoDB University