Fernando Ferrer

Data Engineer and Developer in Toronto, ON, Canada

Member since June 18, 2020
Fernando is a data engineer and SQL expert with domain knowledge in finance and healthcare. He has built companies and data systems for well-known organizations such as the US Olympic Committee and Queen's University. Fernando's work in medical research was pivotal in securing over $50 million in investment, and he co-founded an innovative DaaS healthcare company that was later acquired by a multibillion-dollar conglomerate.

Portfolio

  • NobleProg
    Amazon Web Services (AWS), PostgreSQL, Oracle, MongoDB, MySQL, PHP
  • Goodwater Capital
    Google Cloud Platform (GCP), BigQuery, Apache Airflow, Big Data, SQL
  • Immutable Data, Inc.
    Amazon Web Services (AWS), Snowflake, Relational Database Services (RDS)...

Experience

Location

Toronto, ON, Canada

Availability

Part-time

Preferred Environment

Amazon Aurora, MySQL, PostgreSQL, Linux

The most amazing...

...thing I've created from scratch was a healthcare analytics company that was later sold to a multibillion-dollar conglomerate.

Employment

  • Corporate Training Director

    2017 - PRESENT
    NobleProg
    • Trained the Royal Canadian Navy in data analytics on AWS.
    • Coached a team of DBAs and database developers working for the federal government of Mexico in PostgreSQL administration.
    • Instructed a team at Logitech's Asia headquarters in testing and tuning PostgreSQL queries (a sample tuning exercise is sketched below).
    • Developed the curriculum and syllabus for the following technical areas: PostgreSQL, database development and administration, AWS cloud computing, GCP cloud computing, and MariaDB administration.
    • Trained multiple clients in data visualization using Power BI.
    Technologies: Amazon Web Services (AWS), PostgreSQL, Oracle, MongoDB, MySQL, PHP
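
    Much of the query-tuning training revolves around reading execution plans before and after adding an index. A minimal sketch of that kind of exercise, assuming psycopg2 and a hypothetical orders table:

        import psycopg2

        conn = psycopg2.connect("dbname=training user=trainer")  # hypothetical DSN
        conn.autocommit = True

        def show_plan(cur, query):
            # EXPLAIN ANALYZE executes the query and reports actual row counts
            # and timings, which is where most tuning conversations start.
            cur.execute("EXPLAIN (ANALYZE, BUFFERS) " + query)
            for (line,) in cur.fetchall():
                print(line)

        with conn.cursor() as cur:
            query = "SELECT * FROM orders WHERE customer_id = 42"
            show_plan(cur, query)  # typically a sequential scan at this point
            cur.execute(
                "CREATE INDEX IF NOT EXISTS idx_orders_customer "
                "ON orders (customer_id)"
            )
            show_plan(cur, query)  # the plan should now show an index scan
        conn.close()
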
  • Senior Data Architect

    2021 - 2022
    Goodwater Capital
    • Migrated the entire environment to Google Cloud Platform (GCP) while managing a team of three data professionals: two data engineers and one data scientist.
    • Moved over 20 TB of data to BigQuery, set up an ETL pipeline using Airflow (sketched below), tuned queries, and designed the BI architecture.
    Technologies: Google Cloud Platform (GCP), BigQuery, Apache Airflow, Big Data, SQL
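
    A minimal sketch of the GCS-to-BigQuery load pattern orchestrated with Airflow; the bucket, dataset, and table names are hypothetical, and it assumes the apache-airflow-providers-google package:

        from datetime import datetime
        from airflow import DAG
        from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
            GCSToBigQueryOperator,
        )

        with DAG(
            dag_id="gcs_to_bigquery_daily",
            start_date=datetime(2021, 1, 1),
            schedule_interval="@daily",
            catchup=False,
        ) as dag:
            # Load each day's Parquet exports from GCS into a BigQuery table.
            load_events = GCSToBigQueryOperator(
                task_id="load_events",
                bucket="example-raw-exports",
                source_objects=["events/{{ ds }}/*.parquet"],
                source_format="PARQUET",
                destination_project_dataset_table="analytics.events",
                write_disposition="WRITE_APPEND",
            )
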
  • Technical Director

    2017 - 2020
    Immutable Data, Inc.
    • Developed data warehouses, ETL jobs, data lakes, and data pipelines for various clients. Designed and administered MySQL clusters as well as PostgreSQL instances.
    • Built a custom sports analytics platform for the US Olympic Committee using MySQL, PostgreSQL, Snowflake, and Python. Created an ETL pipeline with AWS DMS and used Snowpipe to move raw data dumps into a virtualized Snowflake data lake (see the sketch below).
    • Set up a data warehouse for a professional social media platform, designing a pipeline that ingested data from MySQL and PostgreSQL and combined it into the warehouse.
    • Performed DBA tasks on MySQL and PostgreSQL instances to ensure reliability and availability, including troubleshooting slow queries, configuring buffers, and provisioning hardware and networking equipment.
    • Created ETL jobs using the Etlworks platform (etlworks.com), performing multiple transformations from raw PostgreSQL dumps into Redshift.
    • Hired a team of seasoned data scientists to support client operations.
    • Developed dashboards for multiple clients using Power BI, Metabase, Tableau, and Looker.
    Technologies: Amazon Web Services (AWS), Snowflake, Relational Database Services (RDS), PostgreSQL, MySQL, Redshift, Python, SQL, Metabase, Data Visualization, Big Data, Google Cloud Platform (GCP), BigQuery
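
    The Snowpipe work followed the standard stage-and-pipe pattern; a minimal sketch with hypothetical stage, table, and bucket names, assuming the snowflake-connector-python package (a real external S3 stage would also need credentials or a storage integration):

        import os
        import snowflake.connector

        conn = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
            database="RAW",
            schema="PUBLIC",
        )
        cur = conn.cursor()

        # External stage pointing at the S3 location the DMS tasks write to.
        cur.execute("""
            CREATE STAGE IF NOT EXISTS raw_dumps
            URL = 's3://example-dms-output/dumps/'
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        """)

        # Pipe that copies any newly staged file into the landing table.
        cur.execute("""
            CREATE PIPE IF NOT EXISTS raw_dumps_pipe AUTO_INGEST = TRUE AS
            COPY INTO landing_events FROM @raw_dumps
        """)
        conn.close()
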
  • Senior Data Engineer

    2017 - 2017
    Androdon Capital, Inc.
    • Collected 20 years of currency trading data and inserted it into a PostgreSQL database.
    • Created a custom Python ETL to aggregate raw financial data into currency trading datasets (the aggregation step is sketched below).
    • Developed a high-frequency trading algorithm using C#.
    Technologies: Python, C#, MySQL, PostgreSQL
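
    A minimal sketch of the aggregation step, assuming pandas and a hypothetical CSV of tick-level quotes; the real ETL's sources and bar intervals may have differed:

        import pandas as pd

        # Tick-level quotes with bid/ask columns; file and column names are
        # hypothetical stand-ins for the raw feeds.
        ticks = pd.read_csv(
            "eurusd_ticks.csv",
            parse_dates=["timestamp"],
            index_col="timestamp",
        )

        # Aggregate the mid price into one-minute open/high/low/close bars.
        ticks["mid"] = (ticks["bid"] + ticks["ask"]) / 2
        bars = ticks["mid"].resample("1min").ohlc().dropna()

        bars.to_csv("eurusd_1min.csv")  # staged here, then loaded into PostgreSQL
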
  • Co-founder

    2013 - 2017
    Q2 Metrics
    • Created a physician influence tracking tool using over 1.2 billion records from both government and private sources. The transactional side of the application was powered by a cluster of five MySQL servers.
    • Optimized the MySQL cluster to ingest over one billion records and serve sub-second queries with over 1,000 concurrent connections (the client-side pooling pattern is sketched below).
    • Managed a high-performing team of full-stack developers.
    • Negotiated the sale of the company to a multibillion-dollar conglomerate.
    Technologies: PostgreSQL, Oracle, MongoDB, MySQL, PHP
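
    A minimal sketch of the client-side pattern behind those numbers: a connection pool feeding indexed lookups. Pool size, endpoint, and schema are hypothetical; assumes mysql-connector-python:

        import os
        from mysql.connector.pooling import MySQLConnectionPool

        pool = MySQLConnectionPool(
            pool_name="pi_pool",
            pool_size=32,                   # per app server; capped by the connector
            host="mysql-cluster.internal",  # hypothetical endpoint
            database="physician_influence",
            user="app",
            password=os.environ.get("DB_PASSWORD", ""),
        )

        def shared_patients(npi: str):
            """Top peers by shared-patient count for one physician (hypothetical schema)."""
            conn = pool.get_connection()
            try:
                cur = conn.cursor()
                # Served by a composite index on (physician_npi, shared_count),
                # which is what keeps lookups sub-second at a billion rows.
                cur.execute(
                    "SELECT peer_npi, shared_count FROM shared_patients "
                    "WHERE physician_npi = %s ORDER BY shared_count DESC LIMIT 50",
                    (npi,),
                )
                return cur.fetchall()
            finally:
                conn.close()  # returns the connection to the pool, not the server
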
  • Clinical Applications Developer

    2008 - 2013
    Queen's University
    • Built the first online patient randomization system, a revolutionary way to enroll patients into blinded randomized controlled trials. The system was powered by our on-premises three-node MySQL cluster.
    • Developed an open-source longitudinal inpatient electronic data capture system in conjunction with Vanderbilt University.
    • Constructed a custom electronic data collection system for the largest nutrition-based randomized inpatient clinical trial.
    • Performed sysadmin tasks on a cluster of Unix servers running MySQL, including patching, troubleshooting slow queries, and ensuring uptime.
    Technologies: Unix, PostgreSQL, MySQL, PHP
  • Oracle Developer

    2007 - 2008
    Canadian Cancer Institute of Canada
    • Developed Oracle Forms to collect and analyze clinical trial data.
    • Implemented a proprietary EDC system called Medidata Rave.
    • Migrated the past clinical trial data from a legacy system into a new system.
    Technologies: C#, Oracle

Experience

  • Physician Influence Tool
    http://www.q2metrics.com/pi

    A custom online tool that tracked how each physician in the US was performing in their field compared to their peers.
    The tool tracked:
    • Number of procedures performed
    • Billing amounts
    • Published articles
    • Shared patients
    • Social media feeds
    • Conference speaking engagements
    The goal of the tool was to show clients which physicians were influencers so that they could pitch their products to them.
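
    The production scoring model wasn't published; purely as an illustration, here is one way normalized metrics like these could be combined into a single influence score, with hypothetical weights and field names:

        METRICS = {                 # metric -> weight (hypothetical)
            "procedures": 0.25,
            "billables": 0.20,
            "publications": 0.20,
            "shared_patients": 0.20,
            "social_reach": 0.10,
            "conference_talks": 0.05,
        }

        def influence_score(physician: dict, peer_max: dict) -> float:
            """Score in [0, 1]: each metric is normalized against the peer maximum."""
            score = 0.0
            for metric, weight in METRICS.items():
                if peer_max.get(metric):
                    score += weight * physician.get(metric, 0) / peer_max[metric]
            return score

        # A physician strong on procedures and shared patients but quiet on
        # social media still scores competitively.
        print(influence_score(
            {"procedures": 300, "publications": 12, "shared_patients": 80},
            {"procedures": 500, "billables": 1_000_000, "publications": 40,
             "shared_patients": 200, "social_reach": 10_000, "conference_talks": 10},
        ))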

  • Data Warehouse and Pipeline for the US Olympic Committee

    The high-performance division of the US Olympic Committee asked my team and me to develop an internal tool to collect and analyze athletes' performance data and predict possible injuries.

    We collected wearable and sensor data, questionnaire responses, and medical records and combined them into a data warehouse.
    The orchestration was done using Airflow, and the warehouse was built on Redshift.
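
    A minimal sketch of the Redshift load step, assuming psycopg2; the cluster endpoint, table, S3 path, and IAM role are placeholders, and in production this statement would run inside an Airflow task:

        import os
        import psycopg2

        conn = psycopg2.connect(
            host="warehouse.example.redshift.amazonaws.com",  # hypothetical cluster
            port=5439,
            dbname="analytics",
            user="loader",
            password=os.environ.get("REDSHIFT_PASSWORD", ""),
        )
        conn.autocommit = True
        with conn.cursor() as cur:
            # COPY pulls the day's Parquet extracts straight from S3 into the
            # fact table; the IAM role ARN below is a placeholder.
            cur.execute("""
                COPY wearable_readings
                FROM 's3://example-athlete-data/wearables/2020-01-15/'
                IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-loader'
                FORMAT AS PARQUET
            """)
        conn.close()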

  • High-frequency Trading Algorithm

    I developed a high-frequency trading algorithm for a boutique financial firm in Toronto. The algorithm would look at current pricing and trends of different currency pairs and would execute trades based on a predictive model.
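
    The production algorithm was written in C# and its model was proprietary; as a toy Python illustration of the idea, here is a simple moving-average crossover standing in for the predictive model:

        from collections import deque

        class CrossoverSignal:
            """Fast/slow moving-average crossover on a currency pair's price."""

            def __init__(self, fast: int = 5, slow: int = 20):
                self.fast_win = deque(maxlen=fast)
                self.slow_win = deque(maxlen=slow)

            def update(self, price: float) -> str:
                self.fast_win.append(price)
                self.slow_win.append(price)
                if len(self.slow_win) < self.slow_win.maxlen:
                    return "hold"  # not enough history yet
                fast = sum(self.fast_win) / len(self.fast_win)
                slow = sum(self.slow_win) / len(self.slow_win)
                if fast > slow:
                    return "buy"
                if fast < slow:
                    return "sell"
                return "hold"

        signal = CrossoverSignal()
        for tick in [1.1010, 1.1012, 1.1009, 1.1015, 1.1020]:
            print(signal.update(tick))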

  • Central Randomization System
    https://ceru.hpcvl.queensu.ca/CRS/

    Conducting a randomized controlled trial across multiple centers in multiple geographical areas presents a unique set of challenges. Traditionally, sealed envelopes are used to randomize the next patient, but when multiple centers are involved, that strategy can't guarantee a balanced set of arms. That's where a central randomization system comes in.
    This online tool guarantees that patients from anywhere in the world are enrolled into a blind, double-blind, or balanced-arm study. It provides different access levels to allow for patient unblinding in case of a serious adverse event while maintaining the blinding of the rest of the study.
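
    The balanced-arm guarantee is typically achieved with permuted-block randomization; a minimal sketch of that idea with a hypothetical two-arm trial (the production system layered access control and unblinding on top):

        import random

        ARMS = ["treatment", "control"]  # hypothetical two-arm trial
        BLOCK_SIZE = 4

        class BlockRandomizer:
            """Within every block of 4, each arm appears exactly twice, so the
            arms can never drift far out of balance regardless of how
            enrollment is spread across sites."""

            def __init__(self) -> None:
                self.block: list[str] = []

            def next_assignment(self) -> str:
                if not self.block:
                    # Build a fresh block with equal counts of each arm, shuffled.
                    self.block = ARMS * (BLOCK_SIZE // len(ARMS))
                    random.shuffle(self.block)
                return self.block.pop()

        randomizer = BlockRandomizer()
        print([randomizer.next_assignment() for _ in range(8)])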

  • Google Cloud Badges
    https://google.qwiklabs.com/public_profiles/8a94e88a-e3bb-4578-9036-de3a9881d308

    Here are the Google Qwiklabs badges and quests that I've obtained thus far.
    • BigQuery Basics for Data Analysts (Earned Jun 15, 2020)
    • Baseline: Infrastructure (Earned May 15, 2020)
    • Google Cloud Essentials (Earned May 13, 2020)

Skills

  • Languages

    SQL, Python, PHP, Snowflake, C#
  • Tools

    BigQuery, Apache Airflow, Microsoft Power BI, Sisense
  • Paradigms

    ETL, Data Science, DevOps
  • Platforms

    Amazon Web Services (AWS), Jupyter Notebook, Linux, Oracle, Unix, Google Cloud Platform (GCP)
  • Storage

    Redshift, PostgreSQL, MySQL, Databases, NoSQL, Data Lakes, MongoDB, Google Cloud, Amazon Aurora, Data Lake Design
  • Industry Expertise

    Healthcare
  • Other

    AWS, Data Warehousing, Data Modeling, Data Engineering, Fintech, Finance, Education, Big Data, Data Visualization, Metabase, Data Warehouse Design, Data Analytics, Dashboards, Dashboard Development, Lecturing, Higher Education, Relational Database Services (RDS), Web Marketing, SEO Tools, Ads, Google BigQuery, AWS DMS
  • Libraries/APIs

    D3.js
  • Frameworks

    Ruby on Rails (RoR), Flask, Laravel, Hadoop

Education

  • Postgraduate Certificate in Digital Marketing
    2021 - 2022
    Power Business School - Madrid, Spain
  • Master of Business Administration (MBA) Degree in Business
    2020 - 2020
    PowerMBA - Madrid, Spain
  • Continuing Education Diploma in Big Data Analytics
    2015 - 2015
    MIT xPRO - Online
  • Continuing Education Diploma in Big Data Analytics
    2014 - 2014
    Caltech via Coursera - Online
  • Advanced Diploma (Graduated with Distinction) in Computer Programmer Analyst
    2006 - 2008
    St. Lawrence College - Kingston, Ontario, Canada
  • Bachelor of Science Degree in Computer Engineering
    2002 - 2006
    URBE University - Maracaibo, Venezuela

Certifications

  • AWS Associate Engineer
    NOVEMBER 2019 - NOVEMBER 2024
    AWS
  • Big Data Fundamentals
    JUNE 2015 - PRESENT
    IBM
  • MongoDB Certified DBA Associate
    FEBRUARY 2015 - PRESENT
    MongoDB University
