Hassan Bin Zaheer, Developer in Melbourne, Australia

Hassan Bin Zaheer

Verified Expert in Engineering

Bio

Hassan is a professionally qualified developer with over ten years of industry experience spanning software engineering, data architecture, data warehousing, ETL/ELT, feature engineering, database optimization, business intelligence, and consulting. He works with a wide range of tools and technologies, including AWS, Snowflake, SQL databases, Python, Apache tools such as Airflow, NiFi, and Spark, Docker, and GitLab. With his experience and determination, Hassan will be a great asset to any team.

Portfolio

ARQ Group
Amazon Web Services (AWS), Apache Airflow, AWS Glue, Apache NiFi...
Afiniti
Python, SQL, Greenplum, PostgreSQL, Docker, R, ETL, ELT, Data Architecture...
Freelance
Python, PostgreSQL, REST APIs, ETL, Data Warehousing, Data Marts, SQL...

Experience

  • Python - 10 years
  • SQL - 10 years
  • Data Engineering - 6 years
  • Amazon Web Services (AWS) - 3 years
  • Apache Airflow - 3 years
  • Data Build Tool (dbt) - 2 years
  • Snowflake - 2 years
  • Apache NiFi - 1 year

Availability

Part-time

Preferred Environment

Snowflake, Data Build Tool (dbt), Apache Airflow, Apache NiFi, Spark, Python, SQL, Data Engineering, Amazon Web Services (AWS), Amazon RDS

The most amazing...

...project I've built is a dbt-based data standardization and transformation platform that generates standardized data models from JSON recipes.

Work Experience

Managing Consultant of Data Engineering

2022 - PRESENT
ARQ Group
  • Built scalable, highly performant data systems, pipelines, and infrastructure that turn raw data from diverse sources into clear business insights; a simplified orchestration sketch follows this role's technology list.
  • Implemented and maintained data architectures built around automated ingestion, data security, compliance, and governance.
  • Designed batch and real-time analytical solutions and developed the prototype proofs of concept.
Technologies: Amazon Web Services (AWS), Apache Airflow, AWS Glue, Apache NiFi, AWS Step Functions, Business Intelligence (BI), Snowflake, Amazon DynamoDB, Redshift, Amazon S3 (AWS S3), AWS Lambda, AWS CloudFormation, Azure DevOps, Python, Data Warehousing, Data Warehouse Design, Query Optimization, Amazon RDS, Databricks, CI/CD Pipelines, Data Migration
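
The sketch below is a minimal illustration of the kind of batch orchestration described in this role, not the actual ARQ pipelines; the Glue job name, DAG ID, region, and dbt paths are hypothetical. It triggers an AWS Glue ingestion job, waits for it to finish, and then runs dbt transformations against the warehouse.

# Minimal Airflow 2.x DAG sketch: trigger a hypothetical AWS Glue ingestion job,
# poll until it completes, then run downstream dbt transformations.
import time
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def run_glue_ingestion(**_):
    # Start the hypothetical "raw_sales_ingest" Glue job and poll until it finishes.
    glue = boto3.client("glue", region_name="ap-southeast-2")
    run_id = glue.start_job_run(JobName="raw_sales_ingest")["JobRunId"]
    while True:
        state = glue.get_job_run(JobName="raw_sales_ingest", RunId=run_id)["JobRun"]["JobRunState"]
        if state == "SUCCEEDED":
            return run_id
        if state in ("FAILED", "ERROR", "STOPPED", "TIMEOUT"):
            raise RuntimeError(f"Glue job ended in state {state}")
        time.sleep(30)


with DAG(
    dag_id="raw_to_snowflake_daily",   # hypothetical DAG name
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="glue_ingest_raw_sales", python_callable=run_glue_ingestion)

    # Downstream modelling: run dbt once the raw layer has landed.
    transform = BashOperator(
        task_id="dbt_build_marts",
        bash_command="dbt build --project-dir /opt/dbt --target prod",
    )

    ingest >> transform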

Data Engineering Team Lead

2017 - 2022
Afiniti
  • Led, trained, and mentored a team of 18 data engineers and analysts through the design and development of data pipelines, data integration, and preparation of insights and visualizations.
  • Collaborated with analytics and data science teams to create predictive modeling strategies, engineer better data features, and build automation tools, driving efficiency gains and process improvements.
  • Implemented and monitored data flows from disparate sources, such as APIs, databases, cloud storage, and files, and created snapshots of collected facts.
  • Maintained and evaluated data quality and provided consistent, correct data to internal teams and systems; a simplified example of this style of check follows this role's technology list.
  • Set up test environments for evaluating and benchmarking new tools and technologies, developed proofs of concept, and wrote unit tests.
Technologies: Python, SQL, Greenplum, PostgreSQL, Docker, R, ETL, ELT, Data Architecture, Data Warehousing, Business Intelligence (BI), Database Optimization, Data Engineering, Amazon Web Services (AWS), Business Intelligence (BI) Platforms, Azure, Azure SQL Databases, Google Cloud Platform (GCP), ETL Tools, Apache Airflow, Data Management, AWS Glue, AWS Step Functions, Snowflake, Redshift, Data Warehouse Design, Query Optimization, MySQL, Amazon RDS
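
As a rough illustration of the data-quality checks mentioned above, the sketch below validates a pandas DataFrame before it is published downstream; the table name, columns, and thresholds are hypothetical, not Afiniti's actual rules.

# Sketch of lightweight data-quality checks run before publishing a dataset downstream.
# The "customers" table, its columns, and the 5% threshold are hypothetical.
import pandas as pd


def validate_customers(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data-quality failures (an empty list means 'pass')."""
    failures = []

    if df.empty:
        failures.append("customers: no rows ingested")
        return failures

    # Primary-key integrity: customer_id must be non-null and unique.
    if df["customer_id"].isna().any():
        failures.append("customers.customer_id contains nulls")
    if df["customer_id"].duplicated().any():
        failures.append("customers.customer_id contains duplicates")

    # Completeness: allow at most 5% missing email addresses.
    null_rate = df["email"].isna().mean()
    if null_rate > 0.05:
        failures.append(f"customers.email null rate {null_rate:.1%} exceeds 5% threshold")

    return failures


if __name__ == "__main__":
    sample = pd.DataFrame({"customer_id": [1, 2, 2], "email": ["a@x.com", None, "c@x.com"]})
    for problem in validate_customers(sample):
        print("DQ FAILURE:", problem)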

Data Software Engineer

2014 - 2017
Freelance
  • Extracted data from shipping providers, such as FedEx, UPS, and USPS, through RESTful APIs and loaded it into AWS for warehousing and reporting.
  • Optimized PostgreSQL database and functions, improving application performance by up to 50%.
  • Prepared and maintained customer relationship management (CRM) data in type 2 slowly changing dimension (SCD) form; a simplified sketch of this pattern follows this role's technology list.
Technologies: Python, PostgreSQL, REST APIs, ETL, Data Warehousing, Data Marts, SQL, Database Optimization, Data Engineering, Amazon Web Services (AWS), ETL Tools, Data Management, AWS Glue, Data Warehouse Design, Query Optimization, MySQL, Amazon RDS, T-SQL (Transact-SQL)
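
The sketch below is a simplified, pandas-based version of the type 2 SCD handling referred to above, with hypothetical CRM columns; production pipelines would typically express the same logic as a SQL MERGE in the warehouse. When a tracked attribute changes, the current row is closed out and a new current row is appended, so history is preserved.

# Simplified type 2 SCD merge in pandas; the column names are hypothetical CRM fields.
from datetime import date

import pandas as pd

TRACKED = ["email", "segment"]      # attributes whose changes create a new version
HIGH_DATE = date(9999, 12, 31)      # open-ended "valid_to" for current rows


def scd2_merge(dim: pd.DataFrame, incoming: pd.DataFrame, today: date) -> pd.DataFrame:
    """Apply today's customer snapshot (incoming) to the dimension table (dim)."""
    dim = dim.copy()
    current = dim[dim["is_current"]]
    merged = incoming.merge(
        current, on="customer_id", how="left", suffixes=("", "_old"), indicator=True
    )

    # Existing customers whose tracked attributes changed, plus brand-new customers.
    changed = merged["_merge"].eq("both") & pd.concat(
        [merged[c].ne(merged[f"{c}_old"]) for c in TRACKED], axis=1
    ).any(axis=1)
    changed_keys = merged.loc[changed, "customer_id"]
    new_keys = merged.loc[merged["_merge"].eq("left_only"), "customer_id"]

    # Close out the superseded current rows.
    to_close = dim["is_current"] & dim["customer_id"].isin(changed_keys)
    dim.loc[to_close, ["valid_to", "is_current"]] = [today, False]

    # Append new current versions for changed and brand-new customers.
    inserts = incoming[incoming["customer_id"].isin(pd.concat([new_keys, changed_keys]))].copy()
    inserts["valid_from"], inserts["valid_to"], inserts["is_current"] = today, HIGH_DATE, True

    return pd.concat([dim, inserts], ignore_index=True)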

Software Engineer

2012 - 2015
TRG Pvt Limited
  • Led a team of six software engineers through the development and implementation of Odoo in 50 textile factories across Pakistan, a project worth $2 million and sponsored by Chemonics International through USAID.
  • Developed requirement-specific software and tools, ranging from enterprise, web, and iOS mobile applications to open-source ERP modules and APIs.
  • Created and optimized reporting scripts in PostgreSQL.
  • Leveraged Selenium, Java, and Python to implement automated testing mechanisms, reducing post-production costs by 40%; a toy example follows this role's technology list.
Technologies: Python, PostgreSQL, Business Intelligence (BI) Platforms, Query Optimization, MySQL
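
The toy test below illustrates the Selenium-based checks referred to above; the URL, element IDs, and credentials are hypothetical. It drives a browser through a login form and asserts that the dashboard loads.

# Toy Selenium smoke test; the ERP URL, element IDs, and credentials are made up.
from selenium import webdriver
from selenium.webdriver.common.by import By


def test_login_smoke():
    driver = webdriver.Chrome()
    try:
        driver.get("https://erp.example.com/login")              # hypothetical login page
        driver.find_element(By.ID, "username").send_keys("qa_user")
        driver.find_element(By.ID, "password").send_keys("qa_password")
        driver.find_element(By.ID, "login-button").click()
        # The post-login page should expose the dashboard header.
        assert "Dashboard" in driver.find_element(By.TAG_NAME, "h1").text
    finally:
        driver.quit()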

Projects

Autopie

This dbt-based application converts raw data from multiple industry domains into standardized models for easy use in descriptive and predictive analysis and reporting. The application takes column mappings from a UI, converts them into JSON recipes, and consumes them in dbt macros to generate standardized datasets.
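
The sketch below shows the core recipe idea in Python with a made-up recipe; in Autopie itself the rendering happens inside dbt macros with Jinja, but the step is the same: read the column mappings from a JSON recipe and render a standardized SELECT over the raw source.

# Sketch of turning a JSON "recipe" of column mappings into a standardized SQL model.
# The recipe, source, and column names below are made up for illustration.
import json

recipe_json = """
{
  "source": "raw.shop_orders",
  "target": "std_orders",
  "columns": {
    "order_id":    "id",
    "customer_id": "cust_ref",
    "order_ts":    "created_at",
    "amount":      "total_price"
  }
}
"""


def render_standardized_model(recipe: dict) -> str:
    """Render a SELECT that exposes the raw columns under standardized names."""
    select_list = ",\n    ".join(
        f"{raw_col} AS {std_col}" for std_col, raw_col in recipe["columns"].items()
    )
    return f"-- model: {recipe['target']}\nselect\n    {select_list}\nfrom {recipe['source']}"


if __name__ == "__main__":
    print(render_standardized_model(json.loads(recipe_json)))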

Unified Data Store

A three-layer data warehouse based on the Data Vault 2.0 model, with ingestion, consolidation, and consumption layers. Data is ingested into Snowflake from various sources (RDBMSs, CSV and JSON files, and REST and SOAP APIs) using AWS Glue, and dbt handles consolidation and consumption. AWS Lambda, Amazon CloudFront, DynamoDB, and S3 are some of the other AWS services used in the project. Apache Airflow handles workflow orchestration, and Azure DevOps is used for CI/CD.
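
A core Data Vault 2.0 building block in a warehouse like this is deterministic hash keys for hubs and hash diffs for satellites; the short sketch below shows the general idea, using hypothetical business keys and attributes rather than the project's actual entities.

# Sketch of Data Vault 2.0-style hash keys and hash diffs; the values are hypothetical.
# Hubs get a hash of the business key; satellites get a hash diff over descriptive
# attributes so changed records can be detected cheaply on load.
import hashlib


def dv_hash(*values: str) -> str:
    """MD5 over trimmed, upper-cased values joined with a delimiter (a common DV 2.0 convention)."""
    normalized = "||".join((v or "").strip().upper() for v in values)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()


# Hub key for a customer, derived from its business key.
customer_hk = dv_hash("CUST-00042")

# Hash diff over the satellite's descriptive attributes (name, email, segment).
customer_hashdiff = dv_hash("Jane Citizen", "jane@example.com", "retail")

print(customer_hk, customer_hashdiff)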

Education

2016 - 2016

Master's Degree in Engineering Management

University of Melbourne - Melbourne, Australia

2007 - 2011

Bachelor's Degree in Computer Science

Lahore University of Management Sciences - Lahore, Pakistan

Certifications

NOVEMBER 2023 - NOVEMBER 2025

SnowPro Core Certification

Snowflake

MAY 2023 - PRESENT

Microsoft Certified: Azure Fundamentals

Microsoft

APRIL 2023 - PRESENT

AWS Certified Solutions Architect – Associate

Amazon Web Services

DECEMBER 2022 - DECEMBER 2024

Databricks Certified Data Engineer Associate

Databricks

DECEMBER 2022 - DECEMBER 2024

Databricks Certified Associate Developer for Apache Spark 3.0

Databricks

JULY 2022 - JULY 2025

AWS Certified Cloud Practitioner

Amazon Web Services

Libraries/APIs

REST APIs

Tools

Apache Airflow, AWS Glue, Apache NiFi, GitLab, AWS Step Functions, AWS CloudFormation, AWS SDK, Boto 3

Languages

Snowflake, Python, SQL, R, T-SQL (Transact-SQL)

Paradigms

ETL, Azure DevOps, Business Intelligence (BI), REST

Platforms

Amazon Web Services (AWS), Databricks, Docker, Azure, Google Cloud Platform (GCP), AWS Lambda

Storage

PostgreSQL, MySQL, Greenplum, JSON, Redshift, Azure SQL Databases, Amazon DynamoDB, Amazon S3 (AWS S3), RDBMS, Data Lakes

Frameworks

Spark, Jinja, Apache Spark

Other

Software Engineering, Data Build Tool (dbt), ELT, Data Warehousing, Database Optimization, Data Marts, Data Engineering, ETL Tools, Data Management, Query Optimization, Data Architecture, Business Intelligence (BI) Platforms, Data Warehouse Design, Amazon RDS, CI/CD Pipelines, Data Migration, SOAP, Data Vaults, Delta Lake, Microsoft Azure
