
Hasitha Ranawaka

Verified Expert in Engineering

Data Engineer and Software Developer

Kandy, Central Province, Sri Lanka

Toptal member since September 5, 2022

Bio

Hasitha aspires to be a leader in architecting data-driven solutions for businesses across the globe. He is a multi-skilled engineer with expertise in data engineering, cloud-native development, and analytics. As a data engineer, he is experienced in architecting highly scalable data platforms. As a certified cloud architect, he is skilled in implementing state-of-the-art cloud solutions. As an engineer by training, he also understands the technical details of different manufacturing processes.

Portfolio

evTerra Recycling, LLC
Microsoft Power BI, REST APIs, API Design, ClickUp...
Braven, Inc.
BigQuery, Data Engineering, Apache Airflow, Tableau, Data Warehousing...
Braven, Inc.
BigQuery, Data Engineering, Apache Airflow, Tableau, Amazon S3 (AWS S3)...

Experience

  • Amazon Web Services (AWS) - 5 years
  • SQL - 5 years
  • Data Warehousing - 5 years
  • Data Engineering - 5 years
  • Microsoft Power BI - 5 years
  • Python - 5 years
  • ETL Tools - 4 years
  • Data Lakes - 4 years

Availability

Full-time

Preferred Environment

Amazon Web Services (AWS), Python, Visual Studio Code (VS Code), SQL, Apache Airflow

The most amazing...

...thing I've built is a data lake solution on Oracle Cloud Infrastructure that integrates seven enterprise resource planning systems across seven countries.

Work Experience

Microsoft Power BI Expert

2023 - PRESENT
evTerra Recycling, LLC
  • Developed and maintained Microsoft Power BI dashboards covering the reporting needs of the procurement, inventory, sales, and manufacturing functions.
  • Implemented cost-effective, scalable data pipelines to extract and transform data from REST APIs, eliminating manual data preparation (a sketch follows below).
  • Collaborated with ERP developers to implement business requirements in the ERP, enabling better reporting and data use.
  • Initiated a project to migrate workloads from Power BI to Microsoft Fabric.
Technologies: Microsoft Power BI, REST APIs, API Design, ClickUp, Enterprise Resource Planning (ERP), Excel 365, Microsoft 365, Business Analysis, Microsoft Fabric, Workflow, API Integration, Azure, Reporting, Integration, Microsoft PowerPoint, Data Cleansing
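
The API extraction layer behind these pipelines can be summarized with a minimal sketch like the one below, assuming a paginated JSON endpoint that returns an array per page and a flat-file handoff that Power BI refreshes from; the URL, token handling, and column names are hypothetical placeholders rather than the actual ERP API.

```python
"""Minimal sketch of a REST API extraction step for a BI refresh.
The endpoint, token, and field names are hypothetical placeholders."""
import requests
import pandas as pd

BASE_URL = "https://erp.example.com/api/v1/purchase_orders"  # hypothetical endpoint
TOKEN = "..."  # supplied via a secret store in practice

def fetch_all(url: str, token: str) -> list[dict]:
    """Walk a page-numbered API until it returns an empty page."""
    rows, page = [], 1
    while True:
        resp = requests.get(
            url,
            params={"page": page},
            headers={"Authorization": f"Bearer {token}"},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json()  # assumed to be a JSON array of records
        if not batch:
            break
        rows.extend(batch)
        page += 1
    return rows

if __name__ == "__main__":
    df = pd.DataFrame(fetch_all(BASE_URL, TOKEN))
    # Light cleanup before the BI layer picks the file up.
    df = df.drop_duplicates(subset="po_number")  # hypothetical key column
    df.to_csv("purchase_orders.csv", index=False)
```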

Airflow and BigQuery Data Engineer

2023 - PRESENT
Braven, Inc.
  • Implemented Airflow ETL pipelines to support organization-wide analytics workloads (a minimal DAG sketch follows below).
  • Initiated and managed the dashboard migration from Periscope to Tableau, cutting weeks of data preparation and cleaning work.
  • Built core data models for OLAP workloads on BigQuery to populate Tableau dashboards.
  • Maintained the Astronomer-hosted Airflow environment to keep production workloads running without interruption.
  • Implemented a tailor-made disaster recovery system for the PostgreSQL database with a recovery point objective (RPO) of 30 minutes and a recovery time objective (RTO) of 15 minutes.
  • Enhanced the data reporting layer through better data visualization and optimized business logic implementation.
Technologies: BigQuery, Data Engineering, Apache Airflow, Tableau, Data Warehousing, Data Warehouse Design, Google Cloud Platform (GCP), ETL, Salesforce, Salesforce API, Enterprise Resource Planning (ERP), REST APIs, API Design, Data Modeling, Ruby, Data Migration, Business Process Automation, Workflow, API Integration, Reporting, Google BigQuery, Asana, Integration, Scraping, Data Cleansing
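
A minimal sketch of one of these Airflow-to-BigQuery loads, assuming Airflow 2.4+ with the TaskFlow API and the google-cloud-bigquery client; the dataset, table, and source data shown are hypothetical placeholders, not the actual Braven pipeline.

```python
"""Minimal Airflow DAG sketch: extract source rows, load them to BigQuery.
Project, dataset, and table names are hypothetical placeholders."""
from datetime import datetime

import pandas as pd
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_analytics_load():
    @task
    def extract() -> list[dict]:
        # In the real pipeline this would call a source API or database.
        return [{"student_id": 1, "status": "enrolled"}]

    @task
    def load_to_bigquery(rows: list[dict]) -> None:
        # Imported inside the task so only the worker needs the client installed.
        from google.cloud import bigquery

        client = bigquery.Client()
        df = pd.DataFrame(rows)
        client.load_table_from_dataframe(
            df, "example-project.core.enrollments"  # hypothetical destination table
        ).result()  # wait for the load job to finish

    load_to_bigquery(extract())

example_analytics_load()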

Data Engineer

2024 - 2024
Braven, Inc.
  • Contributed to the organization's goal of data maturity by building maintainable, meaningful data models and data pipelines to support them.
  • Enhanced the existing Airflow data pipelines to load data efficiently and implemented data quality checks when required.
  • Implemented Airflow automation to pull, clean, and load student registration data from AWS S3 to Salesforce, replacing the manual process and saving engineering hours weekly (sketched below).
  • Improved the security of the Airflow environment by securing data connection credentials using the Airflow secrets backend.
  • Identified technical debt in the data infrastructure and implemented fixes to future-proof the data ecosystem.
Technologies: BigQuery, Data Engineering, Apache Airflow, Tableau, Amazon S3 (AWS S3), Amazon Web Services (AWS), Boto3, Salesforce, Salesforce API, Data Modeling, Data Migration, Workflow, Database Schema Design, Relational Database Design, Reporting, Google BigQuery, Asana, Integration, Scraping, Data Cleansing
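
The S3-to-Salesforce automation can be sketched roughly as below, assuming boto3 for the S3 read and the simple-salesforce client for the upsert; the bucket, object key, Salesforce object, column mapping, and external ID field are hypothetical placeholders.

```python
"""Sketch of an S3-to-Salesforce registration load. Bucket, key, field
names, and the external ID field are hypothetical placeholders."""
import boto3
import pandas as pd
from simple_salesforce import Salesforce

def load_registrations(bucket: str, key: str, sf: Salesforce) -> int:
    # Pull the raw registration file from S3.
    s3 = boto3.client("s3")
    raw = s3.get_object(Bucket=bucket, Key=key)["Body"]
    df = pd.read_csv(raw)

    # Map CSV columns to Salesforce field names (hypothetical mapping),
    # then apply basic cleaning before the upsert.
    df = df.rename(columns={"registration_id": "Registration_Id__c",
                            "email": "Email"})
    df = df.dropna(subset=["Email"])
    df["Email"] = df["Email"].str.strip().str.lower()

    # Upsert into Salesforce keyed on a hypothetical external ID field.
    records = df.to_dict(orient="records")
    sf.bulk.Contact.upsert(records, "Registration_Id__c")
    return len(records)

if __name__ == "__main__":
    sf = Salesforce(username="svc@example.org", password="...",
                    security_token="...")  # credentials would come from a secrets backend
    load_registrations("example-registrations", "2024/09/registrations.csv", sf)
```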

AWS Solutions Architect Consultant

2023 - 2024
eSolution
  • Architected the cloud migration of an Oracle-based on-premises disaster recovery environment to the AWS Cloud, eliminating the risk of operational interruption due to a data center shutdown.
  • Implemented an Oracle Cloud Infrastructure (OCI) to Amazon RDS for PostgreSQL database migration using AWS Database Migration Service (DMS), as sketched below.
  • Consulted on implementing a custom high-availability RDS for Oracle database through logical database replication using AWS DMS.
Technologies: Amazon Web Services (AWS), Oracle, PostgreSQL, Oracle RDS, Amazon EC2, Oracle Cloud Infrastructure (OCI), AWS Database Migration Service (DMS), Data Migration, Business Process Automation, Integration, Data Cleansing
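
The DMS portion of this migration can be illustrated with a hedged boto3 sketch, assuming the replication instance and the source and target endpoints already exist; all ARNs, identifiers, and the schema selection rule below are placeholders.

```python
"""Hedged boto3 sketch of creating a DMS replication task for an
OCI-to-RDS migration. All ARNs and identifiers are placeholders."""
import json
import boto3

dms = boto3.client("dms", region_name="us-east-1")

# Select every table in a hypothetical APP schema.
table_mappings = {
    "rules": [{
        "rule-type": "selection",
        "rule-id": "1",
        "rule-name": "include-app-schema",
        "object-locator": {"schema-name": "APP", "table-name": "%"},
        "rule-action": "include",
    }]
}

response = dms.create_replication_task(
    ReplicationTaskIdentifier="oci-to-rds-postgres",       # placeholder
    SourceEndpointArn="arn:aws:dms:...:endpoint:SOURCE",    # placeholder ARN
    TargetEndpointArn="arn:aws:dms:...:endpoint:TARGET",    # placeholder ARN
    ReplicationInstanceArn="arn:aws:dms:...:rep:INSTANCE",  # placeholder ARN
    MigrationType="full-load-and-cdc",  # initial load plus ongoing change data capture
    TableMappings=json.dumps(table_mappings),
)
print(response["ReplicationTask"]["Status"])
```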

Specialist of Data Engineering and Data Science

2021 - 2023
Stretchline Holdings
  • Implemented a data lake solution with Oracle Cloud Infrastructure by integrating seven enterprise resource planning (ERP) systems in seven countries.
  • Developed highly available and scalable ETLs to move data from the source ERP systems to the data lake using Oracle Data Integrator.
  • Automated the organizational financial reporting process by utilizing the data lake, reducing effort and improving accuracy.
  • Created insight-generation dashboards for management in the finance, sales, and manufacturing functions.
  • Collaborated with different stakeholders to bring a data-driven decision-making culture to the organization.
  • Initiated a big data infrastructure implementation to integrate data sources outside the traditional ERP ecosystem.
Technologies: Data Engineering, Data Science, ETL Tools, Data Lakes, Data Warehousing, SQL, Python, Oracle Data Integrator (ODI), Oracle Cloud, Automation, Data Pipelines, ELT, Dimensional Modeling, ETL, Business Intelligence (BI), JDBC, Databases, R Programming, Digital Manufacturing, Excel 365, Cron, Data Warehouse Design, Oracle Database, Data Analytics, Big Data, Git, Power Query, Applied Mathematics, Database Design, Data Analysis, Data Architecture, Data Visualization, Automated Data Flows, Reports, SQL Stored Procedures, Stored Procedure, OLAP, Schemas, Jupyter Notebook, JSON, Oracle, Database Administration (DBA), Relational Databases, BI Reporting, Reporting, Microsoft Excel, Statistical Modeling, Data Transformation, Dashboard Development, Star Schema, Data Processing, Data Processing Automation, CSV, Scripting, Crystal Reports, SSRS Reports, SQL Server Reporting Services (SSRS), Data Queries, BigQuery, Macros, Excel 2016, XML for Analysis (XMLA), Microsoft Power BI, Dashboards, BI Reports, Pipelines, Apache Spark, Query Optimization, Partitioning, Data Mining, Big Data Architecture, Oracle SQL Developer, Oracle SQL Data Modeler, Dashboard Design, Web Analytics, XML, Microsoft SQL Server, Web Scraping, NoSQL, ODBC, Enterprise Resource Planning (ERP), REST APIs, API Design, Microsoft Word, Excel Macros, DAX, Microsoft 365, Business Analysis, Financial Modeling, Data Migration, Workflow, Database Schema Design, Relational Database Design, Algorithms, Outlook, Microsoft PowerPoint, Data Cleansing

Excel and Macros Developer

2022 - 2022
ORBEAT MANAGEMENT CORP
  • Developed an automated workflow to create contracts based on a given input set.
  • Wrote Google Apps Script code to read data from a Google Sheet and produce a contract document.
  • Implemented the solution in the client's environment.
Technologies: Microsoft Excel, Macros, Excel 2010, Excel 2016, Google Workspace, Google Apps Script, Automation, Google Sheets, APIs, REST APIs, Business Process Automation, Microsoft PowerPoint

Senior AI and ML Consultant

2021 - 2021
Averyx Group
  • Developed machine learning-based algorithmic trading strategies and portfolio management techniques.
  • Collaborated with the team to develop highly scalable machine learning solutions.
  • Maintained the code library with the Git version control system.
Technologies: Amazon EC2, Machine Learning, Reinforcement Learning, Deep Reinforcement Learning, Git, Databases, Excel 365, APIs, Data Analytics, Data Analysis, Amazon Web Services (AWS), Schemas, Jupyter Notebook, JSON, Relational Databases, CSV, Scripting, Data Queries, Microsoft Excel, Macros, Excel 2016, Statistics, Forecasting, Pipelines, Amazon RDS, AWS Cloud Architecture, Data Mining, Microsoft SQL Server, Web Scraping, REST APIs, API Design, Microsoft 365, Trading, Microsoft PowerPoint

Quantitative Analyst

2020 - 2021
Cairnhill Capital Management
  • Developed algorithms to build company stock portfolios based on layered financial and statistical filters (see the sketch below).
  • Automated the entire portfolio creation process, cutting lead time by two days.
  • Versioned and maintained the company's code library.
Technologies: MATLAB, SQL, Python, Automation, Cron, APIs, Dimensional Modeling, ETL, MySQL, Databases, R Programming, Excel 365, Data Warehouse Design, Data Analytics, Git, Business Intelligence (BI), Power Query, Applied Mathematics, Database Design, Data Analysis, Schemas, Jupyter Notebook, JSON, CSV, Scripting, Data Queries, Microsoft Excel, Macros, Forecasting, Data Mining, XML, Microsoft SQL Server, Microsoft 365, Microsoft PowerPoint
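
A simplified sketch of layered portfolio screening with pandas is shown below; the column names, thresholds, and scoring are illustrative assumptions, not the firm's actual filters.

```python
"""Simplified sketch of layered portfolio screening. Columns, thresholds,
and the scoring rule are illustrative assumptions."""
import pandas as pd

def build_portfolio(fundamentals: pd.DataFrame, top_n: int = 20) -> pd.DataFrame:
    """Apply financial filters first, then a statistical ranking filter."""
    screened = fundamentals[
        (fundamentals["pe_ratio"].between(0, 25))       # financial filter: reasonable valuation
        & (fundamentals["debt_to_equity"] < 1.0)         # financial filter: low leverage
        & (fundamentals["return_volatility"] < 0.40)     # statistical filter: volatility cap
    ].copy()

    # Rank survivors on a simple momentum score and keep the top N.
    screened["score"] = screened["12m_return"].rank(ascending=False)
    return screened.nsmallest(top_n, "score")

if __name__ == "__main__":
    universe = pd.read_csv("fundamentals.csv")  # hypothetical input file
    print(build_portfolio(universe)[["ticker", "score"]])
```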

Executive Supplier Performance Management

2020 - 2021
Brandix Apparel Limited
  • Liaised with stakeholders to introduce a system that eliminated material inspection, streamlining the process and delivering financial benefits.
  • Collaborated with teammates to establish an upstream risk evaluation process that forecasts quality and color failures before bulk materialization.
  • Automated supplier scorecards with Microsoft Excel VBA.
Technologies: Python, Microsoft Power BI, Excel 365, Excel VBA, Automation, Data Science, Data Analytics, Git, Business Intelligence (BI), Power Query, Applied Mathematics, Data Analysis, Data Visualization, Reports, CSV, Scripting, Microsoft Excel, Macros, Excel 2016, Dashboards, XML, SharePoint, ODBC, Microsoft Word, Excel Macros, Visual Basic, Visual Basic for Applications (VBA), Visual Basic 6 (VB6), DAX, Microsoft 365, Business Analysis, Microsoft PowerPoint

Experience

Project Cerebro | Data Lake and Data Warehouse Solution in Oracle Cloud

Stretchline Group aspires to be more data-driven and to establish a data-driven culture of insight generation and decision-making. This triggered the need for a data repository that consolidates all data silos into one platform.

Project Cerebro was initiated to develop a data lake solution that acts as the data repository for all data silos, providing the capability to perform descriptive, predictive, and prescriptive analysis. I worked as the data engineering and data science specialist driving both aspects of the project. Initially, all of the company's ERP systems across the globe were integrated into Oracle Cloud. Once the data was available in object storage, ETLs were developed with Oracle Data Integrator to clean and structure it according to various business use cases and load it into the data warehouse; these ETLs were designed to be highly scalable and available (an illustration of the warehouse load step follows). With the relevant data points in the warehouse, several Power BI dashboards were created to support management decision-making. Machine learning algorithms were also implemented on top of the object storage data to optimize and enhance production systems.
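
To illustrate the warehouse load step, the snippet below shows the kind of incremental merge an ODI mapping performs when moving cleansed extracts into a dimension table. ODI generates and orchestrates this logic declaratively; the hand-written version here, using the python-oracledb driver with hypothetical table, column, and connection names, is only a sketch.

```python
"""Illustration of an incremental dimension merge of the kind ODI generates.
Table, column, and connection details are hypothetical placeholders."""
import oracledb

MERGE_SQL = """
MERGE INTO dw.dim_customer d
USING stg.customer_extract s
   ON (d.customer_code = s.customer_code)
 WHEN MATCHED THEN UPDATE SET d.customer_name = s.customer_name,
                              d.country       = s.country
 WHEN NOT MATCHED THEN INSERT (customer_code, customer_name, country)
                       VALUES (s.customer_code, s.customer_name, s.country)
"""

# Hypothetical Autonomous Data Warehouse connection details.
with oracledb.connect(user="etl_user", password="...",
                      dsn="adw_example_high") as conn:
    with conn.cursor() as cur:
        cur.execute(MERGE_SQL)
    conn.commit()
```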

"Core Datasets" | A Single Source Truth for All Organizational Analytic Needs on GCP

Braven is a nonprofit organization in the USA that helps students achieve economic stability by guiding them as they start their careers. The organization needs data to ensure the quality of its university programs and requires many operational dashboards to support student engagement.

As part of the core datasets team, I collaborated in developing an industry-scale data warehouse on BigQuery, giving the entire organization access to data of the right quality. BigQuery acts as the central data repository and the single source of truth for all data needs. The core datasets are built in denormalized form to handle large-scale OLAP workloads. Utilizing these core data models, I created several Tableau dashboards that surface operational insights to a wider audience across the organization (a query sketch follows).
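
A minimal sketch of materializing and querying a denormalized core table with the google-cloud-bigquery client is shown below; the project, dataset, and column names are hypothetical placeholders for the actual core datasets.

```python
"""Minimal sketch of building a denormalized core table on BigQuery.
Project, dataset, and column names are hypothetical placeholders."""
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

# Denormalize event and student attributes into one OLAP-friendly table.
sql = """
CREATE OR REPLACE TABLE core.student_engagement AS
SELECT
  e.event_id,
  e.event_date,
  s.student_id,
  s.university,
  s.cohort,
  e.attendance_status
FROM raw.events AS e
JOIN raw.students AS s
  ON s.student_id = e.student_id
"""
client.query(sql).result()  # block until the job finishes

# Downstream tools such as Tableau read the core table directly.
rows = client.query(
    "SELECT cohort, COUNT(*) AS events "
    "FROM core.student_engagement GROUP BY cohort"
).result()
for row in rows:
    print(row.cohort, row.events)
```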

Data Integration and Analytics with Being Unlimited on AWS

Being Unlimited has initiated new business avenues that are expected to generate data through its newly launched websites.

As the data architect, I designed and implemented the data integration and analytics platform on AWS. It receives data from multiple website back ends via Secure File Transfer Protocol (SFTP) and Java Database Connectivity (JDBC) connections and stores the raw data in Amazon S3 buckets. An Amazon EMR cluster then consumes the data with PySpark to perform the required analytics and pushes the analytics-ready data to Redshift. Finally, the transformed data is visualized in Amazon QuickSight to serve various business intelligence needs (a PySpark sketch follows).
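
A simplified PySpark sketch of the curation step is shown below, assuming the job runs on EMR with S3 access configured and that Redshift is loaded from the curated S3 prefix (for example, via a COPY command); bucket names and columns are hypothetical.

```python
"""Simplified PySpark curation step: raw S3 exports in, analytics-ready
Parquet out. Bucket names and columns are hypothetical placeholders."""
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("web-orders-curation").getOrCreate()

# Raw website exports landed by the SFTP/JDBC ingestion jobs.
raw = spark.read.option("header", True).csv("s3://example-raw/orders/")

curated = (
    raw.withColumn("order_date", F.to_date("order_ts"))
       .groupBy("order_date", "site_id")
       .agg(
           F.sum(F.col("amount").cast("double")).alias("revenue"),
           F.countDistinct("customer_id").alias("customers"),
       )
)

# Write analytics-ready Parquet; Redshift picks it up from this prefix.
curated.write.mode("overwrite").parquet("s3://example-curated/daily_revenue/")
```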

Big Data Platform for Cryptocurrency Data Analytics on AWS

An Australian client wanted to analyze the MACD signals of 60 cryptocurrencies for eight million combinations of MACD inputs. MACD (moving average convergence divergence) is an indicator used to make buy-or-sell decisions in cryptocurrency trading.

I developed a fully automated data platform in the AWS Cloud to ingest historical currency price actions, generate EMA and MACD signals for all eight million input combinations, and produce backtesting reports evaluating the performance of every combination. AWS Lambda ingested the price action data into Amazon S3 buckets, and transformations were carried out on Amazon EMR. All data transformations ran on a serverless architecture to minimize cost, and the process was automated with AWS Lambda functions, AWS Step Functions, and Amazon EventBridge (an EMA/MACD sketch follows).
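
The signal generation itself reduces to exponential moving averages over the closing price. The pandas sketch below shows the classic (12, 26, 9) parameterization as a minimal example, assuming a timestamp-indexed price series; the platform swept roughly eight million such input combinations.

```python
"""Minimal pandas sketch of EMA/MACD signal generation for one parameter
combination. Input file and column names are hypothetical placeholders."""
import pandas as pd

def macd_signals(close: pd.Series, fast: int = 12, slow: int = 26,
                 signal: int = 9) -> pd.DataFrame:
    """Return the MACD line, signal line, and a simple crossover position."""
    ema_fast = close.ewm(span=fast, adjust=False).mean()
    ema_slow = close.ewm(span=slow, adjust=False).mean()
    macd_line = ema_fast - ema_slow
    signal_line = macd_line.ewm(span=signal, adjust=False).mean()
    return pd.DataFrame({
        "macd": macd_line,
        "signal": signal_line,
        # +1 where MACD is above its signal line (bullish), -1 otherwise.
        "position": (macd_line > signal_line).astype(int) * 2 - 1,
    })

if __name__ == "__main__":
    prices = pd.read_csv("btc_usd.csv", index_col="timestamp",
                         parse_dates=True)["close"]  # hypothetical input file
    print(macd_signals(prices).tail())
```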

Unstructured Data Analytics Using Generative AI

Conducting analytics on unstructured data has long been a challenge. The advent of generative AI, with its contextual understanding of the underlying data, opens new avenues for unstructured data analytics. This project focuses on developing a web application that generates structured data from unstructured formats such as PDFs, text files, and CSVs (a prototype-style sketch follows). An early prototype is expected to launch in April 2025.
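
Below is a prototype-style sketch of the core flow, assuming pypdf for text extraction and a generic LLM call that returns JSON against a fixed schema; the call_llm helper and the invoice-style schema are purely hypothetical stand-ins for whichever model API and document types the application targets.

```python
"""Prototype-style sketch: extract PDF text, ask an LLM for structured JSON.
The call_llm helper and the schema are hypothetical stand-ins."""
import json
from pypdf import PdfReader

SCHEMA_PROMPT = (
    "Extract every invoice line item from the text below and return JSON with "
    'the keys "description", "quantity", and "amount".\n\n{text}'
)

def call_llm(prompt: str) -> str:
    """Placeholder for the model call (e.g., a hosted GenAI API)."""
    raise NotImplementedError("wire this to the chosen LLM provider")

def pdf_to_rows(path: str) -> list[dict]:
    # Pull the raw text out of every page of the document.
    reader = PdfReader(path)
    text = "\n".join(page.extract_text() or "" for page in reader.pages)
    # Ask the model for records matching the fixed schema.
    response = call_llm(SCHEMA_PROMPT.format(text=text))
    return json.loads(response)  # structured rows ready for a table or DataFrame
```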

Education

2015 - 2020

Bachelor's Degree in Engineering

University of Moratuwa - Moratuwa, Sri Lanka

Certifications

JULY 2024 - JULY 2027

AWS Certified Data Engineer - Associate

Amazon Web Services

DECEMBER 2023 - DECEMBER 2026

AWS Certified Database - Specialty

Amazon Web Services

JUNE 2023 - DECEMBER 2024

Oracle Autonomous Database Cloud 2023 Certified Professional

Oracle

MAY 2023 - MAY 2026

AWS Certified Solutions Architect - Professional

Amazon Web Services

OCTOBER 2021 - OCTOBER 2023

Oracle Cloud Infrastructure 2021 Architect Associate

Oracle

SEPTEMBER 2021 - SEPTEMBER 2023

Oracle Cloud Infrastructure Foundations 2021 Associate

Oracle

JUNE 2021 - PRESENT

Introduction to Designing Data Lakes on AWS

Coursera

JANUARY 2018 - PRESENT

Certificate in Business Accounting (Cert BA)

Chartered Institute of Management Accountants (CIMA)

Skills

Libraries/APIs

PySpark, ODBC, REST APIs, JDBC, Salesforce API, Vue, Vue 3

Tools

Microsoft Power BI, AWS Glue, Amazon Athena, Power Query, Microsoft Excel, BigQuery, Apache Airflow, Excel 2010, Excel 2016, Oracle SQL Data Modeler, Microsoft Word, Microsoft PowerPoint, MATLAB, Cron, Amazon Elastic MapReduce (EMR), Spark SQL, Git, Amazon QuickSight, Crystal Reports, Google Sheets, Tableau, Asana, Google Workspace, pgAdmin, Oracle Exadata

Languages

SQL, Python, Excel VBA, Stored Procedure, Visual Basic, Visual Basic for Applications (VBA), Visual Basic 6 (VB6), Snowflake, XML, Ruby, Google Apps Script

Frameworks

Apache Spark, Spark, Flask

Paradigms

Business Intelligence (BI), ETL, Dimensional Modeling, Database Design, OLAP, Automation

Platforms

Oracle Data Integrator (ODI), AWS Lambda, Oracle Database, Amazon EC2, Amazon Web Services (AWS), Jupyter Notebook, Oracle, Microsoft Fabric, SharePoint, Azure, Google Cloud Platform (GCP), Salesforce, Visual Studio Code (VS Code), Oracle Cloud Infrastructure (OCI)

Storage

Data Lakes, Oracle Cloud, Databases, AWS Data Pipeline Service, Amazon S3 (AWS S3), Data Lake Design, Redshift, Data Pipelines, MySQL, SQL Stored Procedures, PostgreSQL, Relational Databases, SQL Server Reporting Services (SSRS), Oracle SQL Developer, Microsoft SQL Server, JSON, Database Administration (DBA), NoSQL, MongoDB, DBeaver, Amazon DynamoDB, Oracle RDS

Other

Data Engineering, Data Warehousing, ETL Tools, Excel 365, Data Warehouse Design, Data Analytics, Big Data, ELT, Data Analysis, Data Architecture, Data Visualization, Reports, Schemas, BI Reporting, Reporting, Statistical Modeling, Data Transformation, Dashboard Development, Star Schema, Data Processing, Data Processing Automation, CSV, Scripting, SSRS Reports, Data Queries, Google BigQuery, Macros, Dashboards, BI Reports, Pipelines, Data Build Tool (dbt), AWS Cloud Architecture, Big Data Architecture, Dashboard Design, Web Scraping, Data Modeling, AWS Certified Solution Architect, Solution Architecture, Enterprise Resource Planning (ERP), Excel Macros, DAX, Microsoft 365, Business Analysis, Data Migration, Business Process Automation, Workflow, Database Schema Design, Relational Database Design, API Integration, Integration, Outlook, Scraping, Data Cleansing, Digital Manufacturing, Applied Mathematics, Data Science, APIs, Machine Learning, Automated Data Flows, XML for Analysis (XMLA), Statistics, Forecasting, Amazon RDS, Query Optimization, Partitioning, Data Mining, Web Analytics, API Design, Megalodon, Trading, Financial Modeling, Algorithms, R Programming, Web Servers, SFTP, Management Accounting, Reinforcement Learning, Deep Reinforcement Learning, Oracle Data Guard, ClickUp, AWS Database Migration Service (DMS), Boto3, Amazon Kinesis, Amazon Redshift, Generative Artificial Intelligence (GenAI)
