Michael Kooloos, Developer in Valencia, Spain

Michael Kooloos

Verified Expert in Engineering

Data Engineer and Software Developer

Location
Valencia, Spain
Toptal Member Since
October 30, 2020

Michael is an all-around big data, BI, and DWH analytics engineer and architect with over 20 years of experience. He quickly grasps the information needs of a business and has worked in all project phases, from analysis and design through development to end-user training and support. Michael is at his best in complex environments where technology must deliver real business value by getting the right information to the right person at the right time.

Portfolio

Freelance
ETL, PL/SQL, Data Warehousing, Analytics, Data Analysis, Data Pipelines...
BiiT Sourcing Solutions
Amazon Web Services (AWS), Analytics, Data Analysis, Data, Activiti BPM, Agile...
Freelance
Data Architecture, Data Lakes, Data Analysis, MySQL, PostgreSQL, Data Pipelines...

Experience

Availability

Part-time

Preferred Environment

Agile, SQL, Delta Lake, Data Build Tool (dbt), DataHub, Data Analytics, Microservices

The most amazing...

...real-life problem I've solved is a module for predicting and calculating forecast figures for the maintenance of escalators in a big international airport.

Work Experience

Data Analytics Engineer | Architect

2015 - PRESENT
Freelance
  • Developed a data warehouse using metadata-driven data vault code automation.
  • Analyzed requirements for a new corporate dashboard, analyzed data, and created a solution design.
  • Developed a data pipeline for monitoring machine logs to detect possible bugs after software upgrades.
  • Analyzed and tuned several performance issues, bringing query runtimes from minutes to milliseconds.
  • Developed a point-in-time detection of nonconformity of machine constants.
  • Led a team of data warehouse developers to develop and implement a mortgage data warehouse.
  • Architected a unified product catalog and several pipelines for the synchronization of vendor-specific attributes.
  • Implemented a way of working and created guidelines for conceptual data modeling as part of the overall data governance strategy.
  • Established a way of working and created guidelines for metadata management as part of the overall data governance strategy.
  • Developed a real-time search and analytics platform based on Apache Solr for metal traders and exchanges.
Technologies: ETL, PL/SQL, Data Warehousing, Analytics, Data Analysis, Data Pipelines, Performance Analysis, Oracle RDBMS, Dashboard Design, Business Process Analysis, Agile, Scrum, Data Warehouse Design, Data Engineering, Data Vaults, Netezza, Microsoft Power BI, Data Cleaning, Data Aggregation, Data Science, Large Data Sets, Snowflake, SQL Server Integration Services (SSIS), Azure Data Factory, Data Modeling, Amazon S3 (AWS S3), Data Reporting, Data Analytics, Python, Google Data Studio, Apache Airflow, Azure, Master Data Management (MDM), Business Requirements, Data Management, Key Performance Indicators (KPIs), CI/CD Pipelines, Database Architecture, Azure Databricks, Data Manipulation, Dashboards, Amazon EMR Studio, AWS Glue, ELT, Warehouses, Kubernetes, Real-time Data, Looker, Data Build Tool (dbt), English, Query Optimization, Database Migration, Cloud, Data Integration, Data Structures, Advisory, Consulting, Data Processing, ETL Tools, Google BigQuery, Google Cloud Platform (GCP), BigQuery

Solutions Architect and Co-founder

2013 - 2015
BiiT Sourcing Solutions
  • Led a team of application developers to develop and implement several cloud-based applications.
  • Organized work to meet goals and deadlines while ensuring the technical solution supported the business objectives.
  • Designed solutions for our clients on numerous projects, using existing and proven open-source software, components, frameworks, and our products to deliver custom cloud-based solutions.
  • Enabled digital transformation of organizations through process optimization and real-time information access to deliver the right information to the right person at the right time.
  • Implemented proof of technology for IoT beacon processing for indoor positioning to deliver smart shopping presentations.
Technologies: Amazon Web Services (AWS), Analytics, Data Analysis, Data, Activiti BPM, Agile, Business Process Modeling, SWOT Analysis, Requirements Analysis, Kanban, Scrum, Data Reporting, Business Requirements, Key Performance Indicators (KPIs), CI/CD Pipelines, NoSQL, English, Cloud, APIs, Data Processing, Technical Leadership

Business Intelligence and Data Warehouse Consultant

2010 - 2013
Freelance
  • Designed and developed an open-source enterprise information access platform based on Hadoop (distributed file system and processing), Apache Solr (search engine), and Apache Mahout (machine learning/predictive analytics).
  • Built a back-end rule engine for individual training schedule creation.
  • Created web-based applications for registering, visualizing, and analyzing individual athlete and team progress for performance, recovery logs, and test results.
  • Developed a data warehouse for multiple hospitals for patients, medical cases, transfers, diagnoses, and observations.
Technologies: Data Architecture, Data Lakes, Data Analysis, MySQL, PostgreSQL, Data Pipelines, Analytics, Big Data Architecture, Big Data, Activiti BPM, MongoDB, Elasticsearch, Apache Solr, Hadoop, R, Excel VBA, Warehouses, Spark, English, Query Optimization, Database Migration, Business Intelligence (BI), Data Analytics, Data Integration, Data Structures, Advisory, Consulting, Data Processing, ETL Tools, Large-scale Projects, Dashboard Development

Business Intelligence and Data Warehouse Consultant

2002 - 2010
Scamander Solutions
  • Created a data warehouse for alerts/violations derived from city control.
  • Designed and developed several semantic layers for end-user presentations for both analytical and operational reporting purposes.
  • Built dashboards, custom visualization widgets, and analytical and operational reports for finance, sales, marketing, HR, healthcare, manufacturing, and supply chain.
  • Developed a set of triggers and PL/SQL packages to predict and calculate forecast figures for the maintenance of escalators in a big international airport.
  • Led a project to implement a data warehouse solution for a big telco provider and the interfacing of product and invoice data to the customer web portal.
Technologies: Data Architecture, PL/SQL, Databases, Analytics, Data Warehousing, Data Warehouse Design, Business Intelligence (BI), Data, Data Modeling, Oracle PL/SQL, SQL, D3.js, Tableau, Oracle Data Integrator (ODI), Oracle BI, Talend, Oracle RDBMS, Data Cleaning, Data Aggregation, Large Data Sets, ETL, Data Reporting, Data Analytics, Business Requirements, Key Performance Indicators (KPIs), Database Architecture, Data Manipulation, Dashboards, Warehouses, English, Query Optimization, Database Migration, APIs, Data Integration, Data Structures, Advisory, Consulting, Data Processing, ETL Tools, Large-scale Projects, Technical Leadership, Oracle EBS, Oracle, Dashboard Development

Oracle Consultant

1998 - 2002
Getronics Software Solutions
  • Developed customizations and provided third-line technical support in an Oracle Manufacturing environment.
  • Analyzed, designed, and developed a custom Oracle system for mailing and registration to support a high output of forms.
  • Developed and supported a custom Oracle system for order management.
Technologies: Databases, PL/SQL, Performance Analysis, Oracle PL/SQL, Oracle E-Business Suite (EBS), Oracle RDBMS, Linux, English, Query Optimization, APIs, Data Integration, Data Structures, Data Processing, Oracle EBS, Oracle

Prediction and Calculation of Forecast Figures for the Maintenance of Escalators

To avoid maintenance during rush hours and prevent accidents, I developed a module for predicting and calculating forecast figures for the maintenance of escalators in a big international airport. The analysis showed that changes in energy consumption of certain parts could indicate technical problems resulting in (future) breakdowns.
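The underlying idea, flagging a part when its energy consumption deviates sharply from its recent trend, can be sketched in a few lines. This is a minimal illustration only, not the original implementation (which ran as triggers and PL/SQL packages); the function name, window size, and threshold are all hypothetical.

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=5, threshold=3.0):
    """Flag energy readings that deviate sharply from the recent trend.

    A reading is flagged when it lies more than `threshold` standard
    deviations from the mean of the preceding `window` readings.
    Names and parameters are illustrative, not from the original system.
    """
    flagged = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged

# A stable consumption series with one sudden jump:
readings = [10.0, 10.2, 9.9, 10.1, 10.0, 10.1, 14.5, 10.0]
print(flag_anomalies(readings))  # → [6]
```

In practice the flagged readings would feed the forecast figures, so maintenance could be scheduled outside rush hours before a breakdown occurred.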

Performance Analysis and Tuning for Machine Logs

Performance analysis and tuning of nonconformance detection for machine constants stored in a Netezza data warehouse appliance for semiconductor machine logs.

I analyzed and tuned several performance issues in storing and retrieving the data, bringing query runtimes from minutes to milliseconds. I redeveloped the data pipeline to store only changes in machine constants (constants change in less than 1% of cases) instead of always storing all machine constants and the complete machine log. I also designed and developed point-in-time detection for monitoring nonconformances.
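The change-only storage pattern described above can be sketched as follows. This is an illustrative Python model only; the actual pipeline ran on a Netezza appliance, and the class and field names here are invented for the example.

```python
class ConstantStore:
    """Store machine constants as change records, not full snapshots.

    Only values that differ from the last stored value are written, which
    keeps the table small when constants change in under 1% of the logs.
    A point-in-time lookup then replays changes up to the requested moment.
    """

    def __init__(self):
        self._history = {}  # constant name -> list of (timestamp, value)

    def ingest(self, timestamp, constants):
        """Record only the constants whose value actually changed."""
        for name, value in constants.items():
            changes = self._history.setdefault(name, [])
            if not changes or changes[-1][1] != value:
                changes.append((timestamp, value))

    def value_at(self, name, timestamp):
        """Return the constant's value as of `timestamp` (point in time)."""
        result = None
        for ts, value in self._history.get(name, []):
            if ts <= timestamp:
                result = value
        return result
```

A point-in-time nonconformance check then reduces to comparing `value_at(...)` against the expected constant for that moment, without scanning full log snapshots.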

Digital Signage - Event and Sensor Data Processing and Data Visualization

Implementation of proof of technology for IoT beacon processing for indoor positioning to deliver smart shopping presentations to customers.

I designed a solution using existing and proven open-source software, components, and frameworks, developed a data pipeline to store and process IoT beacon data, and used device detection for indoor positioning and proximity engagement. Furthermore, I used digital signage to deliver smart and interactive advertisements. I also developed heatmap visualizations of hot and cold zones so that customer flow through the store could be seen at a glance.
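The hot-zone/cold-zone heatmap step boils down to binning indoor positions into a grid of visit counts. The sketch below illustrates that idea under assumed inputs (metric x/y positions and a square cell size); it is not the production pipeline, which rendered these counts as overlays on the store floor plan.

```python
def heatmap(positions, width, height, cell=1.0):
    """Bin (x, y) beacon positions into a grid of visit counts.

    Hot and cold zones show up as high and low counts per cell.
    Positions outside the floor are clamped to the nearest edge cell.
    """
    cols = int(width // cell)
    rows = int(height // cell)
    grid = [[0] * cols for _ in range(rows)]
    for x, y in positions:
        c = min(int(x // cell), cols - 1)
        r = min(int(y // cell), rows - 1)
        grid[r][c] += 1
    return grid

# Three sightings on a 3 m x 2 m floor, two in the same corner cell:
print(heatmap([(0.5, 0.5), (0.6, 0.4), (2.5, 1.5)], 3, 2))
# → [[2, 0, 0], [0, 0, 1]]
```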

Multiple Dashboards, Analytical and Operational Reports

To deliver the right information to the right person at the right time, I designed and developed multiple dashboards, custom visualization widgets, and both analytical and operational reports for finance, sales, marketing, HR, healthcare, manufacturing, and supply chain.

Education
1995 - 1998

Bachelor of Science Degree in Information Technology

The Hague University of Applied Sciences - The Hague, The Netherlands

Certifications

MARCH 2023 - PRESENT

Databricks Lakehouse Fundamentals

Databricks Academy

JUNE 2020 - PRESENT

Containers and Microservices

IBM Big Data University

JUNE 2020 - PRESENT

Applied Data Science with Python

IBM Big Data University

SEPTEMBER 2017 - PRESENT

Spark, Hadoop Data Access and Big Data Analytics

IBM Big Data University

MARCH 2015 - PRESENT

Professional Scrum Master (PSM I)

Scrum.org

MAY 2007 - PRESENT

Certified Business Intelligence Professional (CBIP)

TDWI.org

APRIL 2007 - PRESENT

Oracle Business Intelligence (OBIEE)

Oracle

OCTOBER 2000 - PRESENT

Oracle Designer for Conceptual Design

Oracle

Libraries/APIs

D3.js

Tools

Oracle E-Business Suite (EBS), Apache Airflow, Apache Solr, Tableau, Microsoft Power BI, DataHub, AWS Glue, Looker, BigQuery

Languages

SQL, Python, Snowflake, R, Excel VBA

Storage

Oracle RDBMS, Oracle PL/SQL, Databases, PL/SQL, Netezza, PostgreSQL, Data Pipelines, MySQL, Data Lakes, Database Architecture, Database Migration, Data Integration, Elasticsearch, MongoDB, SQL Server Integration Services (SSIS), Amazon S3 (AWS S3), Master Data Management (MDM), NoSQL

Paradigms

Requirements Analysis, Business Intelligence (BI), ETL, Scrum, Agile, Kanban, Data Science, Microservices

Platforms

Talend, Amazon Web Services (AWS), Oracle, Docker, Linux, Oracle Data Integrator (ODI), Databricks, Azure, Kubernetes, Google Cloud Platform (GCP)

Frameworks

Hadoop, Activiti BPM, Spark

Other

Data Engineering, Data Visualization, Business Process Analysis, Data Warehouse Design, Dashboard Design, Data Modeling, Performance Analysis, Oracle BI, Data, Data Warehousing, Analytics, Data Analysis, Data Architecture, Data Cleaning, Data Aggregation, Data Reporting, Data Analytics, Key Performance Indicators (KPIs), ELT, Warehouses, English, Query Optimization, Data Vaults, Business Process Modeling, Large Data Sets, Business Requirements, Data Management, Data Manipulation, Dashboards, Cloud, APIs, Data Structures, Advisory, Consulting, Data Processing, ETL Tools, Large-scale Projects, Technical Leadership, Oracle EBS, Dashboard Development, SWOT Analysis, Conceptual Design, Big Data, Big Data Architecture, Startups, Azure Data Factory, Delta Lake, Data Build Tool (dbt), Google Data Studio, CI/CD Pipelines, Azure Databricks, Amazon EMR Studio, Real-time Data, Google BigQuery
