Róbert Schmidt

Verified Expert in Engineering

Data Engineer and Database Developer

Location
Trnava, Slovakia
Toptal Member Since
December 7, 2021

Róbert is a professional focused on data engineering and business intelligence using Azure cloud services and Microsoft on-premises solutions. His experience spans database development, traditional BI approaches, big data, and streaming solutions. He has developed big data solutions, mainly with Apache Spark and Azure Databricks, that collect terabytes of data and provide a baseline for strategic decision-making. He has worked with companies such as EY, Swiss Re, and Endeavor.

Portfolio

Sodexo
Azure, Azure Databricks, Azure Data Factory, Azure DevOps, Azure SQL Databases...
Swiss Re
Azure, Azure SQL Databases, Databricks, Azure Data Factory, Azure DevOps...
Rare Crew
Azure, Azure SQL Data Warehouse, Dedicated SQL Pool (formerly SQL DW)...

Experience

Availability

Part-time

Preferred Environment

Azure, Databricks, Azure SQL Databases, Azure Data Factory, Azure DevOps, Azure Data Lake, Delta Lake, Microsoft SQL Server, Data Engineering, Big Data

The most amazing...

...tool I've delivered is a highly scalable analytical platform that consumes terabytes of data and applies statistical methods in an online application.

Work Experience

Lead Azure Data Engineer

2022 - PRESENT
Sodexo
  • Led a team of eight developers working on various data engineering features and helped design the architecture of a new global data platform for the company.
  • Owned two legacy data platforms: a traditional data warehouse and a unified data lake framework for sharing data across the company.
  • Built the new data platform as an end-to-end foundation for any kind of analytical project in the company, following security best practices and regulations; it is expected to support more than 100 projects within two years.
  • Implemented observability features that monitor and alert on every step of data movement, plus a semantic layer that gives users with minimal data engineering knowledge easy access to any kind of data (a minimal sketch of the monitoring pattern follows the technology list below).
Technologies: Azure, Azure Databricks, Azure Data Factory, Azure DevOps, Azure SQL Databases, Dremio
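A minimal sketch of the step-level monitoring pattern mentioned above, assuming a Databricks/PySpark environment; the audit table name, metric columns, and empty-output check are illustrative assumptions, not the production implementation.

# Illustrative sketch: audit one pipeline step by recording row counts and duration
# to a Delta table. Table name, schema, and the alert condition are assumptions.
import time
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

AUDIT_TABLE = "ops.pipeline_audit"  # hypothetical Delta table for run metrics

def audited_step(step_name, transform, source_df):
    """Run one transformation step and log its metrics; fail loudly on empty output."""
    started = time.time()
    result_df = transform(source_df)
    rows_in, rows_out = source_df.count(), result_df.count()
    metrics = spark.createDataFrame(
        [(step_name, rows_in, rows_out, float(time.time() - started))],
        "step string, rows_in long, rows_out long, duration_s double",
    ).withColumn("logged_at", F.current_timestamp())
    metrics.write.format("delta").mode("append").saveAsTable(AUDIT_TABLE)
    if rows_out == 0:
        raise ValueError(f"Step {step_name} produced no rows")  # surfaces as a pipeline alert
    return result_df

Wrapping each transformation this way leaves a queryable audit trail that scheduled monitoring jobs can alert on.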

Azure Data Engineer

2020 - 2022
Swiss Re
  • Led a team of four data engineers to design and develop all data-related stories in the project.
  • Built a large-scale data processing solution with Azure Databricks for several use cases and statistical models. The workloads consumed terabytes of data yet remained efficient and scalable even for smaller datasets.
  • Designed and developed multiple ETL pipelines and data flows in Azure Data Factory. The pipelines were integrated directly with Delta Lake, reducing the complexity of integrating relational databases and other sources (a sketch of such a Delta Lake merge follows the technology list below).
  • Designed the database model, optimized T-SQL queries, and provided maintenance and performance improvements. The front-end app relied on the database and required a loading latency of at most 500 ms even during heavy back-end workloads.
Technologies: Azure, Azure SQL Databases, Databricks, Azure Data Factory, Azure DevOps, Delta Lake, Apache Spark, Python, Spark, ETL, Git, Azure SQL, Scrum
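A minimal sketch of the incremental Delta Lake load referenced above, written in PySpark; the storage paths, table, and join key are assumptions for illustration rather than the actual Swiss Re pipelines.

# Illustrative PySpark sketch of an incremental upsert into Delta Lake.
# Paths, table locations, and the join key are hypothetical placeholders.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

TARGET_PATH = "abfss://lake@account.dfs.core.windows.net/curated/claims"       # assumed path

# New or changed rows staged by an upstream copy step (assumed location).
updates_df = spark.read.format("parquet").load(
    "abfss://lake@account.dfs.core.windows.net/staging/claims_increment"
)

# Merge the increment into the Delta table: update existing keys, insert new ones.
target = DeltaTable.forPath(spark, TARGET_PATH)
(
    target.alias("t")
    .merge(updates_df.alias("s"), "t.claim_id = s.claim_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)

In pipelines like those described above, an Azure Data Factory copy activity would typically stage the increment and then trigger the notebook that runs this merge.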

Azure Data Engineer

2018 - 2020
Rare Crew
  • Designed and developed solutions for collecting data from different types of sources, such as APIs and relational and non-relational databases, using Azure Databricks, Azure Functions, and Azure Data Factory (a minimal API ingestion sketch follows the technology list below).
  • Orchestrated ETL pipelines with Azure Data Factory and Azure Event Grid across several scenarios, using event-based triggers to push data in near real time to integrated services such as Adobe Experience Cloud for marketing purposes.
  • Designed and implemented a data warehouse in Azure SQL Data Warehouse, now known as Azure Synapse. The warehouse was optimized for fast loading of large datasets into our SSAS service with PolyBase.
Technologies: Azure, Dedicated SQL Pool (formerly SQL DW), Azure SQL Data Warehouse, Databricks, Azure Data Factory, SQL Server Analysis Services (SSAS), Azure Functions, Azure Data Lake, Apache Spark, Python, Spark, ETL, Git, Azure SQL, Scrum
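A minimal sketch of the API-to-data-lake ingestion described above; the endpoint, pagination scheme, authentication, and landing path are hypothetical placeholders, not the actual Rare Crew pipelines.

# Illustrative sketch: pull records from a paginated REST API and land them in
# Azure Data Lake as Parquet via Spark. URL, auth, and output path are assumptions.
import requests
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

API_URL = "https://api.example.com/v1/orders"                  # assumed source endpoint
RAW_PATH = "abfss://raw@account.dfs.core.windows.net/orders"   # assumed landing zone

def fetch_all_pages(url, token):
    """Collect every page from a paginated JSON API (the pagination scheme is assumed)."""
    records, page = [], 1
    while True:
        resp = requests.get(
            url,
            params={"page": page},
            headers={"Authorization": f"Bearer {token}"},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json().get("items", [])
        if not batch:
            return records
        records.extend(batch)
        page += 1

rows = fetch_all_pages(API_URL, token="<token-from-key-vault>")
if rows:
    df = spark.createDataFrame(rows)           # schema inferred from the JSON records
    df.write.mode("append").parquet(RAW_PATH)  # land raw data for downstream curation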

ETL Developer

2017 - 2018
Accenture
  • Implemented ETL processes in SSIS and T-SQL over sources such as Microsoft SQL Server, Excel, and CSV files. The solution reduced loading time and removed repetitive manual work for several employees (an analogous load pattern is sketched after the technology list below).
  • Improved reports and visualizations in Power BI according to the provided requirements, giving upper management a better understanding of the data.
  • Analyzed the previous implementation of the data quality and reporting solutions, provided feedback, and suggested improvements that were implemented afterward.
Technologies: T-SQL (Transact-SQL), Microsoft SQL Server, SQL Server Integration Services (SSIS), Microsoft Power BI, ETL, Git, Scrum
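The actual loads were built as SSIS packages with T-SQL transformations; the following is only an analogous sketch of the same file-to-staging-table pattern in Python, with the connection string, file paths, and table names as placeholder assumptions.

# Analogous Python sketch of a file-to-SQL-Server staging load (the original work
# used SSIS and T-SQL). Connection details, paths, and table names are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

ENGINE = create_engine(
    "mssql+pyodbc://user:password@server/dwh?driver=ODBC+Driver+17+for+SQL+Server"
)  # placeholder connection string

def load_file(path: str, table: str) -> int:
    """Read a CSV or Excel extract and append it to a staging table."""
    df = pd.read_excel(path) if path.endswith((".xlsx", ".xls")) else pd.read_csv(path)
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]  # basic cleanup
    df.to_sql(table, ENGINE, schema="staging", if_exists="append", index=False)
    return len(df)

rows = load_file("extracts/finance_2018_01.csv", "finance_monthly")  # example invocation
print(f"Loaded {rows} rows into staging.finance_monthly")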

Software Engineer

2016 - 2017
Freelance
  • Developed an Android mobile application with a photo focus-stacking algorithm as an experimental project for new optical components integrated with mobile phones. The prototype was delivered and later enhanced by the company (the core idea is sketched after the technology list below).
  • Developed a JSF web application for a civil engineering company of up to 50 people: a basic time management system with integrated cloud storage that helped the team share data efficiently and keep it secure.
  • Built these projects with a team of three people. I designed the database and storage architecture.
Technologies: SQL, Java, Git
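The prototype itself was an Android (Java) application; the compact Python/OpenCV sketch below only illustrates the core focus-stacking idea of picking, per pixel, the sharpest of several differently focused shots. File names are placeholders, and the frames are assumed to be pre-aligned.

# Compact sketch of the focus-stacking idea: per pixel, keep the value from the
# source frame with the strongest Laplacian (sharpness) response.
import cv2
import numpy as np

def focus_stack(paths):
    """Merge differently focused, aligned shots into one mostly-in-focus image."""
    images = [cv2.imread(p) for p in paths]
    # Sharpness per pixel: absolute Laplacian response of the grayscale frame.
    sharpness = np.stack([
        np.abs(cv2.Laplacian(cv2.cvtColor(img, cv2.COLOR_BGR2GRAY), cv2.CV_64F))
        for img in images
    ])
    best = np.argmax(sharpness, axis=0)      # index of the sharpest frame per pixel
    stacked = np.zeros_like(images[0])
    for i, img in enumerate(images):
        stacked[best == i] = img[best == i]  # copy pixels from their sharpest source
    return stacked

result = focus_stack(["shot_near.jpg", "shot_mid.jpg", "shot_far.jpg"])  # example frames
cv2.imwrite("stacked.jpg", result)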

SQL Developer

2015 - 2015
EY
  • Delivered a new generation of auditing software for dozens of financial institutions in Central and Eastern Europe. The latest version of the application decreased average processing time by 90–95%.
  • Moved most of the heavy data logic to the Microsoft SQL Server database layer, which reduced processing time, significantly improved the user experience, and brought new clients to our customer.
  • Consulted with customers and gathered additional requirements for the product. Based on the outcomes of these meetings, we developed customized ETL pipelines for ingesting and cleaning data.
Technologies: T-SQL (Transact-SQL), SQL Server Integration Services (SSIS), Microsoft SQL Server, ETL, Git

Projects

Data Warehouse for a Streaming Platform

https://www.endeavorstreaming.com/
I served as the main developer of an analytical data warehouse for one of the biggest entertainment companies in the world and its streaming platform. Streaming platforms capture a large amount of information about customers and how they behave while using the service.

The solution had to process a massive amount of data and scale well: tens of thousands of users were on the platform simultaneously during peak hours, and significant user growth was expected. We loaded, cleaned, and processed the data into a dimensional model with Azure Databricks, and the data fed KPI dashboards and further analytics through Azure Analysis Services. I integrated several third-party services with the solution; the Adobe Experience Cloud integration received the required information in near real time using Azure Functions and Event Grid, which makes it possible to build marketing strategies based on actual interaction with the platform.
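A minimal sketch of that event-driven hand-off, written as an Event Grid-triggered Azure Function in Python (v1 programming model); the downstream endpoint, environment variable, and payload mapping are assumptions rather than the actual Adobe Experience Cloud integration.

# Illustrative Azure Function (Python, v1 model) triggered by an Event Grid event.
# The downstream URL, environment variable, and payload fields are hypothetical.
import logging
import os

import azure.functions as func
import requests

DOWNSTREAM_URL = os.environ.get("MARKETING_ENDPOINT", "https://example.invalid/ingest")

def main(event: func.EventGridEvent) -> None:
    """Forward a platform event (e.g., a playback action) to a marketing endpoint."""
    payload = event.get_json()
    logging.info("Received event %s of type %s", event.id, event.event_type)

    # Reshape the event into the (assumed) format expected by the downstream service.
    body = {
        "userId": payload.get("userId"),
        "action": payload.get("action"),
        "occurredAt": payload.get("timestamp"),
    }
    resp = requests.post(DOWNSTREAM_URL, json=body, timeout=10)
    resp.raise_for_status()  # let the Functions runtime retry on failure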

Enterprise Value Targeting

At Accenture, I was in charge of loading and processing KPIs and financial information about hundreds of well-known enterprises to create a fact-based assessment and benchmark for analytical purposes. I loaded data from internal databases and external spreadsheets with SSIS ETL jobs and transformed it with T-SQL in the database. The outputs were delivered as Power BI dashboards to the clients' upper management, and the data model was used for further analytics.
Education

2014 - 2017

Master's Degree in Information Systems and Technologies

University of Economics - Prague, Czech Republic

2016 - 2016

Exchange Program in Information Technologies

The Hong Kong University of Science and Technology - Hong Kong, China

2011 - 2014

Bachelor's Degree in Applied Informatics

Slovak University of Technology - Bratislava, Slovakia

Certifications

SEPTEMBER 2021 - SEPTEMBER 2023

Databricks Certified Associate Developer for Apache Spark 3.0

Databricks

MARCH 2021 - MARCH 2023

Microsoft Certified: Azure Fundamentals

Microsoft

MARCH 2021 - MARCH 2023

Microsoft Azure Data Engineer Associate

Microsoft

JANUARY 2021 - PRESENT

MCSA: SQL 2016 Database Development

Microsoft

MARCH 2018 - MARCH 2025

Personnel Security Clearance Certificate | Classification level: Secret

National Security Authority

Tools

Git, Microsoft Power BI

Paradigms

ETL, Azure DevOps, Business Intelligence (BI), Scrum

Languages

T-SQL (Transact-SQL), SQL, Python, Java

Storage

Azure SQL Databases, Microsoft SQL Server, Databases, Azure SQL, SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS)

Platforms

Databricks, Azure, Azure SQL Data Warehouse, Azure Functions, Dedicated SQL Pool (formerly SQL DW)

Frameworks

Apache Spark, Spark

Other

Azure Data Factory, Data Warehousing, Data Engineering, Azure Data Lake, Delta Lake, Data Visualization, Big Data, Azure Databricks, Dremio
