Róbert Schmidt
Verified Expert in Engineering
Data Engineer and Database Developer
Róbert is a professional focused on data engineering and business intelligence using Azure cloud services and Microsoft on-premises solutions. His experience spans database development, traditional BI approaches, big data, and streaming solutions. He has built big data solutions, collected terabytes of data, and provided a baseline for strategic decision-making, mainly with Apache Spark and Azure Databricks. He has worked with companies such as EY, Swiss Re, and Endeavor.
Preferred Environment
Azure, Databricks, Azure SQL Databases, Azure Data Factory, Azure DevOps, Azure Data Lake, Delta Lake, Microsoft SQL Server, Data Engineering, Big Data
The most amazing...
...tool I've delivered is a highly scalable analytical platform that consumes terabytes of data and applies statistical methods in an online application.
Work Experience
Lead Azure Data Engineer
Sodexo
- Led a team of eight developers delivering various data engineering features and helped design the architecture of a new global data platform for the company.
- Owned two legacy data platforms: a traditional data warehouse and a unified data lake framework for sharing data across the company.
- Made the new data platform an end-to-end foundation for any kind of analytical project in the company, following all security best practices and regulations. It is expected to host more than 100 projects within two years.
- Implemented observability features to monitor and alert on every step of data movement, and built a semantic layer that gives users with minimal data engineering knowledge easy access to any kind of data.
Azure Data Engineer
Swiss Re
- Led a team of four data engineers to design and develop all data-related stories in the project.
- Built a massive data processing solution with Azure Databricks for several use cases and statistical models. These scenarios consumed terabytes of data while remaining efficient and highly scalable, even for smaller datasets.
- Designed and developed multiple ETL pipelines and data flows in Azure Data Factory. The pipelines integrated directly with Delta Lake and reduced the complexity of integrating with RDBMS databases and other sources.
- Designed the database model, optimized T-SQL queries, and provided maintenance and performance improvements. The front-end app used the database and required loading latency under 500 ms, even during heavy back-end workloads.
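The scalable processing described above ran on Azure Databricks, but the core pattern, incremental aggregation that behaves the same for small and terabyte-scale inputs, can be sketched in plain Python. Field names (`customer_id`, `amount`) and the chunk size are illustrative assumptions, not details from the original project:

```python
from collections import defaultdict
from itertools import islice


def iter_chunks(records, chunk_size=10_000):
    """Yield fixed-size chunks so memory use stays flat regardless of input size."""
    it = iter(records)
    while chunk := list(islice(it, chunk_size)):
        yield chunk


def aggregate_totals(records, chunk_size=10_000):
    """Sum a metric per key one chunk at a time.

    The same code path handles a handful of rows or a stream of millions;
    on Databricks, Spark's partitioned aggregation plays this role.
    """
    totals = defaultdict(float)
    for chunk in iter_chunks(records, chunk_size):
        for rec in chunk:
            totals[rec["customer_id"]] += rec["amount"]
    return dict(totals)
```

Because the aggregation never materializes more than one chunk, the design scales down to small datasets without overhead and up to large streams without exhausting memory, the same property the Databricks solution provided.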
Azure Data Engineer
Rare Crew
- Designed and developed solutions for collecting data from different types of sources, such as APIs and relational and non-relational databases, using Azure Databricks, Azure Functions, and Azure Data Factory.
- Orchestrated ETL pipelines with Azure Data Factory and Azure Event Grid across several scenarios. We used event-based triggers to push data in near real time to integrated services, such as Adobe Experience Cloud, for marketing purposes.
- Designed and implemented a data warehouse in Azure SQL Data Warehouse, now known as Azure Synapse. The warehouse was optimized for fast loading of big datasets into our SSAS service with PolyBase.
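The event-based orchestration above routes each incoming event to the right downstream action. A minimal sketch of that dispatch pattern, using a hypothetical handler and an event shape loosely modeled on Azure Event Grid's `BlobCreated` notifications (the real pipelines ran inside Azure Data Factory and Azure Functions):

```python
def route_event(event, handlers):
    """Dispatch a single Event Grid-style event to the handler registered
    for its event type; unknown types are skipped rather than failing the batch."""
    handler = handlers.get(event.get("eventType"))
    return handler(event["data"]) if handler else None


def on_blob_created(data):
    """Hypothetical handler: forward a new-blob notification downstream
    (e.g., to a marketing integration) in near real time."""
    return {"forwarded_url": data["url"]}


HANDLERS = {"Microsoft.Storage.BlobCreated": on_blob_created}
```

Keying handlers by event type keeps the trigger logic declarative: adding a new near-real-time scenario means registering one more entry rather than editing the dispatch code.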
ETL Developer
Accenture
- Implemented ETL processes in SSIS and T-SQL from sources such as Microsoft SQL Server, Excel, and CSV files. The solution decreased loading time and eliminated repetitive manual work for several employees.
- Improved reports and visualizations in Power BI according to the provided requirements, giving upper management a better understanding of the data.
- Analyzed the previous implementation of data quality solutions and reporting services, provided feedback, and suggested improvements that were later implemented.
Software Engineer
Freelance
- Developed an Android mobile application with a photo focus-stacking algorithm. It was an experimental project for new optical components integrated with mobile phones. A prototype of the solution was delivered and later enhanced by the company.
- Developed a JSF web application for a civil engineering company of up to 50 people. We built a basic time management system with cloud storage integrated into the application, which helped share data efficiently and keep it secure.
- Built these projects with a team of three, designing the database and storage architecture.
SQL Developer
EY
- Delivered a new generation of auditing software for dozens of financial institutions in Central and Eastern Europe. The latest version of the application decreased average processing time by 90–95%.
- Moved most of the heavy data logic to the Microsoft SQL Server database layer. This reduced processing time, greatly improved the user experience, and brought new clients to our customer.
- Provided consulting to customers and gathered additional requirements for our product. Based on the outputs of those meetings, we developed customized ETL pipelines for ingesting and cleaning data.
Experience
Data Warehouse for a Streaming Platform
https://www.endeavorstreaming.com/
The solution needed to process a massive amount of data and scale well: tens of thousands of users used the platform simultaneously during peak hours, and significant user growth was expected. We loaded, cleaned, and processed the data into a dimensional model with Azure Databricks, and the resulting data fed KPI dashboards and further analytics through Azure Analysis Services. I integrated several third-party services with the solution; the integration with Adobe Experience Cloud received the required information in near real time using Azure Functions and Event Grid, making it possible to build marketing strategies based on actual interaction with the platform.
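The dimensional model mentioned above separates descriptive attributes from measurable events. A toy sketch of that split, turning raw streaming events into a user dimension and a playback fact table; all field names (`user_id`, `content_id`, `watched_seconds`) are illustrative, and the production pipeline did this with Azure Databricks over terabytes of data:

```python
def to_star_schema(raw_events):
    """Split raw playback events into a user dimension (with surrogate keys)
    and a fact table referencing those keys, a minimal star schema."""
    user_dim = {}  # natural key -> surrogate key
    facts = []
    for ev in raw_events:
        key = user_dim.setdefault(ev["user_id"], len(user_dim) + 1)
        facts.append({
            "user_key": key,
            "content_id": ev["content_id"],
            "watched_seconds": ev["watched_seconds"],
        })
    return user_dim, facts
```

Storing facts against compact surrogate keys rather than full user records is what keeps the fact table narrow enough for fast aggregation on KPI dashboards, even at the platform's peak-hour volumes.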
Enterprise Value Targeting
Education
Master's Degree in Information Systems and Technologies
University of Economics - Prague, Czech Republic
Exchange Program in Information Technologies
The Hong Kong University of Science and Technology - Hong Kong, China
Bachelor's Degree in Applied Informatics
Slovak University of Technology - Bratislava, Slovakia
Certifications
Databricks Certified Associate Developer for Apache Spark 3.0
Databricks
Microsoft Certified: Azure Fundamentals
Microsoft
Microsoft Azure Data Engineer Associate
Microsoft
MCSA: SQL 2016 Database Development
Microsoft
Personnel Security Clearance Certificate | Classification level: Secret
National Security Authority
Skills
Tools
Git, Microsoft Power BI
Paradigms
ETL, Azure DevOps, Business Intelligence (BI), Scrum
Languages
T-SQL (Transact-SQL), SQL, Python, Java
Storage
Azure SQL Databases, Microsoft SQL Server, Databases, Azure SQL, SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS)
Platforms
Databricks, Azure, Azure SQL Data Warehouse, Azure Functions, Dedicated SQL Pool (formerly SQL DW)
Frameworks
Apache Spark, Spark
Other
Azure Data Factory, Data Warehousing, Data Engineering, Azure Data Lake, Delta Lake, Data Visualization, Big Data, Azure Databricks, Dremio