
Nac Stoklas

Verified Expert in Engineering

Data Scientist and Developer

Location
Ljubljana, Slovenia
Toptal Member Since
February 8, 2022

Nac is a data scientist who builds efficient ML and data pipelines. He is experienced in developing regression and classification models, building scalable data pipelines, and presenting data in a way that delivers the most insight. He has judged ML competitions and has both participated in and helped host hackathons. The wide range of projects Nac has worked on helps him understand his clients' needs and exceed their expectations.

Portfolio

Outbrain
Go, Kibana, Amazon, BigQuery, Classification, Regression Modeling...
BE-terna
Artificial Intelligence (AI), Azure, Python, Web Scraping, Microsoft Excel...
Adacta d.o.o.
APIs, Python, Qlik Sense, QlikView, IIS, Microsoft Excel, Software Engineering...

Experience

Availability

Part-time

Preferred Environment

Jira, Slack, Visual Studio Code (VS Code), SQL Server Management Studio (SSMS), MacOS, IntelliJ IDEA

The most amazing...

...thing I've built is a cloud-based ML pipeline that can forecast sales for hundreds of thousands of items and set optimal stock levels in a matter of hours.

Work Experience

Data Scientist

2022 - PRESENT
Outbrain
  • Improved the internal tooling used to validate features of production models that make over a billion predictions daily.
  • Reworked parts of the systems that directly impact the most important business KPIs, leading to a 5% increase in revenue (A/B tested).
  • Improved existing models, together with colleagues, by leveraging previously unused historical data, increasing spend, revenue, and ex-TAC by a significant margin.
Technologies: Go, Kibana, Amazon, BigQuery, Classification, Regression Modeling, Apache Airflow, SQL, Python

Data Scientist

2020 - 2022
BE-terna
  • Built a pipeline for automated forecasting and scheduling of item deliveries for major retailers in the Adriatic and DACH regions.
  • Built a pipeline that predicts sales for hundreds of thousands of items within a few hours using Databricks and the surrounding Azure ecosystem, significantly reducing prediction times for our clients.
  • Developed an algorithm to convert data from graphs to BI-friendly relational format (recursion in production, achievement unlocked).
  • Exposed APIs that allow forecasting of many different time series types and integrate seamlessly into many large ERP systems.
  • Built custom extensions for Qlik Sense and a product backlog item (PBI) connected to ERP systems, making the clients' daily workflow faster by over 65%.
  • Worked on time series analysis, detecting outliers, seasonal patterns, and more for a major pharmaceutical distributor.
Technologies: Artificial Intelligence (AI), Azure, Python, Web Scraping, Microsoft Excel, Distributed Systems, Software Engineering, Classification, Regression Modeling, JavaScript, Databricks, Spark, SQL

Data Scientist

2018 - 2020
Adacta d.o.o.
  • Built a basket analysis case for a chain of highway stores using association rules and prepared a dashboard for it.
  • Worked with clients to port and upgrade multiple applications from QlikView to Qlik Sense.
  • Built ETL pipelines that supply dashboards with hourly data updates used by multiple departments in a large Slovenian company.
Technologies: APIs, Python, Qlik Sense, QlikView, IIS, Microsoft Excel, Software Engineering, Classification, Regression Modeling, JavaScript

Python Developer

2018 - 2019
Faculty of Computer and Information Science - Ljubljana
  • Helped develop object detection for ski jumps using up to three cheap cameras.
  • Contributed to building the streaming pipeline for live measuring of ski jump length.
  • Developed an algorithm for automatically measuring ski jump distance using environmental factors.
Technologies: Python, Artificial Intelligence (AI), Computer Vision, OpenCV, Keras, Software Engineering

IT Support

2017 - 2018
Maksim d.o.o.
  • Built a pipeline and services related to scoring and evaluating satisfaction surveys.
  • Helped the IT department update software across the company and supported its digital transformation.
  • Supported in-house engineers with time-sensitive complex IT tasks.
Technologies: Python, Qlik Sense, Windows

Projects

Automated Scheduler and Stock Optimizer

I helped build a fully distributed, Databricks-based solution on the Azure cloud for retailers.

The solution segments items into categories and predicts optimal delivery times, taking into account current stock levels, vendor preferences, and cargo requirements such as shipping and truck space.

I built most of the scalable data pipeline and the prediction models. I also developed most of the supporting DevOps pipelines for CI/CD.

I used cloud technologies such as Azure Functions, Data Lake, Blob Storage, Databricks, and others.
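
As a rough illustration only, here is a minimal sketch of how a per-item forecasting and reorder step of this kind might look on Databricks with PySpark. The table name, columns, and replenishment parameters are assumptions made for the example, not the project's actual schema or models.

# Minimal sketch: per-item demand forecast and reorder point on Spark/Databricks.
# The "sales" table, its columns, and the lead-time constants are illustrative
# assumptions, not the schema or models used in the real project.
import pandas as pd
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
sales = spark.table("sales")  # assumed columns: item_id, ds (date), qty

def forecast_item(pdf: pd.DataFrame) -> pd.DataFrame:
    """Very simple per-item forecast: average demand over the last 28 days."""
    pdf = pdf.sort_values("ds")
    daily_demand = float(pdf["qty"].tail(28).mean())
    lead_time_days, safety_days = 7, 3  # assumed replenishment parameters
    return pd.DataFrame({
        "item_id": [pdf["item_id"].iloc[0]],
        "daily_demand": [daily_demand],
        "reorder_point": [daily_demand * (lead_time_days + safety_days)],
    })

forecasts = (
    sales.groupBy("item_id")
         .applyInPandas(
             forecast_item,
             schema="item_id string, daily_demand double, reorder_point double",
         )
)
forecasts.write.mode("overwrite").saveAsTable("stock_recommendations")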

API Portal One Stop Shop for Developers

I built an API developer portal using Azure API Management services.

Businesses use it daily to get high-accuracy, low-latency forecasts for their business needs. I also built several of the APIs on offer, including multiple methods for different types of forecasts, such as weather-based, window, statistical, and boosting-backed methods.
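
To show the general shape of such an endpoint, here is a minimal sketch of a forecast API that could sit behind Azure API Management; the route, request model, and seasonal-naive baseline are assumptions for illustration, not the portal's actual interface.

# Minimal sketch of one forecast endpoint; the route and payload are assumptions.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class ForecastRequest(BaseModel):
    history: list[float]    # past observations, oldest first
    horizon: int = 7        # number of future points to predict
    season_length: int = 7  # e.g., weekly seasonality for daily data

@app.post("/forecast/seasonal-naive")
def seasonal_naive(req: ForecastRequest) -> dict:
    """Repeat the last full season forward, a common window-method baseline."""
    last_season = req.history[-req.season_length:]
    forecast = [last_season[i % req.season_length] for i in range(req.horizon)]
    return {"forecast": forecast}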

Basket Analysis for a Major Retailer

I built a basket analysis case using association rules, along with an exploration dashboard built on top of it, to optimize item placement within stores, offer better deals, and find items that sell well together.

I also built a custom extension that allowed for write-back functionality within Qlik Sense, used for the dashboard.
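
For context, a minimal sketch of association-rule mining of this kind with Python's mlxtend library is shown below; the toy transactions and thresholds are assumptions, not the retailer's data or the exact tooling used.

# Minimal sketch of basket analysis with association rules (mlxtend).
# The transactions and thresholds are toy assumptions for illustration.
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import apriori, association_rules

transactions = [
    ["coffee", "sandwich", "water"],
    ["coffee", "chocolate"],
    ["sandwich", "water"],
    ["coffee", "sandwich"],
]

# One-hot encode baskets into a boolean item matrix.
encoder = TransactionEncoder()
onehot = pd.DataFrame(
    encoder.fit(transactions).transform(transactions), columns=encoder.columns_
)

# Frequent itemsets and rules such as {coffee} -> {sandwich}, ranked by lift.
itemsets = apriori(onehot, min_support=0.25, use_colnames=True)
rules = association_rules(itemsets, metric="lift", min_threshold=1.0)
print(rules[["antecedents", "consequents", "support", "confidence", "lift"]])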

Neo4j | Qlik Sense Connector

I built a connector that transforms graph database concepts and hierarchies, specifically from Neo4j, into the relational database format used by all major BI tools.

It was integrated with Qlik Sense for a major Slovenian retailer to analyze their graph database data using standard BI tools.
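
The sketch below illustrates one way a graph-to-relational flattening like this can work using the official Neo4j Python driver and pandas; the connection details, node labels, and relationship type are assumptions, not the connector's actual implementation.

# Minimal sketch: flatten Neo4j nodes and relationships into relational tables.
# The URI, credentials, labels, and relationship type are illustrative assumptions.
import pandas as pd
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

NODES_QUERY = "MATCH (c:Category) RETURN id(c) AS category_id, c.name AS name"
EDGES_QUERY = """
MATCH (child:Category)-[:CHILD_OF]->(parent:Category)
RETURN id(child) AS category_id, id(parent) AS parent_id
"""

with driver.session() as session:
    categories = pd.DataFrame([r.data() for r in session.run(NODES_QUERY)])
    hierarchy = pd.DataFrame([r.data() for r in session.run(EDGES_QUERY)])
driver.close()

# Joining the edge table back onto the node table (or walking it recursively)
# yields the flat parent/child structure that BI tools such as Qlik Sense expect.
flat = categories.merge(hierarchy, on="category_id", how="left")
print(flat.head())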

DevOps Pipeline for Automated Product Delivery

I built an Azure CI/CD pipeline that supports project development for a whole ML team for one of their core products.

It automatically tests the code, builds artifacts, and deploys them on production clusters (Spark and Databricks), updating libraries and code when there is the least customer traffic.

Evaluation Dashboard in Qlik Sense

I built the ETL for evaluating the business performance of integrated systems and the dashboards required for generating business insight that helped improve related systems, including stock management, warehouse management, and cargo.

QlikView App Ported to Qlik Sense

I ported a QlikView application with a complex ETL and dashboards to Qlik Sense.

It entailed finding equivalents in Qlik Sense for many major QlikView features, adjusting the ETL so that it fits into the Qlik Sense script editor, and adjusting the visuals to fit the newer engine.

Forecasting API

I built an API that can be used for many different types of time series forecasts.

It supports many parameters, such as weather and holidays (specified by ISO country name), as well as fine-tuning of the automated parameter selection. It can be easily embedded in major ERP solutions: FO, BC, NAV, and Infor. It offers support for statistical algorithms, boosting methods, and classical ML.
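
As a rough sketch of how holiday information can enter such a forecast, the example below builds country-specific holiday features and fits a boosting model; the feature set, the holidays package, and the toy data are assumptions, not the API's actual internals.

# Minimal sketch: country-aware holiday features feeding a boosting forecaster.
# The feature set, libraries, and toy data are illustrative assumptions.
import holidays
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

def make_features(dates: pd.DatetimeIndex, country_code: str) -> pd.DataFrame:
    cal = holidays.country_holidays(country_code)  # e.g., "SI" for Slovenia
    return pd.DataFrame({
        "dayofweek": dates.dayofweek,
        "month": dates.month,
        "is_holiday": [int(d in cal) for d in dates],
    }, index=dates)

# Toy training data: two months of daily sales.
dates = pd.date_range("2021-01-01", periods=60, freq="D")
sales = pd.Series(range(60), index=dates, dtype=float)

model = GradientBoostingRegressor().fit(make_features(dates, "SI"), sales)

future = pd.date_range(dates[-1] + pd.Timedelta(days=1), periods=14, freq="D")
forecast = model.predict(make_features(future, "SI"))
print(forecast[:7])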

Object Detection for Ski Jumps

I helped develop computer vision object detection of a ski jumper using cheap cameras. On top of this, I worked on automatically calculating the jump distance using OpenCV and on porting this functionality to a live video stream.
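
Purely to illustrate the flavor of this kind of pipeline, here is a minimal OpenCV sketch that detects a moving subject in video via background subtraction; the video path and size threshold are assumptions, and the real system additionally required camera calibration to map image coordinates onto jump distance.

# Minimal sketch: detect a moving jumper in video frames via background subtraction.
# The input file and blob-size threshold are illustrative assumptions.
import cv2

cap = cv2.VideoCapture("jump.mp4")  # assumed input video
subtractor = cv2.createBackgroundSubtractorMOG2()

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    # Keep only large moving blobs, presumably the jumper.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for contour in contours:
        if cv2.contourArea(contour) > 500:
            x, y, w, h = cv2.boundingRect(contour)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()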

Languages

Python, SQL, JavaScript, HTML, TypeScript, Batch, Go

Libraries/APIs

Pandas, Azure API Management, NumPy, Scikit-learn, REST APIs, Azure Blob Storage API, XGBoost, CatBoost, Keras, PySpark, OpenCV, Spark ML, PyQt 5

Tools

Qlik Sense, Postman, Azure Machine Learning, Jira, Slack, Microsoft Teams, Microsoft Power BI, Microsoft Excel, Tableau, Azure Key Vault, Azure Application Insights, Azure DevOps Services, Docker Compose, Kibana, BigQuery, IntelliJ IDEA, Apache Airflow

Paradigms

Data Science, ETL, Business Intelligence (BI), REST, DevOps, Azure DevOps, Windows App Development, Desktop App Development

Platforms

Windows, Jupyter Notebook, Databricks, Azure, Azure Functions, Azure SQL Data Warehouse, QlikView, Docker, Amazon, Visual Studio Code (VS Code), MacOS, Dedicated SQL Pool (formerly SQL DW)

Other

Classification, Regression Modeling, Software Engineering, Machine Learning, Azure Databricks, Artificial Intelligence (AI), Time Series, Time Series Analysis, APIs, Data Engineering, ETL Tools, Data Analysis, Complex Data Analysis, Data Analytics, Simulations, Dashboard Design, Data Cleansing, Big Data, Analytics, Data Visualization, Data Modeling, Distributed Systems, Azure Data Lake, SOAP, Deep Neural Networks, Computer Vision, PBI Tools, Neural Networks, IIS, Web Scraping, Statistical Modeling, Association Rule Learning, CI/CD Pipelines, Data Quality Analysis, Data Cleaning, Weather, Videos, Predictive Modeling, Forecasting

Frameworks

Django, Spark

Storage

PostgreSQL, Azure SQL, Data Pipelines, Azure Blobs, Databases, Azure SQL Databases, SQL Server Management Studio (SSMS), PostgreSQL 10, Neo4j, Graph Databases, Azure Cloud Services

Education

2013 - 2021

Master's Degree in Computer Science

University of Ljubljana - Ljubljana, Slovenia

Certifications

OCTOBER 2021 - PRESENT

Microsoft Certified | Azure Data Fundamentals

Microsoft

OCTOBER 2021 - PRESENT

Microsoft Certified | Azure AI Fundamentals

Microsoft

OCTOBER 2021 - PRESENT

Microsoft Certified | Power Platform Fundamentals

Microsoft

JUNE 2021 - PRESENT

Microsoft Certified | Azure Data Scientist Associate

Microsoft
