Henrik Svensson, Developer in Montreal, Canada

Henrik Svensson

Verified Expert in Engineering

Back-end Engineer and Machine Learning Developer

Montreal, Canada
Toptal Member Since
September 8, 2020

Henrik is a machine learning and back-end engineer who primarily uses Python and technologies such as TensorFlow, PyTorch, Flask, Django, and Docker. He enjoys tackling complex projects and working on cloud deployment and databases. Henrik prides himself on creatively solving problems and quickly adapting to new teams and environments.






Preferred Environment

PyTorch, TensorFlow, Python, Visual Studio Code (VS Code), Linux

The most amazing...

...project I've worked on was a behavioral cloning project during Udacity's Self-Driving Car Engineer Nanodegree.

Work Experience

Flask Developer

2023 - 2023
Tad Slaff Consultancy Services
  • Developed a Flask app hosted in AWS Elastic Beanstalk.
  • Built endpoints that fetch data from multiple third-party APIs, handled the OAuth2 flows involved, and structured the code so the client could easily extend it in the future.
  • Connected the app to a database to handle user data and logging validation errors.
Technologies: Flask, Python, AWS Lambda, APIs, Documentation, Amazon S3 (AWS S3), OAuth 2, Federated Sign-in, Amazon EC2, AWS Elastic Beanstalk, Shopify, Google Analytics API, Facebook API, Amazon Cognito
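As an illustration of the kind of OAuth2 plumbing such endpoints involve, here is a minimal sketch of building the authorization URL for an authorization-code flow. The provider URL, client details, and scopes are hypothetical placeholders, not the actual integrations from this project.

```python
from urllib.parse import urlencode

def build_authorization_url(base_url, client_id, redirect_uri, scopes, state):
    """Build the URL a user is redirected to in the OAuth2 authorization-code flow."""
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": " ".join(scopes),
        "state": state,  # random value checked on callback to prevent CSRF
    }
    return f"{base_url}?{urlencode(params)}"

# All values below are illustrative placeholders.
url = build_authorization_url(
    "https://provider.example.com/oauth/authorize",
    client_id="my-app",
    redirect_uri="https://my-app.example.com/callback",
    scopes=["read_orders", "read_products"],
    state="xyz123",
)
print(url)
```

On callback, the server would verify `state` and exchange the returned code for tokens; that exchange is provider-specific and omitted here.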

AI Engineer

2020 - 2022
Enkidoo AI
  • Worked on a service for semi-automatic column mapping: an AI model that used the contents of a CSV file to infer what each column represents, combining deep learning, handcrafted feature engineering, and lookup tables.
  • Worked on forecasting models to predict sales and inventory. Also, led a team of student consultants during this project.
  • Architected and developed a service that pulls tickets and emails from a CRM system, extracts their key content, prioritizes customers who need help quickly or are irate, and alerts the support team.
  • Created and architected an automatic end-to-end migration service between various POS systems, built with Flask.
Technologies: Python, Flask-RESTX, REST, TensorFlow, Machine Learning, SQL, NoSQL, Pandas, Natural Language Toolkit (NLTK), Generative Pre-trained Transformers (GPT), Natural Language Processing (NLP), Forecasting, Redis, Apache Airflow, Apache Beam, Artificial Intelligence (AI), PostgreSQL, Django, Microservices, REST APIs, GraphQL, Architecture, APIs, Leadership, Kubernetes, Docker, NumPy, MySQL, Google BigQuery, Data Science, POS, Machine Learning Operations (MLOps), Data Visualization, SQLAlchemy, FastAPI, Elasticsearch, Amazon S3 (AWS S3), AWS Lambda, Redis Cache, Amazon EC2, Django REST Framework, Deep Learning, Data Analytics, Back-end, Shopify API, SDKs, PIP, Software Packaging, Lightspeed, QuickBooks API, API Integration, JSON, Image Generation, SARIMA, ARIMA, ARIMA Models, LSTM, BigQuery, Cloud, Image Recognition

AI Engineer

2018 - 2020
  • Detected whether a document had been fraudulently physically manipulated while keeping the false positive rate below 0.001% (one error in 100,000 samples).
  • Developed a simple model to detect whether a decent-quality document is present, achieving a true positive rate above 98% with a false positive rate below 0.001%.
  • Extracted and analyzed BI-related information for the other AI engineers.
Technologies: PyTorch, TensorFlow, Python, Computer Vision, Artificial Intelligence (AI), Image Processing, REST APIs, APIs, Machine Learning, Amazon Web Services (AWS), Docker, Pandas, NumPy, Object Detection, Data Visualization, OpenCV, Deep Learning, Data Analytics, Back-end, PIP, JSON

IT Consultant

2018 - 2018
Sigma - Ericsson
  • Developed tools for Ericsson's continuous integration system and added new functionality to help developers produce better code.
  • Tracked and fixed various bugs using Jira as a reporting tool.
  • Maintained tools for Ericsson's continuous integration system.
Technologies: Bash Script, Jenkins, Python, PIP

IT Consultant

2017 - 2018
Sigma - Ascom
  • Implemented a continuous integration workflow for Ascom's message server by simulating Android devices and message loss to ensure 99.999% reliability.
  • Automated short tests on each commit, mainly to test that the code still runs and works properly.
  • Automated a long-running load test to verify the complete system, including Android devices and the message server.
Technologies: Bash Script, Jenkins, Git, Android Debug Bridge, Pytest, Python, Health, PIP, Software Packaging

Software Engineer in Test

2011 - 2014
Saab AB
  • Extended functionality in a control system for an airborne ground/foliage-penetration radar system, mainly enabling developers to use new functionality during field tests.
  • Developed a test system for the JAS Gripen radar using hardware such as a network analyzer to gather data for testing and verifying the next generation of flight radar hardware.
  • Extended and improved functionality of a test rig used for verification tests on large radar systems, including finding bugs and making improvements that saved its users over a week of work.
  • Built test applications that help developers test and verify the functionality and performance of advanced LIDAR systems, including a user interface from which the user could control the whole system.
Technologies: Network Analysis, System Testing, Electronics, LabVIEW, MATLAB, Software Packaging

Intelligent Column Mapping for CSV Data Import

Developed a service for semi-automatic column mapping in CSV files from POS systems. The service was designed to handle multiple types of CSV files, including sales, customers, and items. The AI model used a combination of deep learning, feature engineering, and lookup tables to accurately identify each column's content.

I also created a simple front end using Vue to allow users to interact with the service. I containerized the project using Docker and deployed it to the Google Cloud Platform. In addition, I set up a database to store the lookup table, collected data to build the lookup tables, and used a character-based long short-term memory (LSTM) model to predict unknown values such as brand names and first names.

As the lead developer on this project, I implemented the AI model, designed the front end, and managed the deployment and database setup. This project required a strong understanding of machine learning and web development technologies.
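A toy sketch of the lookup-table part of this approach (the production model also combined deep learning and handcrafted features; the table contents below are made up for illustration):

```python
# Illustrative lookup tables; the real service built these from collected data.
KNOWN_VALUES = {
    "first_name": {"alice", "bob", "carol"},
    "brand": {"nike", "adidas", "puma"},
}

def score_column(values):
    """Return the best-matching column type and the fraction of values matched."""
    cleaned = [v.strip().lower() for v in values if v.strip()]
    best_type, best_score = "unknown", 0.0
    for column_type, known in KNOWN_VALUES.items():
        score = sum(v in known for v in cleaned) / max(len(cleaned), 1)
        if score > best_score:
            best_type, best_score = column_type, score
    return best_type, best_score

print(score_column(["Alice", "Bob", "Dave"]))  # 2 of 3 values match first names
```

Values absent from every table ("Dave" above) are where a character-level model or learned features would take over.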

Sales and Inventory Forecasting

Created a sales and inventory forecasting system using various machine learning models. The product incorporated a simple SARIMA model and other deep learning models. I used Flask-RESTX to build a web API for the project, which was then deployed to the Google Cloud Platform using Kubernetes.

As the lead developer on this project, I led a team of students and implemented and evaluated various machine learning models. This project required a strong understanding of data analysis and machine learning techniques, including Python and TensorFlow.
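The SARIMA and deep learning models themselves are out of scope here, but a seasonal-naive baseline, the kind such models are benchmarked against, illustrates the seasonal structure being exploited. The sales numbers are invented:

```python
def seasonal_naive_forecast(history, season_length, horizon):
    """Forecast each future point as the value one full season earlier.

    A standard baseline that SARIMA-style models must beat to be useful.
    """
    if len(history) < season_length:
        raise ValueError("need at least one full season of history")
    return [history[-season_length + (h % season_length)] for h in range(horizon)]

# One week of made-up daily sales, Monday through Sunday.
weekly_sales = [10, 12, 15, 14, 20, 30, 25]
print(seasonal_naive_forecast(weekly_sales, season_length=7, horizon=3))
```

SARIMA extends this idea by also modeling trend and autocorrelated noise around the seasonal pattern.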

Intelligent Ranking System for Customer Support Tickets

Architected and developed an internal ranking system for customer support tickets that extracts tickets and emails from a CRM system. It uses NLP techniques to extract key content and prioritize customers based on the urgency and emotions expressed in their requests.

The system ranks tickets and alerts the support team to the most pressing and emotionally charged cases. As the lead developer on this project, I handled the design and implementation of the service and integrated it into the CRM system. This project required strong skills in NLP, web development, and data management.
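The production system used proper NLP techniques (NLTK and transformer-based models); purely as an illustration of the ranking idea, a crude keyword-based urgency score might look like this, with made-up term lists:

```python
import re

# Illustrative keyword sets; the real system learned these signals with NLP models.
URGENT_TERMS = {"urgent", "asap", "immediately", "down", "outage"}
IRATE_TERMS = {"unacceptable", "furious", "terrible", "angry"}

def score_ticket(text):
    """Crude urgency score: urgency keywords weighted above emotion keywords."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    return 2 * len(words & URGENT_TERMS) + len(words & IRATE_TERMS)

tickets = [
    "Invoice question about last month",
    "Our store is down, please help immediately, this is unacceptable",
]
ranked = sorted(tickets, key=score_ticket, reverse=True)
print(ranked[0])  # the outage ticket ranks first
```

Swapping `score_ticket` for a trained model keeps the same ranking-and-alerting structure.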

Improving Team Structure and Productivity

Joined the company and discovered a lack of structure that led to issues such as direct pushes to master without clear commit messages, conflicting pushes, and different coding styles. To address these issues, I proposed and implemented a new workflow for the team. This included using feature branches to ensure that code changes were adequately reviewed and tested before merging into the main branch.

I also introduced code reviews to help ensure that code changes were high quality and compliant with the coding style. To achieve consistency in coding style, I suggested using Black, a formatter that ensures all code adheres to PEP8, the official Python coding style guide.

To further improve collaboration within the team, I introduced Scrum, which kept the team on track and focused on its goals. I also spent considerable time on code reviews, providing constructive feedback and tips for improving the code. In this way, I helped improve the development team's overall quality and productivity and ensured that the codebase stayed clean and maintainable. In addition, I set up a private PyPI server that enabled the team to share reusable libraries.

Automatic End-to-end POS System Migration Service

Handled the architecture and development of an automatic end-to-end migration service between various POS systems. The service was designed using a microservices architecture, with separate services for monitoring and data fetching, conversion, and uploading. I developed the data fetching and conversion services, which fetched data related to customers, items, and sales and converted it to our standard format.

I also conducted code reviews and mentored junior team members who developed the data uploading and monitoring services and the API service for communication with the front end. As the primary developer on this project, I played a crucial role in guiding the service development and ensuring its success. This project required a strong understanding of microservices architecture and data management.
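The conversion service's core job, mapping vendor-specific records into a standard schema, can be sketched as follows. The vendor names and field mappings are hypothetical:

```python
# Hypothetical field mappings from two POS vendors to one standard schema.
FIELD_MAPS = {
    "vendor_a": {"CustName": "name", "CustEmail": "email"},
    "vendor_b": {"full_name": "name", "mail": "email"},
}

def convert_customer(record, vendor):
    """Convert one vendor-specific customer record to the standard format."""
    mapping = FIELD_MAPS[vendor]
    return {
        standard: record[source]
        for source, standard in mapping.items()
        if source in record
    }

print(convert_customer({"CustName": "Ada", "CustEmail": "ada@example.com"}, "vendor_a"))
```

Keeping these mappings as data rather than code is one way such a service can support new POS systems without touching the conversion logic.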

Integrated Curbside Pickup POS System

Contributed to an integrated POS system that allowed local businesses to easily incorporate an online store for curbside pickup orders into their current POS system. The system allowed customers to order items online and pick them up without having face-to-face interactions.

This project was developed in response to the COVID-19 pandemic to make shopping safer and more convenient. The POS system was designed to be easy to use for both customers and business owners and required a strong understanding of web development and data management.

Fraudulent Document Detection Using Deep Learning

Built a system to detect fraudulent physical manipulations on documents, specifically punched holes. To maintain a low false positive rate of 0.001%, I collected data and trained and evaluated multiple models using deep neural networks and the Hough transform. I implemented a map to reduce false positives by assigning lower probabilities to common false positive locations.

The first version of the system, which was brought to production, achieved a high true positive rate of around 70% at the desired false positive rate. I also developed a second version using a different machine learning model that achieved even better results, around 90%, but this second version was never fully productionized. I built a Flask service and containerized the project using Docker.
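One way to meet a strict false-positive budget like 0.001% is to calibrate the decision threshold on held-out genuine documents. A simplified sketch of that calibration idea (not the production code), with toy scores and a deliberately loose 10% budget so the arithmetic is visible:

```python
def threshold_for_target_fpr(negative_scores, target_fpr):
    """Lowest threshold such that the fraction of genuine (negative) samples
    scoring strictly above it stays at or below target_fpr."""
    ordered = sorted(negative_scores, reverse=True)
    allowed = int(target_fpr * len(ordered))  # negatives allowed to fire
    if allowed >= len(ordered):
        return min(ordered)
    return ordered[allowed]

# Toy detector scores from 10 genuine documents; a 10% budget lets one fire.
genuine = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1, 0.05]
thr = threshold_for_target_fpr(genuine, target_fpr=0.10)
print(thr)
```

The true positive rate is then whatever fraction of manipulated documents scores above that threshold, which is why tightening the budget to 0.001% makes 70-90% detection a hard target.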

ID Document Detection Using a Convolutional Neural Network (CNN)

Programmed a machine learning model to detect the presence of high-quality ID documents in an early stage of the pipeline. Despite initially having a dataset with more than 10% incorrect labeling, I achieved a true positive rate of 98% and a false positive rate below 0.001%.

To address the mislabeled data, I worked with the annotation team to reannotate the dataset and implemented strategies to overcome the incorrect labels. This project required strong problem-solving skills and the ability to work closely with a team to ensure accurate results.

Integration Testing and Tools Development for Ascom's Message Server

Developed integration tests and tools for Ascom's message server to ensure 99.999% reliability. This included creating a script to simulate a large number of devices sending and receiving high loads using dedicated hardware and a simple Python simulator using multiprocessing. I also remotely controlled several Android devices using the Android Debug Bridge to verify performance and utilized pytest and Python to develop these tests and tools.
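The dedicated-hardware simulator is not reproducible here, but the message-loss idea can be sketched with a small seeded toy simulator. Note this only models raw delivery over a lossy link; in the real system it is the retry and acknowledgment machinery that pushes end-to-end reliability toward 99.999%:

```python
import random

def simulate_delivery(n_devices, n_messages, loss_probability, seed=0):
    """Fraction of messages delivered over a lossy link, across all devices."""
    rng = random.Random(seed)  # seeded so test runs are reproducible
    total = n_devices * n_messages
    delivered = sum(rng.random() >= loss_probability for _ in range(total))
    return delivered / total

print(simulate_delivery(50, 200, loss_probability=0.0))  # lossless link: 1.0
print(simulate_delivery(50, 200, loss_probability=0.5))  # roughly half arrive
```

A pytest-style test can then assert on the delivered fraction for given loss rates, which is the shape the real integration tests took.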

Capstone Data Scientist Nanodegree

Analyzed simulated historical Starbucks data to determine what influences customers to make purchases and the best offers to send them as part of my data science nanodegree at Udacity.

After exploring customer profiles, a transcript of events, and information on available offers, I found that offers were sent out in bursts and that transactions increased after an offer was sent. I also looked at customer spending habits and found that most customers spent around $20 per month, but because of outliers, the mean was much higher at $107, and the median was $72.

I analyzed the available offers and plotted their distribution among customers. Then, I used machine learning to build a model to predict whether a customer would make a purchase. The best-performing model was a gradient-boosting classifier with an F1 score of 0.86. I used this model to make recommendations on which offers to send to different segments of customers.
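The model itself was a standard gradient-boosting classifier; the quoted metric, F1, is just the harmonic mean of precision and recall, computed here from scratch on toy labels:

```python
def f1_score(y_true, y_pred):
    """F1 score for the positive class: harmonic mean of precision and recall."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Toy labels: 2 true positives, 1 false positive, 1 false negative.
print(f1_score([1, 1, 0, 0, 1], [1, 0, 0, 1, 1]))
```

In practice this matches `sklearn.metrics.f1_score` with its default binary averaging.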


Semantic Segmentation for Self-driving Cars

Applied my skills in deep learning and computer vision to build a model that classifies each pixel in an image into one of several categories. This project was a challenging and rewarding experience that was part of Udacity's Self-driving Car Engineer nanodegree.

While working on this project, I used the Python programming language and libraries, such as TensorFlow, to train a CNN to perform semantic segmentation. I worked with real-world data and optimized my model to achieve the best possible performance. This required experimenting with different network architectures, hyperparameters, and techniques, such as data augmentation and regularization.

Throughout the project, I focused on my learning and development, researching and studying various concepts and techniques related to semantic segmentation and applying what I learned to the project. By the end of the project, I developed a deep understanding of this topic and gained the skills and knowledge needed to tackle other challenging problems in the field of self-driving car technology.
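Segmentation models of this kind are typically evaluated with per-class intersection-over-union (IoU); a minimal version on flattened toy label maps shows the metric:

```python
def iou_per_class(pred, target, n_classes):
    """Per-class intersection-over-union for flattened label maps,
    the standard evaluation metric for semantic segmentation."""
    scores = {}
    for c in range(n_classes):
        inter = sum(p == c and t == c for p, t in zip(pred, target))
        union = sum(p == c or t == c for p, t in zip(pred, target))
        scores[c] = inter / union if union else None  # class absent from both
    return scores

# Toy 6-pixel label maps with 3 classes (e.g., road, vehicle, background).
pred = [0, 0, 1, 1, 2, 2]
target = [0, 1, 1, 1, 2, 0]
print(iou_per_class(pred, target, n_classes=3))
```

Averaging the per-class scores gives mean IoU, the single number usually reported for models like this.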
2014 - 2016

Master's Degree in Complex Adaptive Systems

Chalmers University of Technology - Gothenburg, Sweden

MARCH 2021 - MARCH 2023

Professional Data Engineer

Google Cloud


Data Scientist (Nanodegree)

Udacity



Self-Driving Cars (Nanodegree)

Udacity



Libraries/APIs

Keras, Pandas, REST APIs, TensorFlow, PyTorch, NumPy, SQLAlchemy, OpenCV, QuickBooks API, LSTM, Natural Language Toolkit (NLTK), Vue, SpaCy, Scikit-learn, Shopify API, Google Analytics API, Facebook API


Tools

Git, Pytest, BigQuery, Android Debug Bridge, Jenkins, MATLAB, LabVIEW, Flask-RESTX, Apache Airflow, Apache Beam, PyPI, Amazon Cognito


Frameworks

Flask, Django, Django REST Framework, Android SDK, OAuth 2


Languages

Python, Python 3, SQL, Bash Script, JavaScript, GraphQL


Paradigms

REST, Agent-based Modeling, Microservices, Data Science, CRISP-DM, ETL, Scrum


Platforms

Docker, Linux, Visual Studio Code (VS Code), Amazon Web Services (AWS), Kubernetes, AWS Lambda, Amazon EC2, Google Cloud Platform (GCP), Android, AWS Elastic Beanstalk, Shopify


Storage

JSON, Redis, MySQL, Redis Cache, NoSQL, Google Cloud, Databases, PostgreSQL, Elasticsearch, Amazon S3 (AWS S3), Cloud Firestore


Other

Machine Learning, Computer Vision, Artificial Intelligence (AI), APIs, POS, Back-end, PIP, Software Packaging, Lightspeed, API Integration, System Testing, Planning, Deep Learning, Forecasting, Image Processing, Web Scraping, Architecture, Google BigQuery, Object Detection, Data Visualization, Data Analytics, SDKs, SARIMA, ARIMA, ARIMA Models, Cloud, Image Recognition, Image Generation, Robotics, Information Theory, Electronics, Network Analysis, Robot Operating System (ROS), Sensor Fusion, Localization, Recommendation Systems, Funk SVD, ELT, Natural Language Processing (NLP), Health, Scraping, Leadership, Machine Learning Operations (MLOps), FastAPI, Simulations, Mentorship, IT Project Management, Generative Pre-trained Transformers (GPT), Documentation, Federated Sign-in
