Luboš Charčenko, Developer in Brno, South Moravian Region, Czech Republic

Luboš Charčenko

Verified Expert in Engineering

Bio

Lubos is a Python developer and skilled solution architect with 8+ years of experience leading teams in the development of high-load systems. He also worked as a founding engineer and chief architect at the startup Kiwi.com. Lubos excels at MVPs, distributed systems, performance optimization, web scraping, and APIs, and has extensive experience with large database clusters like PostgreSQL, Elastic, and Cassandra. Lubos is driven by the philosophy, "Do it with passion or don't do it at all."

Portfolio

Fintech Company
Python, ScyllaDB, Apache Kafka, Datadog, PagerDuty, TensorFlow, Keras, Pandas...
Kiwi
Python, Node.js, JavaScript, Git, Jira, Flask, ScyllaDB, Docker, Kubernetes...
Kiwi
PostgreSQL, Apache Cassandra, Apache Kafka, RabbitMQ, ScyllaDB, Kubernetes...

Experience

Availability

Part-time

Preferred Environment

Python, Linux, Git, Docker, Amazon Web Services (AWS)

The most amazing...

...product I've created is Nomad at Kiwi.com. It's a unique travel search tool for planning multi-city trips, scanning every travel combination for the best price.

Work Experience

Founder

2021 - PRESENT
Fintech Company
  • Developed a pipeline for consuming and storing every tick from the US stock market, handling around 1 million messages per second.
  • Built a pipeline for preparing various machine learning datasets for training and testing neural networks.
  • Experimented with many neural network architecture types and developed a trading strategy based on convolutional neural networks (CNN) and reinforcement learning (RL).
  • Developed a trading agent that could evaluate the results of a predicting algorithm and redistribute the portfolio every day if necessary.
Technologies: Python, ScyllaDB, Apache Kafka, Datadog, PagerDuty, TensorFlow, Keras, Pandas, NumPy, Celery, WebSockets, Docker, Kubernetes, Redis, GitLab CI/CD, Git, Flask, REST APIs, NoSQL, Convolutional Neural Networks (CNNs), Back-end Development, Architecture, Data Structures, Automation, Back-end, AWS Cloud Architecture, Linux, Software Architecture, GitLab, Distributed Architecture, DevOps, Machine Learning, Deep Reinforcement Learning, Requests, Regular Expressions, Amazon EC2, Amazon Simple Notification Service (SNS), TensorBoard, Jupyter, CSS, HTML, APIs, Amazon Web Services (AWS), API Integration, Data Science, Bash

Chief Automation Officer

2020 - 2020
Kiwi
  • Managed the COVID-19 crisis to minimize the pandemic's impact on the company.
  • Optimized the management of canceled reservations and led the team to develop fully automated refunds in a dramatically short time.
  • Created the departmental strategy followed by reorganization.
  • Set prioritization and project vetting guidelines based on the return on investment (ROI).
  • Handled cross-department and department-to-top-management communication.
Technologies: Python, Node.js, JavaScript, Git, Jira, Flask, ScyllaDB, Docker, Kubernetes, Pandas, NumPy, SciPy, Vault, NoSQL, Web Scraping, Scraping, Back-end Development, Architecture, Data Structures, Relational Data Mapping, Automation, Back-end, AWS Cloud Architecture, Linux, GitLab CI/CD, GitLab, Distributed Architecture, DevOps, REST APIs, Datadog, PagerDuty, Redis, Machine Learning, SQLAlchemy, Google Cloud Platform (GCP), Agile, Proxy Servers, Regular Expressions, Proxies, Amazon EC2, TensorBoard, Jupyter, Chrome, CSS, HTML, Puppeteer, APIs, Amazon Web Services (AWS), Data Science, Software as a Service (SaaS), Technical Leadership, Technical Project Management

Chief Architect

2015 - 2020
Kiwi
  • Established the technical direction for the core teams.
  • Migrated search team services to Docker and created a CI/CD pipeline for building, testing, and deploying to AWS.
  • Led the evolution of a search engine through multiple technologies, including AWS Redshift, Elastic, Cassandra, ScyllaDB, and a custom C++ in-memory database.
  • Implemented asyncio in the search APIs, resulting in more even utilization of resources.
  • Led the service migration to AWS, resulting in a hybrid architecture that used the best of both worlds: bare metal for services needing raw computational power and AWS for those needing security, scalability, elasticity, and availability.
  • Invented the Nomad product from idea to production, a unique travel search tool for planning trips to multiple destinations. Nomad scouts every possible travel combination on a multi-city trip to find the lowest possible prices.
  • Created a team of 10 people and developed a B2B platform called Tequila, all within ten weeks.
  • Served as the main point of contact for impossible problems.
Technologies: PostgreSQL, Apache Cassandra, Apache Kafka, RabbitMQ, ScyllaDB, Kubernetes, Docker, Agile, Software Architecture, AWS Cloud Architecture, Python, Redshift, Node.js, JavaScript, GitLab CI/CD, Asyncio, Google Cloud Platform (GCP), Jira, Git, SQLAlchemy, Flask, Vault, Requests, Rancher, Memcached, NoSQL, Elasticsearch, DevOps, Web Scraping, Scraping, Back-end Development, Data Structures, Architecture, Relational Data Mapping, Automation, Back-end, Linux, GitLab, Distributed Architecture, REST APIs, Datadog, PagerDuty, Redis, Proxy Servers, Beautiful Soup, Regular Expressions, Proxies, Amazon EC2, Chrome, CSS, HTML, Puppeteer, APIs, Amazon Web Services (AWS), Localization, API Integration, Software as a Service (SaaS), Technical Leadership, Bash, Technical Project Management, Scrapy

Founding Developer | Co-owner

2013 - 2015
Kiwi
  • Laid down the technology foundation for future growth and development.
  • Created the first performant version of the search engine, based on a highly optimized PostgreSQL cluster, combining flights into virtual interlining and capable of complex, wide-range searches.
  • Developed a secured reservation API, including integrations with services like payment providers and anti-fraud solutions.
  • Designed and developed a custom distributed modular web-scraping system with a complex scraping planning algorithm.
  • Developed a data pipeline that could merge data from multiple sources according to complex business rules to preprocess flight data before storing it in a database.
  • Architected and built a modular system for handling reservation and post-reservation automation through airline websites, APIs, and global distribution systems (GDS).
  • Onboarded, mentored, and led a team of new developers.
Technologies: Python, PostgreSQL, MongoDB, Redis, NGINX, Flask, Django, SQL, Git, Linux, JavaScript, Ansible, Agile, Software Architecture, Requests, MySQL, DevOps, Web Scraping, Scraping, Back-end Development, Architecture, Data Structures, Relational Data Mapping, Automation, Back-end, Distributed Architecture, REST APIs, Datadog, PagerDuty, SQLAlchemy, Memcached, Proxy Servers, Beautiful Soup, Regular Expressions, Proxies, Jinja, Podio, APIs, Localization, API Integration, Technical Leadership, PostGIS, Bash, Technical Project Management, Scrapy

Full-stack Developer

2013 - 2013
Colectora Software
  • Developed the front- and back-end functionality for customer support, tracking who was working, for how long, and on which cases, and restricting access so only one agent could open a given case at a time.
  • Created custom automated weekly reports from Jira and Git to automatically monitor progress and estimate the cost of features.
  • Deployed my own applications and handled their DevOps.
Technologies: PHP, JavaScript, SQL, Linux, NGINX, Jira, APIs, WebSockets, Git, MySQL, DevOps, Agile, Back-end Development, Data Structures, Back-end, CSS, HTML, Amazon Web Services (AWS), API Integration, Technical Leadership

Lead Back-end Developer

2011 - 2013
MSI International
  • Architected and implemented a custom CI/CD pipeline for firmware, resulting in a 50% faster deployment cycle.
  • Designed a new architecture for a core system and extensively refactored the codebase, resulting in a 70% reduction in new feature development costs within the affected scope.
  • Introduced Git for company-wide version tracking instead of SVN.
  • Grew from a back-end engineer to a department lead in just two years.
  • Led a team of talented developers, which included planning and executing projects, employee reviews, and career ladders.
  • Fostered effective stakeholder communication within the department.
Technologies: Java, MySQL, Linux, Git, DevOps, Back-end Development, Architecture, Back-end, Automation, Technical Leadership

Projects

Special Iframe Proxy

One company I was cooperating with, let's call it Xteam, was selling services from various portals.

Once a customer bought a specific service, Xteam went and created an account for the user within the portal, bought the service, and provided it back to the user with a margin, of course.

However, at some point, Xteam wanted to empower its clients by letting users interact with a portal directly, without handing over credentials for some services.

Lastly, some portals had just one login shared by all customers. I designed and built a proxy with multiple functions to solve these issues.

Proxy Functions:
• It allowed the opening of nearly any page through an iframe by adjusting the headers.
• It could perform hidden automated actions, like login without showing credentials to the end user.
• It rerouted all the links back to itself with mapping to the original links, so the proxy was aware of all the traffic from the site opened through an iframe. That way, it could restrict actions within portals and hide or change some parts of the portal, like filtering only cases related to a specific user for single account portals.
• It logged interactions and could propagate changes back to the main system.
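The two core mechanics above, dropping frame-blocking headers and rerouting links back through the proxy, can be sketched in a few lines. This is a purely illustrative, stdlib-only sketch; all domain names are hypothetical and a real proxy would also handle cookies, POSTs, and scripts.

```python
import re
from urllib.parse import quote, urljoin

# Headers that prevent a page from loading inside an iframe; the proxy
# drops them before forwarding the upstream response to the browser.
FRAME_BLOCKING_HEADERS = {"x-frame-options", "content-security-policy"}

def clean_headers(upstream_headers):
    """Return response headers with frame-blocking entries removed."""
    return {k: v for k, v in upstream_headers.items()
            if k.lower() not in FRAME_BLOCKING_HEADERS}

def rewrite_links(html, upstream_base, proxy_base):
    """Reroute every href/src back through the proxy, keeping a mapping
    to the original URL in a `target` query parameter so the proxy sees
    all traffic and can restrict or filter it."""
    def repl(match):
        attr, url = match.group(1), match.group(2)
        absolute = urljoin(upstream_base, url)  # resolve relative links
        return f'{attr}="{proxy_base}?target={quote(absolute, safe="")}"'
    return re.sub(r'(href|src)="([^"]+)"', repl, html)

page = '<a href="/cases?id=42">My case</a>'
print(rewrite_links(page, "https://portal.example.com",
                    "https://proxy.example.com/fetch"))
```

Because every click now lands on the proxy first, filtering a shared-login portal down to one user's cases becomes a matter of inspecting the decoded `target` before forwarding.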

PostgreSQL-based Multi-leg Flight Search

I designed and developed a version of a flight-combining algorithm based on PostgreSQL triggers.

After a new flight is added or an existing one is updated, the database automatically checks the prices and validity of the affected combinations, generating new ones or deleting invalid ones as needed.

At the time, the database was processing 2,000 updates per second, which cascaded into tens of thousands of updates every second on top of 300 million existing combinations.

For the search, I used cascaded replication across multiple search-optimized PostgreSQL nodes.
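The trigger logic can be illustrated with an in-memory Python sketch: on every insert or update, combinations touching the changed flight are dropped and regenerated. The connection-window constants and the two-leg limit are simplifying assumptions for illustration, not the production rules.

```python
from dataclasses import dataclass

@dataclass
class Flight:
    id: int
    origin: str
    dest: str
    departs: int   # minutes on a simplified timeline
    arrives: int
    price: float

MIN_CONNECT, MAX_CONNECT = 30, 720   # assumed layover window in minutes

flights = {}   # id -> Flight
combos = {}    # (leg1_id, leg2_id) -> total price

def _valid(a, b):
    """Two legs combine if the cities match and the layover fits the window."""
    layover = b.departs - a.arrives
    return a.dest == b.origin and MIN_CONNECT <= layover <= MAX_CONNECT

def upsert_flight(f):
    """Mimic an AFTER INSERT OR UPDATE trigger: refresh every combination
    touching this flight, creating new ones and deleting invalid ones."""
    flights[f.id] = f
    for key in [k for k in combos if f.id in k]:   # drop stale combos
        del combos[key]
    for other in flights.values():                  # regenerate both roles
        if other.id == f.id:
            continue
        if _valid(f, other):
            combos[(f.id, other.id)] = f.price + other.price
        if _valid(other, f):
            combos[(other.id, f.id)] = other.price + f.price
```

A price update on one leg cascades into all combinations containing it, and a schedule change that breaks a connection deletes the combination, mirroring the behavior described above.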

Automated Trading System with Reinforcement Learning and Convolutional Neural Networks

I developed a portfolio management system using CNN and RL. The system handles over 1 million updates per second with stateful workers grouping data for aggregated representations, which are then stored in ScyllaDB with over 10 billion rows and growing.

Tech: Python, Kafka, ScyllaDB, Celery, Redis, Kubernetes, Datadog
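The stateful grouping step can be sketched as a worker that folds raw ticks into per-symbol, per-second aggregates before writing to storage. This is a minimal illustration; `sink` is a stand-in for the ScyllaDB writer, and the OHLCV shape of the aggregate is an assumption.

```python
class TickAggregator:
    """Stateful worker that folds raw ticks into per-symbol, per-second
    aggregates, so storage sees bounded rows per second instead of a
    million raw messages."""

    def __init__(self, sink):
        self.sink = sink   # stand-in for the database writer
        self.bars = {}     # (symbol, second) -> aggregate dict

    def on_tick(self, symbol, ts, price, size):
        key = (symbol, int(ts))
        bar = self.bars.get(key)
        if bar is None:
            self.bars[key] = {"open": price, "high": price, "low": price,
                              "close": price, "volume": size}
        else:
            bar["high"] = max(bar["high"], price)
            bar["low"] = min(bar["low"], price)
            bar["close"] = price
            bar["volume"] += size

    def flush(self):
        """Write completed aggregates to storage and reset local state."""
        for (symbol, second), bar in self.bars.items():
            self.sink.append((symbol, second, bar))
        self.bars.clear()
```

In a real deployment each worker would own a Kafka partition so ticks for one symbol always reach the same stateful instance.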

A pipeline crunches the data into 4D feature representations, producing historical and real-time datasets for convolutional neural networks and deep reinforcement learning.

Tech: Python, Pandas, NumPy
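One common way to obtain such 4D tensors is to slide a window over a (time, assets, features) array, which can be sketched as follows. The axis layout and dimensions here are illustrative assumptions, not the actual feature design.

```python
import numpy as np

def build_windows(series, window):
    """Slice a (time, assets, features) array into overlapping windows,
    yielding a 4D dataset of shape (samples, window, assets, features)
    suitable as CNN input."""
    samples = series.shape[0] - window + 1
    return np.stack([series[i:i + window] for i in range(samples)])

# hypothetical sizes: 100 time steps, 8 assets, 5 features, window of 16
raw = np.random.rand(100, 8, 5)
dataset = build_windows(raw, 16)
print(dataset.shape)  # (85, 16, 8, 5)
```

The same function serves both regimes: run over archived data it yields the historical training set, and run over the most recent `window` rows it yields the live inference input.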

I tried many different architectures, with the best based on ideas from CSPDarknet53. I implemented deep reinforcement learning with parallel independent agents training a single network that handles the whole portfolio at once and distributes capital optimally between investing and hedging.

A secure trading agent works with the predictions and communicates with a broker API; it distributes capital between multiple algorithms and works with multiple trading accounts, with monitoring, alerting, and an emergency kill switch via SMS.

Tech: EC2, IB API, AWS 2-way SMS, PagerDuty, Datadog, Python, Asyncio, Pandas, NumPy, Matplotlib, TensorBoard, TensorFlow, Keras, Jupyter

Reverse-engineering Websites and Fighting Anti-scraping

I've written about 200 functional scraping modules using various technologies.

Sometimes websites fight back with CAPTCHAs, IP blocking, and even the recording of mouse movements or exact click locations on elements.

Fighting those protections while keeping modules performant and minimizing traffic pushes my creativity to an entirely different level.

Technology Stack: Under NDA

Low Code Web Automation

I invented software that enables complex web interactions without any coding, offloading work from the engineering team.

The system uses precoded elements that extract data or interact with the page and can be dragged and dropped into a site automation plan. The system is designed to bypass CAPTCHAs and other anti-scraping techniques.
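A drag-and-drop plan of precoded elements typically compiles down to a declarative list of steps executed by a runner, which can be sketched like this. All element names and the plan format are illustrative assumptions; the real system is under NDA.

```python
# Registry of "precoded elements": small named actions a non-programmer
# can drag and drop into an automation plan. Names are hypothetical.
ELEMENTS = {}

def element(name):
    def register(fn):
        ELEMENTS[name] = fn
        return fn
    return register

@element("open")
def open_page(ctx, url):
    ctx["page"] = url   # a real system would drive a browser here

@element("fill")
def fill_field(ctx, field, value):
    ctx.setdefault("form", {})[field] = value

@element("extract")
def extract(ctx, field):
    ctx.setdefault("results", []).append(ctx.get("form", {}).get(field))

def run_plan(plan):
    """Execute a declarative plan: a list of {'do': name, ...args} steps
    sharing one mutable context, so elements can build on each other."""
    ctx = {}
    for step in plan:
        step = dict(step)
        action = ELEMENTS[step.pop("do")]
        action(ctx, **step)
    return ctx

plan = [
    {"do": "open", "url": "https://example.com/login"},
    {"do": "fill", "field": "user", "value": "alice"},
    {"do": "extract", "field": "user"},
]
ctx = run_plan(plan)
```

Because the plan is plain data, it can be produced by a visual editor and versioned or validated without touching code.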

Technology Stack: Under NDA
Education

2007 - 2011

High School Diploma in Computer Science

European Polytechnic Institute - The Czech Republic

Libraries/APIs

REST APIs, TensorFlow, SQLAlchemy, Beautiful Soup, Keras, Pandas, NumPy, Requests, Puppeteer, Node.js, SciPy, Asyncio, Interactive Brokers API, Matplotlib, Protobuf

Tools

GitLab CI/CD, GitLab, Git, NGINX, Celery, Jira, RabbitMQ, Ansible, Vault, Amazon Simple Notification Service (SNS), TensorBoard, Jupyter, Podio

Languages

Python, SQL, CSS, HTML, Bash, JavaScript, PHP, Java

Platforms

Linux, Docker, Amazon Web Services (AWS), Kubernetes, Apache Kafka, PagerDuty, Rancher, Amazon EC2, Google Cloud Platform (GCP)

Storage

PostgreSQL, Datadog, NoSQL, Redis, ScyllaDB, Elasticsearch, MySQL, Memcached, Database Replication, PostGIS, Redshift, MongoDB

Frameworks

Flask, Django, Jinja, Scrapy, Chrome

Paradigms

DevOps, Agile, Automation

Other

Software Architecture, AWS Cloud Architecture, Distributed Architecture, Web Scraping, Scraping, Back-end Development, APIs, Back-end, Architecture, API Integration, Technical Leadership, Technical Project Management, Apache Cassandra, WebSockets, Machine Learning, Convolutional Neural Networks (CNNs), Proxy Servers, Relational Data Mapping, Data Structures, Localization, Data Science, Software as a Service (SaaS), Deep Reinforcement Learning, Triggers, SSL Certificates, Regular Expressions, Proxies, Iframes
