Nika Dogonadze

Data Scientist and Developer in Tbilisi, Georgia

Member since November 4, 2017
Nika has over five years of experience working as a freelance developer, specializing in Python, web scraping, data engineering, and machine learning. He has a bachelor's degree in computer science and mathematics, together with extensive experience scraping data from multiple sources, such as websites, proprietary file formats, and various types of databases. Nika is personable, communicates exceptionally well, and has worked with small-to-medium-sized teams.


  • The Story Market
    Python 3, Apache Airflow, Docker, Docker Compose, Splash, Scrapy, MongoDB...
  • Bar-All
    Python 3, XML, MVC Design, Web, Odoo, Apps, PostgreSQL, APIs
  • Marketsonics LLC
    Amazon Web Services (AWS), Bash, PostgreSQL, Terraform, ETL, CSV, Slack App...


Tbilisi, Georgia



Preferred Environment

Jupyter Notebook, VS Code, PyCharm, IntelliJ IDEA, Windows, Linux

The most amazing...

...project I've created is a state-of-the-art face forgery detection model.


  • Senior Data Engineer

    2022 - PRESENT
    The Story Market
    • Successfully deployed the entire ETL job infrastructure on AWS using containerized Apache Airflow.
    • Refactored existing data handling code, making it 8x smaller.
    • Developed a framework on top of Apache Airflow to reduce the cost of adding new publishers to The Story Market network.
    Technologies: Python 3, Apache Airflow, Docker, Docker Compose, Splash, Scrapy, MongoDB, AWS, AWS S3, AWS EC2, AWS Lambda, AWS RDS, PostgreSQL, Pandas
  • Software Developer

    2021 - 2022
    Bar-All
    • Successfully led the Odoo application database migration for the upgrade process from version 13 to version 14.
    • Refactored and upgraded legacy Odoo applications to make them work in a more modern Odoo 14 environment.
    • Implemented the Odoo application customization to make day-to-day business operations seamless and less error-prone.
    Technologies: Python 3, XML, MVC Design, Web, Odoo, Apps, PostgreSQL, APIs
  • Data Engineer

    2020 - 2022
    Marketsonics LLC
    • Refactored existing ETL Python code, achieving a 10x reduction in the total lines of code, vastly improving readability and maintainability.
    • Designed and implemented a custom cloud architecture for handling specific ETL workloads efficiently and cheaply using Apache Airflow, AWS Batch, and Docker.
    • Set up automatic alerts in case of any failure during periodic jobs.
    Technologies: Amazon Web Services (AWS), Bash, PostgreSQL, Terraform, ETL, CSV, Slack App, Bitbucket, Apache Airflow, AWS, Docker, Python 3
  • Full-stack Developer | Machine Learning Engineer

    2019 - 2020
    inovex GmbH
    • Developed a fully functional image-captioning service with a team of four, including a web page and a highly available REST API.
    • Implemented and trained an image captioning model from scratch based on the latest research papers in the field.
    • Designed the microservice architecture for the image captioning service and its deployment on the Google Cloud Platform (GCP).
    • Implemented continuous integration and deployment pipelines for all the microservices, using GitLab enterprise tools.
    • Wrote a blog post about project details, including information on technologies and management methodologies.
    • Established Google Cloud Budget alerts to automatically monitor and notify team members about possible budget overruns.
    Technologies: gRPC, Terraform, Google Cloud Platform (GCP), Helm, Kubernetes, Docker, Vue.js, JavaScript, PyTorch, Flask, Python
  • Master Course in Foundations of Data Engineering Tutor

    2019 - 2020
    Technical University of Munich
    • Conducted tutorial sessions with students explaining the most important aspects of the lecture and answering questions.
    • Held study sessions for students who needed individual help with the lectures and assignments.
    • Took part in grading assignments and final exams.
    Technologies: Spark, Scala, C++, Bash
  • Senior Software Developer | Data Scientist

    2016 - 2020
    • Developed a framework for creating and deploying dialog systems (chatbots) on Facebook Messenger.
    • Implemented numerous chatbots, the best of which had more than a million interactions and a second-day retention rate of 20%.
    • Created a graph-based web interface for assembling custom chatbots by entering dialog text, with no coding required.
    • Wrote a chatbot for helping citizens wrongly fined by the police in Georgia. It would query users about the circumstances of the offense and automatically generate a personalized appeal PDF for submission to the court.
    • Developed a platform of custom real-time data analytics for a retail chain with more than 250 stores throughout Tbilisi.
    • Implemented a recommendation system for a large retail business. It would automatically generate special offers and gifts for loyal customers, using machine learning tools.
    • Created a web scraper to continuously gather publicly available data about all the parking tickets written in Tbilisi.
    • Designed and implemented a data analytics web page covering all the parking tickets in Tbilisi, including a heatmap, other types of charts, and textual analysis.
    • Built an API for guessing a person's nationality based on their first and last name, using deep natural language processing.
    • Built a REST API for transliterating from English to Georgian, using Flask and SQL.
    Technologies: Amazon Web Services (AWS), BigQuery, Google Cloud Platform (GCP), AWS, Flask, Django, Spark, Pandas, Kotlin, Scala, JavaScript, Java, SQL, Python
  • Python Developer

    2019 - 2019
    Elasticiti Inc
    • Refactored the existing Apache Airflow project to remove code duplication and make it easily extensible.
    • Examined existing ETL pipelines and tracked and fixed bugs.
    • Implemented, tested, and deployed a new ETL pipeline for handling daily updated raw client data.
    • Created a detailed wiki markdown documentation about my work and other parts of the existing project.
    Technologies: Snowflake, Markdown, Amazon Web Services (AWS), JSON, Apache Airflow, SQL, Requests, Pandas, Python
  • Senior Software Developer | Web Automation Engineer

    2018 - 2018
    The Rundown
    • Developed a scraping tool for automatically gathering live data from various sports-betting websites.
    • Implemented a REST API using Flask to make the sports-betting data easily accessible for other services.
    • Built automated testing/integration tools for easy and painless software updates.
    • Conducted daily stand-up meetings for all the developers to sync up and plan the following day.
    • Developed a REST API to automatically and instantly place bets on various sports betting websites.
    • Implemented an algorithm to gather all the available sports-betting data and find and place profitable bets.
    Technologies: Amazon Web Services (AWS), Docker Compose, Docker, AWS, MySQL, Spring, Flask, Scrapy, Requests, HTTP, Selenium, Java, Python
  • Software/Web Scraping Engineer

    2016 - 2018
    • Implemented a data aggregation framework that would collect, process, and extract daily data about various online game stats to help the client (Jabre Capital Partners) with stock market trading decisions.
    • Developed a desktop GUI program to monitor eCommerce shops and automatically purchase rare items when they came back in stock. The websites were Amazon, Walmart, Best Buy, and The Source.
    • Built a microframework in Python for writing auto-trader robots for cryptocurrencies, using the Coinigy API.
    • Wrote a Python program to use the AWS API and automatically manage tags for all the resources.
    • Developed a web crawler for collecting and storing discount coupon codes.
    • Created a tool for exporting slides as JPEG images from Microsoft PowerPoint using Python, Unoconv, and Convert.
    • Rewrote a large numerical analysis project from Visual Basic to modern Python 3 with careful testing to guarantee the same output.
    • Implemented and deployed a custom trading strategy on Python 3 and the Coinigy platform.
    • Wrote more than 20 small web-scraping programs using Python.
    Technologies: Amazon Web Services (AWS), PyQt 5, AWS, Spark, Scala, Selenium, SciPy, Scikit-learn, Pandas, Django, Scrapy, Python


  • Deep Face Forgery Detection

    I developed a deep learning-based tool for automatically detecting human face forgeries in videos and single frames. I built and trained the model from scratch based on theoretical insights from the latest research papers in the field. I tried multiple approaches, from regular neural networks to LSTMs with 3D convolutional networks as the encoder.

    A detailed description is available in the accompanying paper.

  • Image Captioning Service (Web and API)

    I developed a microservice-based image captioning service with a team of four. I was involved in all parts of the project, from the initial design of the services to implementing each of them separately.

    The project was developed entirely with Agile methodologies, and my role was to oversee the whole development process and take part in the actual coding and implementation.


  • Languages

    SQL, Python, Java, Kotlin, Python 3, Bash, JavaScript, C, Markdown, R, Scala, Snowflake, C++, XML
  • Frameworks

    Selenium, Spark, Flask, Scrapy, Django REST Framework, Django, Apache Spark, gRPC, Spring
  • Libraries/APIs

    Beautiful Soup, PyMongo, Pandas, NumPy, Requests, Node.js, Setuptools, Scikit-learn, PyQt 5, Spark ML, Apiary API, PyTorch, Google Cloud API, Vue.js, PhantomJS, Django ORM, TensorFlow, Twitch API, PyQt, SciPy, Matplotlib, Pillow
  • Tools

    Scraping Hub, GitHub, GitLab, Jupyter, JetBrains, GitLab CI/CD, Spark SQL, Tableau, Git, Apache Airflow, Google Kubernetes Engine (GKE), Terraform, BigQuery, IntelliJ IDEA, VS Code, Bitbucket, Helm, Docker Compose, Unoconv, Boto 3, Seaborn, PyCharm, MATLAB, Odoo
  • Paradigms

    Data Science, ETL, REST, Unit Testing, Agile, Testing, MVC Design
  • Platforms

    Jupyter Notebook, Docker, Google Cloud Platform (GCP), Spark Core, AWS EC2, AWS Lambda, Linux, Amazon Web Services (AWS), Kubernetes, Azure, ConvertKit, Linux CentOS 7, Android, Windows, Google Cloud SDK, Web, Splash
  • Storage

    AWS S3, Databases, Google Cloud Storage, NoSQL, PostgreSQL 10.1, Redis, MySQL, MongoDB, Google Cloud, PostgreSQL, JSON
  • Other

    Scraping, Data Scraping, Screen Scraping, Store Scraping, Natural Language Processing (NLP), Site Bots, Bots, Networks, HTTP, Supervised Learning, Classification Algorithms, Classification, Predictive Learning, Algorithms, Clustering Algorithms, Statistics, Software Development, Big Data, Machine Vision, Web Scraping, APIs, Deep Learning, Kubernetes Operations (Kops), Google BigQuery, Architecture, Data Engineering, Serverless, API Applications, Parquet, AWS, Machine Learning, HTTPS, Unsupervised Learning, Regression, Text Classification, Heuristic & Exact Algorithms, Optimization Algorithms, Genetic Algorithms, Mathematics, Computer Science, Artificial Intelligence (AI), Machine Learning Automation, ETL Tools, ETL Development, Slack App, CSV, Google Cloud Functions, Information Theory, Data Compression Algorithms, Cloud, Image Compression, Video Compression, Research, MLflow, Image Captioning, Apps, AWS RDS


  • Master's Degree in Data Engineering and Analytics
    2018 - 2020
    Technical University of Munich - Munich, Germany
  • Nanodegree in Data Analytics
    2017 - 2017
    Udacity
  • Bachelor's Degree in Computer Science
    2013 - 2017
    Free University of Tbilisi - Tbilisi, Georgia


  • Deep Learning Specialization
    JUNE 2018 - PRESENT
  • Data Analyst
    British Council
