Irakli Gugushvili, Developer in Tbilisi, Georgia

Irakli Gugushvili

Verified Expert in Engineering

Python Developer

Location
Tbilisi, Georgia
Toptal Member Since
May 23, 2018

Irakli is a Python developer with seven years of experience in multiple industries. He started as a machine learning developer, expanded into a range of back-end technologies, and became an expert in web scraping. Irakli is also proficient in cloud technologies and currently freelancing as a senior Python cloud engineer and AWS and Azure architect. His industry experience is backed by a bachelor's degree in math and computer science.

Portfolio

HUB Security
Python, Django, Amazon Web Services (AWS), Google Cloud Platform (GCP)...
Staunch Technologies (via Toptal)
Amazon Web Services (AWS), Python, Azure
Olmait
Python, Flask, Azure

Experience

Availability

Part-time

Preferred Environment

PyCharm, Windows, Visual Studio Code (VS Code), Python, Git, Ubuntu

The most amazing...

...thing I've developed is my stock price prediction model using LSTM.

Work Experience

Senior Python Back-end Engineer

2021 - PRESENT
HUB Security
  • Created the back-end infrastructure for the penetration testing platform using Django.
  • Developed functionalities to create, run, and monitor bots with predefined attack scripts.
  • Extended platform functionality by adding the ability for users to create, test, run, and monitor attacks manually.
  • Implemented a bot geolocation functionality into the system.
  • Integrated Oracle Cloud as the new cloud provider.
Technologies: Python, Django, Amazon Web Services (AWS), Google Cloud Platform (GCP), Oracle Cloud

Senior Python Cloud Engineer | AWS and Azure Architect

2019 - 2023
Staunch Technologies (via Toptal)
  • Developed a system where a root user can monitor sub-users' AWS activity.
  • Created a website for attaching Amazon Virtual Private Cloud (VPC) to an Amazon EC2 server without public access.
  • Built and developed a system for monitoring Azure users' activities.
Technologies: Amazon Web Services (AWS), Python, Azure

Senior Python Engineer

2021 - 2021
Olmait
  • Created an ETL pipeline for the recommendation engine using Azure Functions.
  • Implemented back-end functionality for the recommendation engine using Flask.
  • Integrated FAISS to perform vector comparisons for better recommendations.
Technologies: Python, Flask, Azure

Web Scraping Expert

2019 - 2021
Explorium
  • Developed many projects for scraping publicly available data using Scrapy, Selenium, Requests, BeautifulSoup, and other scraping technologies.
  • Created a fully functional pipeline for URL data collection, scraping, parsing, and storage.
  • Built a system that would run periodically and check the status of multiple website scrapers.
Technologies: Web Scraping, Python

Python Developer

2019 - 2020
Treehouse Technology Group (via Toptal)
  • Developed an email receiver and parser system deployed on AWS. It receives an email, parses its body, and calls different APIs depending on its content.
  • Created a data analyzer API responsible for running the analysis.
  • Developed a data prediction API responsible for generating predictions.
Technologies: Amazon Web Services (AWS), Flask, Python

Data Scraping Engineer

2019 - 2019
Yipit (via Toptal)
  • Developed many projects for scraping publicly available data.
  • Rewrote old scrapers using a new approach to increase coverage.
  • Created tester functionality to compare different scraping script coverage.
Technologies: Web Scraping, Python

Machine Learning Developer

2017 - 2019
Neiron
  • Created a model for stock market price predictions.
  • Built a backtesting environment for predictions in the Lean engine.
  • Developed a paper trading (simulation) system using an Interactive Brokers server and Python API.
  • Constructed a sentiment analysis tool to increase prediction accuracy.
Technologies: Machine Learning, Python

Programmer

2016 - 2017
DoSo
  • Implemented different logic for insurance clients.
  • Developed and deployed a newer version of a reinsurance model to help clients.
  • Conducted tests for every model of insurance logic.
  • Developed a helper web application using ASP.NET MVC.
Technologies: ASP.NET MVC, C#

Programmer

2015 - 2016
Bank of Georgia
  • Implemented different types of logic in SQL for internal use.
  • Developed an application in Java for bank employees.
  • Tested the bank's functionalities on the application and database side.
Technologies: SQL

Penetration Testing Platform

We built a penetration testing platform where users could run and monitor predefined attacks, or create custom attacks manually and test, run, and monitor them. We used Django for the back end, PostgreSQL for the database, AWS for deployment and standalone asynchronous tools, GCP for creating and running bots, and Elasticsearch for logs.
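As a rough sketch of the attack lifecycle described above (all names, fields, and the sample script are hypothetical, not the production Django models):

```python
from dataclasses import dataclass, field
from enum import Enum


class AttackState(Enum):
    CREATED = "created"
    RUNNING = "running"
    FINISHED = "finished"


@dataclass
class Attack:
    """A predefined or user-created attack and its lifecycle state."""
    name: str
    script: str
    state: AttackState = AttackState.CREATED
    log: list = field(default_factory=list)

    def run(self):
        self.state = AttackState.RUNNING
        self.log.append(f"running {self.name}")

    def finish(self):
        self.state = AttackState.FINISHED
        self.log.append(f"finished {self.name}")


attack = Attack(name="port-scan", script="scan ports 1-1024")
attack.run()
attack.finish()
print(attack.state.value)  # finished
```

In the real platform, objects like these would be Django models persisted in PostgreSQL, with runs dispatched asynchronously and monitored through the logs in Elasticsearch.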

Azure Monitoring System

We developed this system to monitor Azure users' activities, using combinations of built-in and custom policies to check compliance. In the case of non-compliance, the system alerted the owner to resolve it; we could also enforce compliance on our end if needed. We used Azure Logic Apps to start the flow, Azure Functions to execute the logic, Azure Queue Storage to communicate between functions, and Azure Table Storage to save the data.
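The policy-checking step can be illustrated with a minimal pure-Python sketch. The policy names and resource shape here are invented for illustration; the real system evaluated Azure policies inside Azure Functions:

```python
def check_compliance(resource: dict, policies: list) -> list:
    """Return the names of all policies the resource violates."""
    return [p["name"] for p in policies if not p["check"](resource)]


# Hypothetical policies: each pairs a name with a predicate over the resource.
policies = [
    {"name": "encryption-enabled", "check": lambda r: r.get("encrypted", False)},
    {"name": "tagged-owner", "check": lambda r: "owner" in r.get("tags", {})},
]

resource = {"id": "vm-01", "encrypted": True, "tags": {}}
violations = check_compliance(resource, policies)
print(violations)  # ['tagged-owner']  -> alert the owner, or enforce directly
```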

AWS Monitoring System

This system allowed a root user to monitor the activities of sub-users. It ran at a set interval, collected sub-users' activities, and enforced compliance if needed. We used several Lambda functions, SQS for communication, DynamoDB for saving the data, Elasticsearch for logs, Jenkins for CI/CD, and Terraform for infrastructure as code.
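The queue-based hand-off between the collector and enforcer functions can be sketched with the standard library's queue standing in for SQS (user names and the allowed-action set are hypothetical):

```python
import queue

activity_queue = queue.Queue()  # stands in for SQS between Lambda functions


def collect(sub_users: dict) -> None:
    """Collector: push each sub-user's recent activity onto the queue."""
    for user, actions in sub_users.items():
        activity_queue.put({"user": user, "actions": actions})


def enforce(allowed: set) -> list:
    """Enforcer: drain the queue and flag actions outside the allowed set."""
    flagged = []
    while not activity_queue.empty():
        msg = activity_queue.get()
        bad = [a for a in msg["actions"] if a not in allowed]
        if bad:
            flagged.append((msg["user"], bad))
    return flagged


collect({"alice": ["s3:GetObject"], "bob": ["iam:CreateUser"]})
flagged = enforce(allowed={"s3:GetObject"})
print(flagged)  # [('bob', ['iam:CreateUser'])]
```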

Recommendation Engine

The project involved creating a recommendation engine for a video streaming platform. I created the entire infrastructure: ETL using Azure Functions, a back-end API using Flask, and, finally, vector comparison and prediction using FAISS.
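FAISS accelerates this kind of nearest-neighbor search at scale; the underlying vector comparison can be illustrated with a brute-force cosine-similarity sketch (the item IDs and vectors are made up):

```python
import math


def cosine(a: list, b: list) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


def recommend(query: list, catalog: list, k: int = 2) -> list:
    """Return the k item IDs whose vectors are most similar to the query."""
    ranked = sorted(catalog, key=lambda item: cosine(query, item[1]), reverse=True)
    return [item_id for item_id, _ in ranked[:k]]


catalog = [("film-a", [1.0, 0.0]), ("film-b", [0.9, 0.1]), ("film-c", [0.0, 1.0])]
top = recommend([1.0, 0.05], catalog)
print(top)  # ['film-a', 'film-b']
```

FAISS replaces the exhaustive sort with indexed approximate search, which is what makes the comparison fast over large catalogs.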

Connector (via Toptal)

We created this website to attach a VPC with an NLB and target group (TG) setup to an EC2 instance with no public access, which made using the server much safer. I focused mainly on the back end and implemented the entire attach/detach pipeline in AWS Step Functions. We used React with an API Gateway on the front end, Jenkins for CI/CD, and Terraform for infrastructure as code.
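A heavily trimmed, hypothetical Amazon States Language definition for an attach flow like this might look as follows (the state names are illustrative, and REGION/ACCOUNT are placeholders, not the deployed machine):

```python
# Hypothetical Step Functions state machine, expressed as a Python dict
# that would be serialized to JSON (Amazon States Language).
attach_pipeline = {
    "StartAt": "CreateTargetGroup",
    "States": {
        "CreateTargetGroup": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:REGION:ACCOUNT:function:create-tg",
            "Next": "AttachNlbToVpc",
        },
        "AttachNlbToVpc": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:REGION:ACCOUNT:function:attach-nlb",
            "Next": "Done",
        },
        "Done": {"Type": "Succeed"},
    },
}
```

Each `Task` state invokes a Lambda function, and the `Next` pointers chain the attach steps into one auditable pipeline.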

Public Data Scraper

This project consisted of scraping all kinds of publicly available data using Scrapy. I created the pipeline and all the different spiders for different data sources. The main challenge for static websites was parsing different pages; for dynamic websites, it was simulating a JavaScript request that loaded the data. We used Scrapinghub (now Zyte) for deployment and AWS S3 with DynamoDB to save the data.
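For static pages, the parsing step boils down to walking the HTML and pulling out the target elements. Here is a minimal stand-in using only the standard library's `html.parser` (the real spiders used Scrapy; the tag, class name, and sample page are invented):

```python
from html.parser import HTMLParser


class TitleSpider(HTMLParser):
    """Minimal static-page parser: collect text inside <h2 class="title"> tags."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        if tag == "h2" and ("class", "title") in attrs:
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "h2":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.titles.append(data.strip())


page = '<h2 class="title">Item one</h2><p>body</p><h2 class="title">Item two</h2>'
spider = TitleSpider()
spider.feed(page)
print(spider.titles)  # ['Item one', 'Item two']
```

For dynamic sites, this step is replaced by reproducing the JavaScript request that loads the data, then parsing the JSON response directly.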

Data Analysis and Prediction System (via Toptal)

In this project, I created a data analyzer and prediction system for different kinds of data using various tools. I then created an API using Flask-RESTful and wrapped the system into it.

RedString Bookmarking System (via Toptal)

This is a bookmarking system that saves URLs and their locations. We accomplished this by using Google's geolocation and weather data that was scraped and analyzed using the IBM text analyzer. We used Flask deployed on AWS Elastic Beanstalk as the primary technology.

Email Receiver/Parser System (via Toptal)

This system can receive an email, parse it, and call different APIs depending on its content. We deployed it on AWS using SES for notifications, S3 for saving the emails, RDS for saving logs, and Lambda for parsing and calling the API.
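The parse-and-dispatch step can be sketched with the standard library's `email` module (the routing keywords, API names, and sample message are hypothetical; in production this logic ran in Lambda):

```python
from email import message_from_string

# Hypothetical keyword-to-API routing table.
ROUTES = {
    "invoice": "billing-api",
    "support": "ticketing-api",
}


def route_email(raw: str) -> str:
    """Parse a raw email and pick the downstream API based on its body."""
    msg = message_from_string(raw)
    body = msg.get_payload().lower()
    for keyword, api in ROUTES.items():
        if keyword in body:
            return api
    return "default-api"


raw = "From: a@example.com\nSubject: hello\n\nPlease see the attached invoice."
print(route_email(raw))  # billing-api
```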

Deep Web Crawler (via Toptal)

This project focused on scraping the deep web for medicinal information using Scrapy. I created and fixed crawlers along with the pipeline. The most challenging part of this project was handling the scale.

Building Tuner (via Toptal)

This project made it easier to manage university buildings because a large set of equipment constantly updated the building data. I created a database to store all the data and built data analysis mechanisms on top of that to process all the information. I also wrapped all that into a Python Dash web application and deployed it on Heroku.

Arabic Dialect Recognizer

This project analyzed and classified Arabic speech. Working on it was challenging, mainly because I don't speak Arabic. I had to do some extra work to correctly evaluate the different models and understand which was better.

LinkedIn Scraper

I have created many different LinkedIn scrapers during my career. Some gather personal and company data using advanced search features with the best result filtering. I have also implemented account rotation to avoid bans and automated the process by deploying crawlers on AWS.
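Account rotation can be as simple as cycling through a credentials pool on each request; a sketch with `itertools.cycle` (the account names are placeholders, and the actual HTTP request is elided):

```python
import itertools

accounts = ["account-a", "account-b", "account-c"]  # hypothetical credentials pool
rotation = itertools.cycle(accounts)


def fetch(url: str) -> str:
    """Pick the next account for each request so no single one gets banned."""
    account = next(rotation)
    # ... perform the scraping request authenticated as `account` here ...
    return account


used = [fetch(f"https://example.com/page/{i}") for i in range(4)]
print(used)  # ['account-a', 'account-b', 'account-c', 'account-a']
```

In practice the rotation would also pace requests and retire accounts that start hitting rate limits.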

Sentiment Analysis Tool

I created this sentiment analysis tool to increase the accuracy of stock market price predictions. It checks for tweets, analyzes them using Word2Vec and a CNN architecture model, and outputs the sentiment. This project was very challenging!

Stock Market Price Predictor

I created a model for stock market price predictions. Then I added backtesting using the Lean engine. Finally, I set up an Interactive Brokers (IB) server and created an environment for paper trading (simulation) using the IB Python API.

Badminton Scraper and API

I created a scraper for badminton live scores, matches, draws, and players, and I built an API on top of that. The whole project was scheduled and deployed on Amazon EC2. This was one of the biggest scraping projects I've worked on.

Reinsurance Company Project

This was a huge project for reinsurance companies. I added full reinsurance functionalities in C#. The most difficult challenge was understanding the complex insurance logic and creating reusable code.

Torrent Client

https://github.com/BartholomewKuma27/TorrentClient
In this project, I implemented the BitTorrent protocol, allowing users to create their own torrent clients. This was one of my first Python projects, and it let me apply my knowledge of networking.
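The heart of the protocol work is bencoding, the format torrent metadata uses; a compact decoder sketch (the sample tracker URL is made up):

```python
def bdecode(data: bytes, i: int = 0):
    """Decode one bencoded value starting at index i; return (value, next_index)."""
    c = data[i:i + 1]
    if c == b"i":                      # integer: i<digits>e
        end = data.index(b"e", i)
        return int(data[i + 1:end]), end + 1
    if c == b"l":                      # list: l<items>e
        items, i = [], i + 1
        while data[i:i + 1] != b"e":
            item, i = bdecode(data, i)
            items.append(item)
        return items, i + 1
    if c == b"d":                      # dict: d<key><value>...e
        result, i = {}, i + 1
        while data[i:i + 1] != b"e":
            key, i = bdecode(data, i)
            value, i = bdecode(data, i)
            result[key] = value
        return result, i + 1
    colon = data.index(b":", i)        # string: <length>:<bytes>
    length = int(data[i:colon])
    return data[colon + 1:colon + 1 + length], colon + 1 + length


value, _ = bdecode(b"d8:announce16:http://tracker/a4:spami42ee")
print(value)  # {b'announce': b'http://tracker/a', b'spam': 42}
```

A client parses the `.torrent` file this way, then contacts the announce URL and exchanges pieces with peers over the wire protocol.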

HTTP Server

https://github.com/BartholomewKuma27/HttpServer
This was an HTTP server implementation written in C. Working on this project was challenging and interesting because I had to implement almost everything from scratch, which broadened my technology experience.

Languages

Python, SQL

Frameworks

Flask, Selenium, Scrapy, Django

Libraries/APIs

Requests, Beautiful Soup, Pandas, Scikit-learn, Keras

Tools

Git, PyCharm, Jenkins, Terraform, Boto3

Platforms

Amazon Web Services (AWS), Azure, Windows, Linux, Google Cloud Platform (GCP), Visual Studio Code (VS Code), Ubuntu

Storage

PostgreSQL, Amazon DynamoDB, Azure Table Storage, Elasticsearch, MongoDB, Oracle Cloud

Other

Web Scraping, Machine Learning

2013 - 2017

Bachelor's Degree in Math and Computer Science

Free University of Tbilisi - Tbilisi, Georgia
