Irakli Gugushvili

Python Developer in Tbilisi, Georgia

Member since April 5, 2018
Irakli is a Python developer with seven years of experience in multiple industries. He started as a machine learning developer, expanded into a range of back-end technologies, and became an expert in web scraping. Irakli is also proficient in cloud technologies and currently freelancing as a senior Python cloud engineer and AWS and Azure architect. His industry experience is backed by a bachelor's degree in math and computer science.

Location

Tbilisi, Georgia

Availability

Part-time

Preferred Environment

PyCharm, Windows, VS Code, Python, Git

The most amazing...

...thing I've developed is my stock price prediction model using LSTM.

Employment

  • Senior Python Cloud Engineer | AWS and Azure Architect

    2019 - PRESENT
    Stauch Technologies (via Toptal)
    • Developed a system where a root user can monitor subusers' AWS activity.
    • Created a website for attaching private VPC to an EC2 server with no public access.
    • Built and developed a system for monitoring Azure users' activities.
    Technologies: AWS, Python, Azure
  • Web Scraping Expert

    2019 - PRESENT
    Explorium
    • Developed many projects for scraping publicly available data using Scrapy, Selenium, Requests, BeautifulSoup, and other scraping technologies.
    • Created a fully functional pipeline for URL data collection, scraping, parsing, and storage.
    • Built a system that ran periodically to check the status of multiple website scrapers.
    Technologies: Web Scraping, Data, Python
  • Python Developer

    2019 - 2020
    Treehouse Technology Group (via Toptal)
    • Developed an email receiver and parser system deployed on AWS. It receives an email, parses its body, and calls different APIs depending on its content.
    • Created a data analysis API.
    • Developed a data prediction API.
    Technologies: AWS, Flask, Python
  • Data Scraping Engineer

    2019 - 2019
    Yipit (via Toptal)
    • Developed many projects for scraping publicly available data.
    • Rewrote old scrapers using a new approach to increase coverage.
    • Created tester functionality to compare the coverage of different scraping scripts.
    Technologies: Web Scraping, Data, Python
  • Machine Learning Developer

    2017 - 2019
    Neiron
    • Created a model for stock market price predictions.
    • Built a backtesting environment for predictions using the LEAN engine.
    • Developed a paper trading (simulation) system using an Interactive Brokers server and Python API.
    • Constructed a sentiment analysis tool to increase prediction accuracy.
    Technologies: Machine Learning, Python
  • Programmer

    2016 - 2017
    DoSo
    • Implemented different logic for insurance clients.
    • Developed and deployed a newer version of a reinsurance model to help clients.
    • Conducted tests for every model of insurance logic.
    • Developed a helper web application using ASP.NET MVC.
    Technologies: ASP.NET MVC, C#
  • Programmer

    2015 - 2016
    Bank of Georgia
    • Implemented different types of logic in SQL for internal use.
    • Developed an application in Java for bank employees.
    • Tested the bank's functionalities on the application and database side.
    Technologies: SQL, Java

Experience

  • Azure Monitoring System (via Toptal)

    We developed this system to monitor Azure users' activities, using combinations of built-in and custom policies to check compliance. In case of non-compliance, the system would alert the owner to resolve the issue, and we could also enforce compliance on our end if needed. We used Azure Logic Apps to trigger the workflow, Functions to execute the logic, Queue Storage to communicate between functions, and Table Storage to save the data.

  • Connector (via Toptal)

    We created this website to attach a VPC, with a Network Load Balancer (NLB) and target group (TG) setup, to an EC2 instance with no public access, which made using the server much safer. I focused mainly on the back end and implemented the entire attach/detach pipeline in AWS Step Functions. We used React with an API Gateway on the front end, Jenkins for CI/CD, and Terraform for infrastructure as code.

  • AWS Monitoring System (via Toptal)

    This system allowed a root user to monitor the activities of subusers. It ran at a time interval, collected subusers' activities, and enforced compliance if needed. We used several Lambda functions, SQS for communication, DynamoDB for saving the data, Elasticsearch for logs, Jenkins for CI/CD, and Terraform for infrastructure as code.

  • Public Data Scraper

    This project consisted of scraping all kinds of publicly available data using Scrapy. I created the pipeline and all the different spiders for different data sources. The main challenge for static websites was parsing different pages; for dynamic websites, it was simulating a JavaScript request that loaded the data. We used Scrapinghub (now Zyte) for deployment and AWS S3 with DynamoDB to save the data.
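
As a self-contained illustration of the static-page parsing step, here is a minimal link extractor built on Python's standard-library `html.parser` (the production spiders used Scrapy; the class name and the sample page below are hypothetical):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href targets from anchor tags, as a parse step in a spider might."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) pairs
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

page = '<html><body><a href="/item/1">one</a><a href="/item/2">two</a></body></html>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # ['/item/1', '/item/2']
```

In a real spider, each extracted link would be fed back into the crawl queue; dynamic sites instead require replaying the JavaScript-issued request that returns the data.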

  • Data Analysis and Prediction System (via Toptal)

    In this project, I created a data analysis and prediction system for different kinds of data using various tools, then exposed it as an API built with Flask-RESTful.

  • RedString Bookmarking System (via Toptal)

    This is a bookmarking system that saves URLs and their locations. We accomplished this using Google's geolocation service and weather data that we scraped and analyzed with IBM's text analyzer. The primary stack was Flask deployed on AWS Elastic Beanstalk.

  • Email Receiver/Parser System (via Toptal)

    This system can receive an email, parse it, and call different APIs depending on its content. We deployed it on AWS using SES for notifications, S3 for saving the emails, RDS for saving logs, and Lambda for parsing and calling the API.
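
A minimal sketch of the parse-and-dispatch step, using only Python's standard-library `email` module (the handler names and routing keywords here are hypothetical placeholders; the real system ran in Lambda and called external APIs):

```python
import email
from email import policy

# Hypothetical routing table: a keyword in the body -> handler (stand-ins for real API calls).
HANDLERS = {
    "invoice": lambda body: "billing-api",
    "support": lambda body: "ticket-api",
}

def route_email(raw_message: str) -> str:
    """Parse a raw RFC 5322 message and dispatch on its plain-text body."""
    msg = email.message_from_string(raw_message, policy=policy.default)
    body_part = msg.get_body(preferencelist=("plain",))
    body = body_part.get_content() if body_part else ""
    for keyword, handler in HANDLERS.items():
        if keyword in body.lower():
            return handler(body)  # in the real system: call the matching API
    return "no matching API"

print(route_email("From: a@x.test\nSubject: hi\n\nYour invoice is ready."))  # billing-api
```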

  • Deep Web Crawler (via Toptal)

    This project focused on scraping the deep web to get medicinal information using Python Scrapy. I created and fixed crawlers along with the pipeline. The most challenging thing to handle in this project was the scale.

  • Building Tuner (via Toptal)

    This project made it easier to manage university buildings because a large set of equipment constantly updated the building data. I created a database to store all the data and built data analysis mechanisms on top of that to process all the information. I also wrapped all that into a Python Dash web application and deployed it on Heroku.

  • Arabic Dialect Recognizer

    This project analyzed and classified Arabic speech. Working on it was challenging, mainly because I don't speak Arabic. I had to do some extra work to correctly evaluate the different models and understand which was better.

  • LinkedIn Scraper

    I have created many different LinkedIn scrapers during my career. Some gather personal and company data using advanced search features and filter for the best results. I have also implemented account rotation to avoid bans and automated the process by deploying crawlers on AWS.
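
The account-rotation idea can be sketched as a round-robin pool that skips banned accounts (a toy illustration; the account names are placeholders, and real credentials would come from a secrets store):

```python
class AccountRotator:
    """Cycle through accounts round-robin, skipping any marked as banned."""

    def __init__(self, accounts):
        self._accounts = list(accounts)
        self._banned = set()

    def next_account(self):
        # Try each account at most once per call, rotating the queue as we go.
        for _ in range(len(self._accounts)):
            acct = self._accounts.pop(0)
            self._accounts.append(acct)
            if acct not in self._banned:
                return acct
        raise RuntimeError("all accounts banned")

    def mark_banned(self, acct):
        self._banned.add(acct)

rot = AccountRotator(["account_a", "account_b", "account_c"])
print(rot.next_account())  # account_a
rot.mark_banned("account_b")
print(rot.next_account())  # account_c (account_b is skipped)
```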

  • Sentiment Analysis Tool

    I created this sentiment analysis tool to increase the accuracy of stock market price predictions. It checks for tweets, analyzes them using Word2Vec and a CNN architecture model, and outputs the sentiment. This project was very challenging!

  • Stock Market Price Predictor

    I created a model for stock market price predictions. Then I added backtesting using the LEAN engine. Finally, I set up an Interactive Brokers (IB) server and created an environment for paper trading (simulation) using the IB Python API.

  • Badminton Scraper and API

    I created a scraper for badminton live scores, matches, draws, and players, and I built an API on top of that. The whole project was scheduled and deployed on Amazon EC2. This was one of the biggest scraping projects I've worked on.

  • Reinsurance Company Project

    This was a huge project for reinsurance companies. I added full reinsurance functionality in C#. The most difficult challenge was understanding the complex insurance logic and writing reusable code.

  • Torrent Client
    https://github.com/BartholomewKuma27/TorrentClient

    In this project, I implemented the BitTorrent protocol, allowing users to build their own torrent clients. This was one of my first Python projects, and it let me apply my knowledge of networking.
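
A core piece of any torrent client is decoding the bencoded metadata in .torrent files. Here is a minimal recursive bencode decoder, written as an illustrative sketch rather than the project's actual code:

```python
def bdecode(data: bytes, i: int = 0):
    """Decode one bencoded value starting at index i; return (value, next_index)."""
    c = data[i:i + 1]
    if c == b"i":                        # integer: i<digits>e
        end = data.index(b"e", i)
        return int(data[i + 1:end]), end + 1
    if c == b"l":                        # list: l<items>e
        i += 1
        items = []
        while data[i:i + 1] != b"e":
            item, i = bdecode(data, i)
            items.append(item)
        return items, i + 1
    if c == b"d":                        # dict: d<key><value>...e
        i += 1
        result = {}
        while data[i:i + 1] != b"e":
            key, i = bdecode(data, i)
            value, i = bdecode(data, i)
            result[key] = value
        return result, i + 1
    # byte string: <length>:<bytes>
    colon = data.index(b":", i)
    length = int(data[i:colon])
    start = colon + 1
    return data[start:start + length], start + length

decoded, _ = bdecode(b"d8:announce13:http://t.test4:name4:demoe")
print(decoded)  # {b'announce': b'http://t.test', b'name': b'demo'}
```

A full client layers tracker communication and the peer wire protocol on top of this decoding step.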

  • HTTP Server
    https://github.com/BartholomewKuma27/HttpServer

    This was an HTTP server implementation written in C. Working on this project was challenging and interesting because I had to implement almost everything from scratch, which broadened my technical experience.
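
The original server was written in C; as a rough Python analogue, the request-handling core of a from-scratch server boils down to parsing the request line and assembling a raw HTTP/1.1 response over a socket (a sketch under those assumptions, not the project's code):

```python
import socket

def handle_request(raw: bytes) -> bytes:
    """Parse the request line and build a minimal HTTP/1.1 response."""
    try:
        request_line = raw.split(b"\r\n", 1)[0].decode("ascii")
        method, path, version = request_line.split(" ")
    except ValueError:
        return b"HTTP/1.1 400 Bad Request\r\nContent-Length: 0\r\n\r\n"
    if method != "GET":
        return b"HTTP/1.1 405 Method Not Allowed\r\nContent-Length: 0\r\n\r\n"
    body = f"You requested {path}\n".encode()
    headers = (
        "HTTP/1.1 200 OK\r\n"
        f"Content-Length: {len(body)}\r\n"
        "Content-Type: text/plain\r\n\r\n"
    ).encode()
    return headers + body

def serve(host="127.0.0.1", port=8080):
    """Accept connections one at a time and answer each with handle_request."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen()
        while True:
            conn, _ = srv.accept()
            with conn:
                conn.sendall(handle_request(conn.recv(65536)))
```

A production server would additionally parse headers, handle persistent connections, and serve concurrent clients, which is where most of the from-scratch difficulty lies.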

Skills

  • Languages

    Python, SQL
  • Frameworks

    Flask, Selenium, Scrapy
  • Libraries/APIs

    Requests, Beautiful Soup, Pandas, Scikit-learn, Keras
  • Tools

    Git, PyCharm, VS Code, Jenkins, Terraform, Boto3
  • Platforms

    Azure, Windows, Linux
  • Storage

    PostgreSQL, AWS DynamoDB, Azure Table Storage, MongoDB
  • Other

    Web Scraping, AWS, Machine Learning

Education

  • Bachelor's Degree in Math and Computer Science
    2013 - 2017
    Free University of Tbilisi - Tbilisi, Georgia
