Irakli Gugushvili

Web Scraping Developer in Tbilisi, Georgia

Member since April 5, 2018
In his career, Irakli has worked on numerous projects using a variety of programming languages, but ever since he became interested in artificial intelligence, he has focused on machine learning. He has a solid skill base in Python and related technologies, which comes in handy in his current job as an ML engineer. Irakli joined Toptal to find projects where he can solve interesting and challenging problems using machine learning techniques.


Location

Tbilisi, Georgia

Availability

Part-time

Preferred Environment

Git, Jupyter Notebook, PyCharm, Windows

The most amazing...

...thing I've developed is my stock price prediction model, built with an LSTM.

Employment

  • Senior Python Engineer | AWS Architect

    2019 - PRESENT
    Stauch Technologies (via Toptal)
    • Developed a system that allows the root user to monitor sub-users' AWS activity.
    • Created a website for attaching a private VPC to an EC2 server that has no public access.
    Technologies: Amazon Web Services (AWS), Python
  • Web Scraping Expert

    2019 - PRESENT
    Explorium
    • Developed many projects for scraping publicly available data using Scrapy, Selenium, Requests, and BeautifulSoup.
    Technologies: Web Scraping, Data, Python
  • Python Developer

    2019 - 2020
    Treehouse Technology Group (via Toptal)
    • Developed an email receiver/parser system deployed on AWS. It receives an email, parses its body, and calls different APIs depending on its content.
    • Created a data analyzer API and a data prediction API.
    Technologies: Amazon Web Services (AWS), Flask, Python
  • Data Scraping Engineer

    2019 - 2019
    Yipit (via Toptal)
    • Developed many projects for scraping various publicly available data.
    Technologies: Web Scraping, Data, Python
  • Machine Learning Developer

    2017 - 2019
    Neiron
    • Created a model for stock market price predictions.
    • Built a backtesting environment for predictions in the Lean engine.
    • Developed a paper-trading system using an Interactive Brokers server and the IB Python API.
    • Constructed a sentiment analysis tool to increase prediction accuracy.
    Technologies: Machine Learning, Python
  • Programmer

    2016 - 2017
    DoSo
    • Implemented small pieces of logic for insurance clients.
    • Implemented a reinsurance model.
    • Tested every insurance logic model.
    • Developed a web application using ASP.NET MVC.
    Technologies: ASP.NET MVC, C#
  • Programmer

    2015 - 2016
    Bank of Georgia
    • Implemented various types of business logic in SQL.
    • Developed an application using Java.
    • Tested the bank's functionalities.
    Technologies: SQL, Java

Experience

  • Connector (via Toptal) (Development)

    The website's goal was to attach a VPC, with an NLB and target group (TG) setup, to an EC2 instance that has no public access; using the server this way was much safer. I mainly focused on the back end and implemented the whole attach/detach pipeline in Step Functions. We used React with an API Gateway on the front end, Jenkins for CI/CD, and Terraform for infrastructure as code.
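
    A minimal sketch of how one attach step might look with boto3 (the real pipeline ran as Step Functions states; the target group name, port, and identifiers below are hypothetical):

    import boto3

    elbv2 = boto3.client("elbv2")

    def attach_instance(vpc_id, instance_id, port=443):
        # Create a TCP target group in the private VPC (one state of the attach pipeline).
        tg = elbv2.create_target_group(
            Name="connector-tg",   # hypothetical name
            Protocol="TCP",
            Port=port,
            VpcId=vpc_id,
            TargetType="instance",
        )["TargetGroups"][0]

        # Register the EC2 instance with no public access as the target behind the NLB.
        elbv2.register_targets(
            TargetGroupArn=tg["TargetGroupArn"],
            Targets=[{"Id": instance_id, "Port": port}],
        )
        return tg["TargetGroupArn"]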

  • Monitoring System (via Toptal) (Development)

    The project was about creating a system that would allow the root user to monitor the activities of sub-users. It would run at a set time interval, collect sub-users' activity, and take enforcement action when needed. We used several Lambda functions with SQS for communication, DynamoDB for saving the data, Elasticsearch for logs, Jenkins for CI/CD, and Terraform for infrastructure as code.
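
    A minimal sketch of one such Lambda function, assuming an SQS trigger and a hypothetical DynamoDB table name:

    import json
    import boto3

    # Hypothetical table holding each sub-user's collected activity.
    table = boto3.resource("dynamodb").Table("sub-user-activity")

    def handler(event, context):
        # Each SQS record carries one sub-user's activity collected upstream.
        for record in event["Records"]:
            activity = json.loads(record["body"])
            table.put_item(Item={
                "user_id": activity["user_id"],
                "timestamp": activity["timestamp"],
                "actions": activity["actions"],
            })
        return {"processed": len(event["Records"])}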

  • Public Data Scraper (Development)

    The project consisted of scraping all kinds of publicly available data using Scrapy. I created the pipeline along with all the different spiders for different data sources. For static websites, the main challenge was parsing the different page layouts; for dynamic websites, it was simulating the JavaScript requests that load the data. We used ScrapingHub for deployment and AWS S3 with DynamoDB for saving the data.
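
    A minimal sketch of one static-site spider, with a hypothetical source URL and selectors:

    import scrapy

    class PublicDataSpider(scrapy.Spider):
        name = "public_data"
        start_urls = ["https://example.com/records"]  # hypothetical data source

        def parse(self, response):
            # Extract one listing page, then follow pagination.
            for row in response.css("table.records tr"):
                yield {
                    "name": row.css("td.name::text").get(),
                    "value": row.css("td.value::text").get(),
                }
            next_page = response.css("a.next::attr(href)").get()
            if next_page:
                yield response.follow(next_page, callback=self.parse)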

  • Data Analysis and Prediction System (via Toptal) (Development)

    The project was about creating a data analyzer and predictor for different kinds of data using various tools, then building an API with Flask-RESTful and wrapping the system in it.
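
    A minimal sketch of such a Flask-RESTful endpoint; the predict() function here is only a stand-in for the wrapped analyzer/predictor:

    from flask import Flask, request
    from flask_restful import Api, Resource

    app = Flask(__name__)
    api = Api(app)

    def predict(record):
        # Placeholder for the real predictor, which used various tools.
        values = record.get("values") or [0]
        return sum(values) / len(values)

    class Prediction(Resource):
        def post(self):
            # Accept a JSON payload of records and return one prediction per record.
            records = request.get_json()["records"]
            return {"predictions": [predict(r) for r in records]}

    api.add_resource(Prediction, "/predict")

    if __name__ == "__main__":
        app.run()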

  • RedString Bookmarking System (via Toptal) (Development)

    The project created a bookmarking system that saves not only the URL but also the location. We accomplished this using Google's geolocation along with weather data, which we scraped and analyzed using the IBM text analyzer. The main technology was Flask, deployed on AWS Elastic Beanstalk.

  • Email Receiver/Parser System (via Toptal) (Development)

    This project created a system that is able to receive an email, parse it, and call different APIs depending on its content. It was deployed on AWS, using SES for notifications, S3 for saving the email, RDS for saving logs, and Lambda for parsing and calling the APIs.
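
    A minimal sketch of the parsing Lambda, assuming the SES receipt rule stores the raw message in S3 under its message ID (the bucket name and downstream endpoints are hypothetical):

    import email
    import json
    import urllib.request
    import boto3

    s3 = boto3.client("s3")

    def handler(event, context):
        # SES wrote the raw email to S3; the event tells us which message arrived.
        message_id = event["Records"][0]["ses"]["mail"]["messageId"]
        raw = s3.get_object(Bucket="incoming-email", Key=message_id)["Body"].read()
        msg = email.message_from_bytes(raw)

        # Pull the plain-text body (handles both simple and multipart messages).
        part = msg
        if msg.is_multipart():
            part = next(p for p in msg.walk() if p.get_content_type() == "text/plain")
        body = part.get_payload(decode=True).decode(errors="ignore")

        # Call a different API depending on the content (endpoints are hypothetical).
        endpoint = ("https://api.example.com/orders"
                    if "order" in body.lower()
                    else "https://api.example.com/support")
        req = urllib.request.Request(
            endpoint,
            data=json.dumps({"body": body}).encode(),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)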

  • Deep Web Crawler (via Toptal) (Development)

    This project focused on scraping the deep web to get medicinal information using Python Scrapy. I created and fixed crawlers along with the pipeline.

  • Building Tuner (via Toptal) (Development)

    This project made university building management easier. The buildings contained a large amount of equipment that constantly updated its data.

    I created the database layer to store all the data and built data analysis mechanisms on top of it to process the information. I also wrapped everything in a Python Dash web application and deployed it on Heroku.
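
    A minimal sketch of such a Dash application, with hypothetical equipment readings standing in for the real database queries:

    import dash
    from dash import dcc, html
    import pandas as pd
    import plotly.express as px

    # Hypothetical hourly readings; the real app pulled these from the database layer.
    df = pd.DataFrame({"hour": list(range(24)),
                       "temperature": [20 + (h % 5) for h in range(24)]})

    app = dash.Dash(__name__)
    app.layout = html.Div([
        html.H2("Building Tuner"),
        dcc.Graph(figure=px.line(df, x="hour", y="temperature")),
    ])

    if __name__ == "__main__":
        app.run(debug=True)  # app.run_server(debug=True) on older Dash versions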

  • Arabic Dialect Recognizer (Development)

    This project analyzed and classified Arabic speech.

  • LinkedIn Scraper (Development)

    Over the course of my career, I have created many different LinkedIn scrapers—ones that gather all kinds of personal and company data, sometimes using advanced search features with the best result filtering. I've also implemented account rotation in order to avoid bans and have automated the process by deploying crawlers on Amazon Web Services (AWS).

  • Sentiment Analysis Tool (Development)

    I created this sentiment analysis tool to increase the accuracy of the stock market price predictor. It collects tweets, analyzes them using Word2Vec embeddings and a CNN-based model, and outputs the sentiment.
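
    A minimal sketch of the CNN side in Keras, assuming each tweet is already mapped to a sequence of Word2Vec vectors (the sequence length and embedding size here are illustrative):

    from tensorflow.keras import layers, models

    def build_sentiment_cnn(seq_len=50, embed_dim=300):
        # Input: one tweet as a (seq_len x embed_dim) matrix of Word2Vec vectors.
        model = models.Sequential([
            layers.Input(shape=(seq_len, embed_dim)),
            layers.Conv1D(128, kernel_size=5, activation="relu"),
            layers.GlobalMaxPooling1D(),
            layers.Dense(64, activation="relu"),
            layers.Dense(1, activation="sigmoid"),  # positive vs. negative sentiment
        ])
        model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
        return model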

  • Stock Market Price Predictor (Development)

    I created a model for stock market price predictions. Then I added backtesting using the Lean engine. Finally, I set up an Interactive Brokers server and created an environment for paper trading using the IB Python API.
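
    A minimal sketch of such an LSTM model in Keras (the window size and feature count are illustrative; the real model and its backtesting lived in the Lean setup):

    from tensorflow.keras import layers, models

    def build_price_model(window=60, n_features=1):
        # Predict the next price from a sliding window of past prices.
        model = models.Sequential([
            layers.Input(shape=(window, n_features)),
            layers.LSTM(64),
            layers.Dense(1),
        ])
        model.compile(optimizer="adam", loss="mse")
        return model

    # Usage sketch: X has shape (samples, window, n_features), y holds the next-step prices.
    # model = build_price_model(); model.fit(X, y, epochs=10, batch_size=32)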

  • Badminton Scraper and API (Development)

    This project created a scraper for badminton live scores, matches, draws, and players; I also built an API on top of that. The whole project was scheduled and deployed on AWS EC2. This was one of the biggest scraping projects I've worked on.

  • Reinsurance Company Project (Development)

    This was a large project for reinsurance companies in which I implemented full reinsurance functionality in C#.

  • Torrent Client (Development)
    https://github.com/BartholomewKuma27/TorrentClient

    The project implemented the BitTorrent protocol, allowing users to run their own torrent client. It was one of my first Python projects and drew on my knowledge of networking.
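
    A minimal sketch of the bencoding decoder such a client needs in order to read .torrent metadata (the repository's actual structure may differ):

    def bdecode(data, i=0):
        # Decode one bencoded value starting at index i; return (value, next index).
        if data[i:i + 1] == b"i":                  # integer: i<number>e
            end = data.index(b"e", i)
            return int(data[i + 1:end]), end + 1
        if data[i:i + 1] == b"l":                  # list: l<items>e
            items, i = [], i + 1
            while data[i:i + 1] != b"e":
                item, i = bdecode(data, i)
                items.append(item)
            return items, i + 1
        if data[i:i + 1] == b"d":                  # dictionary: d<key><value>...e
            result, i = {}, i + 1
            while data[i:i + 1] != b"e":
                key, i = bdecode(data, i)
                value, i = bdecode(data, i)
                result[key] = value
            return result, i + 1
        colon = data.index(b":", i)                # byte string: <length>:<bytes>
        length = int(data[i:colon])
        return data[colon + 1:colon + 1 + length], colon + 1 + length

    # Usage sketch: metadata, _ = bdecode(open("file.torrent", "rb").read())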

  • HTTP Server (Other amazing things)
    https://github.com/BartholomewKuma27/HttpServer

    This was an HTTP server implementation written in C.

Skills

  • Languages

    Python, SQL, Java, C#, JavaScript
  • Frameworks

    Flask, Selenium, Scrapy, ASP.NET MVC
  • Libraries/APIs

    Pandas, NumPy, Keras, Scikit-learn
  • Tools

    GitHub, PyCharm, Git, Jenkins, Terraform
  • Platforms

    Jupyter Notebook, Amazon Web Services (AWS), Windows, Linux, Heroku
  • Other

    Machine Learning, Deep Learning, Web Scraping, Data, AWS
  • Paradigms

    Agile
  • Storage

    PostgreSQL, MySQL, MongoDB, AWS DynamoDB

Education

  • Bachelor's degree in Math and Computer Science
    2013 - 2017
    Free University of Tbilisi - Tbilisi, Georgia
