Mate Bersenadze, Developer in Tbilisi, Georgia

Mate Bersenadze

Verified Expert in Engineering

Software Developer

Location
Tbilisi, Georgia
Toptal Member Since
October 31, 2022

Mate is an experienced software engineer with a passion for high-performance solutions. His expertise spans Python, FastAPI, AWS, and cloud engineering, and he has delivered projects in finance, healthcare, and eCommerce. His strong attention to detail and commitment to quality make him a valuable asset to any team. He thrives in fast-paced environments and stays up to date with the latest technologies to deliver cutting-edge solutions.

Portfolio

Conversant AI, Inc.
Python, Flask, Amazon Web Services (AWS), Docker, APIs, Caching, FastAPI
Divio AG
Python, Django, REST, Linux, Docker, Git
Ekwithree
Python, Data Engineering, Web Scraping, PyCharm

Experience

Availability

Full-time

Preferred Environment

Python 3, FastAPI, Flask, Python, REST APIs, Amazon Web Services (AWS), Docker

The most amazing...

...project I've developed is a professional verification service that can handle a high load of verifications in parallel.

Work Experience

Back-end Python Developer via Toptal

2023 - 2024
Conversant AI, Inc.
  • Fixed bugs, refactored parts of the codebase, and cleaned up poorly structured legacy code.
  • Implemented an aggregation service that helped the client demonstrate their product to investors.
  • Repaired a deployment system that had been a major blocker to the company's growth and resolved database issues.
Technologies: Python, Flask, Amazon Web Services (AWS), Docker, APIs, Caching, FastAPI

Python and Django Developer

2023 - 2023
Divio AG
  • Refactored and migrated legacy code to a new API, enhancing stability and scalability for a cloud management service provider.
  • Wrote tests for new endpoints and fixed tests for older ones.
  • Took part in rectifying the company's existing bugs and contributed to improvements.
Technologies: Python, Django, REST, Linux, Docker, Git

Python Web Scraper via Toptal

2023 - 2023
Ekwithree
  • Created crawlers that covered over four million websites, collecting data from About pages, contact information, and address details.
  • Stored the data in Elasticsearch, creating indexes with custom analyzers and tokenizers to improve search relevance.
  • Ran the crawler on multiple servers in parallel to boost performance.
Technologies: Python, Data Engineering, Web Scraping, PyCharm
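
The analyzer-and-tokenizer setup mentioned above can be sketched as Elasticsearch index settings. This is a minimal illustration: the index name, field names, and n-gram bounds below are assumptions, not taken from the project.

```python
# Hypothetical index body with an edge-ngram analyzer for partial matching
# at index time and the standard analyzer at search time.
def build_index_body() -> dict:
    return {
        "settings": {
            "analysis": {
                "tokenizer": {
                    "edge_ngram_tokenizer": {
                        "type": "edge_ngram",
                        "min_gram": 2,
                        "max_gram": 15,
                        "token_chars": ["letter", "digit"],
                    }
                },
                "analyzer": {
                    "partial_match": {
                        "type": "custom",
                        "tokenizer": "edge_ngram_tokenizer",
                        "filter": ["lowercase"],
                    }
                },
            }
        },
        "mappings": {
            "properties": {
                # Index with edge n-grams, search with the standard analyzer
                # so short query prefixes still score well.
                "company_name": {
                    "type": "text",
                    "analyzer": "partial_match",
                    "search_analyzer": "standard",
                },
                "about_text": {"type": "text"},
                "address": {"type": "keyword"},
            }
        },
    }

# Creating the index would then be a single client call, e.g.:
# from elasticsearch import Elasticsearch
# Elasticsearch("http://localhost:9200").indices.create(
#     index="company_pages", body=build_index_body())
```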

AWS Back-end Engineer | Python Developer

2022 - 2023
Elysium Health
  • Developed REST APIs using FastAPI, deployed to AWS Lambda behind API Gateway via Mangum.
  • Linked Fitbit and Garmin accounts to our service and pulled data from these accounts.
  • Built automated test suites for API endpoints, unit tests, and integration tests to ensure code quality and reliability.
  • Implemented continuous integration and continuous deployment (CI/CD) pipelines using tools like CircleCI to automate the build, test, and deployment process.
Technologies: Back-end, Amazon Web Services (AWS), Serverless, AWS Lambda, OAuth 2, OAuth, Garmin API, Fitbit API, Node.js, Lambda Functions, Lambda Architecture, Databases, Containerization, Pytest, PyCharm, FastAPI, REST APIs, APIs

Senior Software Engineer | Python Developer

2022 - 2022
MaxinAI
  • Designed and developed REST APIs for various web applications using Python and FastAPI. Built automated test suites for API endpoints, unit tests, and integration tests to ensure code quality and reliability.
  • Implemented CI/CD pipelines using tools like CircleCI or GitLab CI to automate the build, test, and deployment process. Worked with AWS services to deploy and scale applications in the cloud.
  • Modified the code architecture and refactored code to match current Python standards. Optimized API performance by implementing caching, load balancing, and distributed computing solutions.
  • Worked with databases like PostgreSQL or MongoDB to design and implement data models and data access layers for APIs. Collaborated with front-end developers to integrate APIs into web and mobile applications.
Technologies: Python, API Gateways, FastAPI, PostGIS, APIs, REST APIs, API Integration, Linux, SQL, Data Scraping, Web Scraping, Technical Leadership, Cloud, Leadership, DevOps, Elasticsearch, Amazon Web Services (AWS), FFmpeg, AWS Lambda, Video Streaming, MySQL, Early-stage Startups, Discord Bots, CI/CD Pipelines, Test Automation, Back-end, Serverless, OAuth 2, OAuth, Python Dataclasses, Pydantic, API Architecture, API Applications, Redis, SDKs, Lambda Functions, Lambda Architecture, Databases, Containerization, Pytest, Docker, PyCharm
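
The caching optimization mentioned above can be illustrated with a tiny in-process TTL cache decorator. This is a sketch only: a shared cache like Redis is the more likely production choice, and the decorator here is invented for demonstration.

```python
import time
from functools import wraps

def ttl_cache(seconds: float):
    """Cache results per argument tuple, expiring after `seconds`."""
    def decorator(fn):
        store = {}  # args -> (value, timestamp)

        @wraps(fn)
        def wrapper(*args):
            now = time.monotonic()
            hit = store.get(args)
            if hit is not None and now - hit[1] < seconds:
                return hit[0]  # fresh cached value
            value = fn(*args)
            store[args] = (value, now)
            return value

        return wrapper
    return decorator
```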

Senior Software Engineer | Python Developer

2020 - 2022
Mesh
  • Led the development of REST API applications from scratch using Python and Flask; ensured adherence to coding standards, best practices, and design patterns.
  • Collaborated closely with the product manager and other stakeholders to gather requirements, define the project scope, and establish priorities for development sprints.
  • Designed and implemented an automated testing strategy that included unit tests, integration tests, and end-to-end tests, ensuring application quality and reliability.
  • Implemented a CI/CD pipeline using CodePipeline, allowing for efficient and automated deployment of applications.
  • Collaborated with the front-end development team to ensure seamless integration of APIs into web and mobile applications, improving user experience and overall performance.
  • Built an entire scraping architecture based on asyncio and aiohttp. Served as a team lead and discussed demos and project improvements regularly with the CEO.
Technologies: Python, Flask, APIs, REST APIs, API Integration, Linux, Data Scraping, Web Scraping, Software Architecture, Technical Leadership, Cloud, Leadership, Architecture, DevOps, Elasticsearch, Amazon Web Services (AWS), AWS Lambda, Selenium, Early-stage Startups, Bots, CI/CD Pipelines, Test Automation, Back-end, Serverless, OAuth 2, OAuth, Python Dataclasses, Pydantic, API Architecture, API Applications, Celery, Redis, Lambda Functions, Lambda Architecture, Databases, Containerization, Pytest, Docker, PyCharm
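
The asyncio-and-aiohttp scraping architecture above hinges on bounded concurrency. A minimal sketch of that core pattern (the `fetch` helper and URL list in the comment are assumptions, not from the project):

```python
import asyncio

async def gather_bounded(coros, limit: int = 20):
    """Await coroutines with at most `limit` running concurrently."""
    sem = asyncio.Semaphore(limit)

    async def run(coro):
        async with sem:
            return await coro

    return await asyncio.gather(*(run(c) for c in coros))

# With aiohttp, each coroutine would fetch one page, e.g.:
# async with aiohttp.ClientSession() as session:
#     pages = await gather_bounded(
#         (fetch(session, url) for url in urls), limit=50)
```

The semaphore keeps the event loop from opening thousands of sockets at once while still saturating the allowed concurrency.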

Software Engineer | Python Developer

2020 - 2020
MaxinAI
  • Designed and built a scalable scraping system for social media crawling, collecting data on over 50 million users from Instagram, Facebook, YouTube, and Twitch.
  • Designed and developed REST APIs for various web applications using Python and Flask. Built automated test suites for API endpoints, unit tests, and integration tests to ensure code quality and reliability.
  • Developed an algorithm identifying the same users throughout different social media platforms. Created an algorithm for detecting bots across all social media.
  • Implemented CI/CD pipelines using tools like Jenkins, CircleCI, or GitLab CI to automate the build, test, and deployment process.
Technologies: Python, Scrapy, Flask, REST APIs, APIs, Cloud, Architecture, Software Architecture, MongoDB, Elasticsearch, Neo4j, Amazon Web Services (AWS), AWS Lambda, Microservices, Selenium, MVC Frameworks, MySQL, Bots, CI/CD Pipelines, Test Automation, Django, Back-end, Serverless, OAuth 2, OAuth, Python Dataclasses, API Architecture, API Applications, Celery, Redis, SDKs, PIP, Software Packaging, Databases, Containerization, Pytest, Docker, PyCharm
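
The cross-platform user-matching algorithm mentioned above could start from something like the string-similarity heuristic below. This is purely illustrative: the production algorithm would have combined many more signals than username and display-name similarity.

```python
from difflib import SequenceMatcher

def likely_same_user(a: dict, b: dict, threshold: float = 0.85) -> bool:
    """Heuristic cross-platform identity match (illustrative sketch)."""
    score = max(
        SequenceMatcher(None, a["username"].lower(), b["username"].lower()).ratio(),
        SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio(),
    )
    return score >= threshold
```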

Software Engineer | Python Developer

2020 - 2020
MaxinAI
  • Developed REST endpoints for calling machine learning (ML) models on live data for Amazon products' allergen checking.
  • Designed an ML pipeline system for food label validation and compliance that uses several neural network models to receive images of labels and extract required information, such as nutrition facts, allergens, ingredients, and weight.
  • Created a pipeline accuracy evaluation system to measure end-to-end and each step's accuracy.
  • Implemented unit tests and test-driven development.
Technologies: Python, Scrapy, Flask, APIs, REST APIs, Cloud, Elasticsearch, MongoDB, Pandas, OCR, Architecture, Software Architecture, Amazon Web Services (AWS), AWS Lambda, Microservices, Selenium, MVC Frameworks, MySQL, Bots, CI/CD Pipelines, Django, Back-end, Databases, Containerization, Pytest, Docker, PyCharm
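
The per-step accuracy evaluation described above reduces to scoring (predicted, expected) pairs for each pipeline stage; a minimal sketch, with stage names invented for illustration:

```python
def stage_accuracies(stages: dict) -> dict:
    """Exact-match accuracy per pipeline stage.

    `stages` maps a stage name (e.g. "ocr", "allergen_extraction") to a
    list of (predicted, expected) pairs; empty stages are skipped.
    """
    return {
        name: sum(pred == exp for pred, exp in pairs) / len(pairs)
        for name, pairs in stages.items()
        if pairs
    }
```

End-to-end accuracy is measured the same way over the pipeline's final outputs, which shows how errors compound across stages.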

Software Engineer | Python Developer

2019 - 2020
MaxinAI
  • Designed and developed REST APIs for various web applications using Python and Flask. Built automated test suites for API endpoints, unit tests, and integration tests to ensure code quality and reliability.
  • Implemented CI/CD pipelines using tools like Jenkins, CircleCI, or GitLab CI to automate the build, test, and deployment process.
  • Worked with AWS services like EC2, S3, RDS, and Lambda to deploy and scale applications in the cloud. Developed and maintained monitoring and logging solutions using tools like the CloudWatch stack to ensure high availability and performance.
  • Created a system for scraping new and popular events from websites such as Eventbrite and Ticketmaster.
  • Automated a scraping process using cron jobs and created a REST API for calling and managing spiders.
Technologies: Scrapy, Python, Elasticsearch, Pandas, Cron, Flask, APIs, REST APIs, Amazon Web Services (AWS), AWS Lambda, Selenium, MVC Frameworks, MySQL, CI/CD Pipelines, Back-end, Databases, Containerization, Pytest, Docker, PyCharm
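
The cron automation described above might look like the following crontab; the schedule, paths, and spider names are invented for illustration, and `scrapy crawl` with a `-o` feed is the standard CLI invocation.

```cron
# m h dom mon dow  command
0 3 * * *  cd /srv/events && scrapy crawl eventbrite -o feeds/eventbrite.jl
30 3 * * * cd /srv/events && scrapy crawl ticketmaster -o feeds/ticketmaster.jl
```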

Software Engineer | Python Developer

2019 - 2019
MaxinAI
  • Designed and developed REST APIs for various web applications using Python and Flask. Built automated test suites for API endpoints, unit tests, and integration tests to ensure code quality and reliability.
  • Implemented CI/CD pipelines using tools like Jenkins, CircleCI, or GitLab CI to automate the build, test, and deployment process.
  • Worked with AWS services like EC2, S3, RDS, and Lambda to deploy and scale applications in the cloud. Developed and maintained monitoring and logging solutions using tools like CloudWatch or ELK stack to ensure high availability and performance.
  • Improved application security by implementing authentication and authorization mechanisms like OAuth 2 or JWT tokens. Worked with databases like PostgreSQL, MySQL, or MongoDB to design and implement data models and data access layers for APIs.
  • Created an MVC project using web2py and raw JavaScript on the front end. Collaborated with front-end developers to integrate APIs into web and mobile applications.
  • Managed both front-end and back-end tasks, supporting later bug fixes.
Technologies: Python, APIs, REST APIs, Flask, Web2py, PostgreSQL, Elasticsearch, Amazon Web Services (AWS), AWS Lambda, MySQL, Telegram Bots, Back-end, Docker, PyCharm

Software Engineer | Python Developer

2018 - 2019
MaxinAI
  • Designed and developed REST APIs for various web applications using Python and Flask. Built automated test suites for API endpoints, unit tests, and integration tests to ensure code quality and reliability.
  • Implemented CI/CD pipelines using tools like CircleCI or GitLab CI to automate the build, test, and deployment process. Worked with AWS services to deploy and scale applications in the cloud.
  • Developed and maintained monitoring and logging solutions using tools like CloudWatch to ensure high availability and performance. Improved application security by implementing authentication and authorization mechanisms like OAuth 2 or JWT tokens.
  • Contributed to a Scrapy-based system for scraping and parsing the statutes of all 50 US states.
  • Worked with databases like PostgreSQL, MySQL, or MongoDB to design and implement data models and data access layers for APIs. Collaborated with front-end developers to integrate APIs into web and mobile applications.
Technologies: Python, Scrapy, Flask, APIs, REST APIs, API Integration, Linux, SQL, Data Scraping, Web Scraping, Cloud, Elasticsearch, Amazon Web Services (AWS), AWS Lambda, MySQL, Back-end, Docker, PyCharm

Mesh - Verifying Professionals

http://www.mesh.id
I was part of a team that developed a service for a software company to improve the process of verifying professionals. Our goal was to create a system that would help customers quickly verify their professional licenses without contacting support directly. To achieve this, we developed an on-demand live crawler that covered over 100 websites, interpreted customer queries, and returned license information.

To ensure the service was reliable and scalable, we deployed it on a cloud platform and set up a monitoring system to track performance and quickly identify any issues. We also regularly reviewed customer feedback and improved the service responses based on user input.

We decided to use a REST API to provide users with easy access to the service, which required me to build a robust back-end system in Python using the FastAPI web framework.

We collaborated closely as a team throughout the development process, leveraging cutting-edge technologies to create a high-quality product.

To ensure that the system was reliable and efficient, I performed extensive testing and evaluation on the data processing pipeline, including testing for edge cases and performance issues.

REST API for Climate-related Project

https://www.climate-x.com/
As a Python developer, I was involved in creating a REST application for a climate-related company that allowed their users to detect losses in building costs related to climate change, like flooding, heating, etc. Additionally, I played a crucial role in restructuring and refactoring the architecture of the existing code to improve its efficiency and scalability.

To start, I worked closely with the company's product team to understand the requirements and goals of the new application. We decided to use a REST API to provide users with easy access to the data, which required me to build a robust back-end system in Python using the FastAPI web framework.

After completing the initial development, I began restructuring the existing codebase to improve its efficiency and scalability. This involved identifying areas of the code that were slowing down the application and rewriting them to be more performant. Additionally, I worked to simplify the codebase and remove any unnecessary dependencies, which helped to make the code more maintainable and easier to work with.

Through this process, I was able to significantly improve the performance of the application and make it easier to add new features and functionality.

Juststream | Video Streaming Platform

I am the founder of Juststream.live, a serverless video streaming platform built on AWS. The platform is highly scalable and performant and served 500,000 monthly users.

The platform is designed around a serverless architecture, which allowed me to focus on building features rather than managing servers. It leveraged AWS services such as Lambda, API Gateway, AWS Elemental MediaConvert, Elastic Load Balancing, and CloudFront to give users a secure, reliable, and low-latency streaming experience. I also implemented monitoring and logging solutions to keep the platform's performance and health under constant observation.

Overall, the project demonstrated the power of serverless architecture and the capabilities of AWS services.

Big Data Collection and Management for a Social Media Platform

As a Python developer, I was tasked with writing a REST API and a big data crawler to collect profile data on more than 100 million users across different social media platforms for a marketing research company.

To start, I researched the different social media platforms and identified the necessary data points relevant to the marketing research project. I then wrote a Big Data crawler in Python that could collect this data from multiple sources, process it, and store it in a scalable database.

To collect the data efficiently and reliably, I designed the crawler to run on multiple servers in parallel so it could handle a large volume of data in a timely manner.
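
Running a crawler on multiple servers requires partitioning the work so no two servers fetch the same target. One common approach, sketched here under the assumption that URLs are hashed to worker shards (the function name is illustrative):

```python
import hashlib

def shard_for(url: str, num_workers: int) -> int:
    """Deterministically assign a URL to one of N crawler servers.

    Each server processes only URLs whose shard matches its own index,
    so the fleet partitions the queue without coordination.
    """
    digest = hashlib.md5(url.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_workers
```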

Once the data was collected and processed, I wrote a fast and reliable REST API in Python to allow the marketing research company to access and analyze the data easily.

To deploy the project on AWS, I used Amazon's Elastic Beanstalk service. This helped ensure the project could handle a large volume of traffic and data.

I worked closely with the marketing research company to ensure that the project met their needs and goals. I also performed testing to ensure that the system was reliable and secure.

US Statutes and Laws

As a Python developer, I took part in building a scalable web crawler to collect US statutes and laws from various legal websites. This involved designing a crawler architecture that could handle large amounts of data and was efficient in its execution.

To begin, I worked with the product team to identify the websites that contained the desired legal data and then created a crawler using Python libraries such as Scrapy and Selenium to scrape the web pages and store the data in a format that could be easily used in further processing.

To ensure that the crawler was efficient and scalable, I deployed it using AWS. This allowed us to process large amounts of data quickly and reliably.

Once the crawler was complete, I created a REST API in Flask that would provide authorized users access to the collected data.

I designed the API to be fault-tolerant, with data replication and load balancing, to ensure it could handle high traffic levels without downtime.

Throughout the development process, I worked closely with the product team to ensure that the final product met their requirements and goals. I performed testing and monitoring to ensure that the system was reliable and secure.

Allergen Checking on Products Using ML

As a Python developer, I built a pipeline for a software engineering project that checked for products' allergens and other information in their description on Amazon pages. The pipeline consisted of data extraction, cleaning, preprocessing, and deployment.

I researched the necessary data points to extract from the Amazon product pages. I used Python libraries such as Requests, Selenium, and Pandas to extract the data from Amazon product pages and clean it for further processing.

After cleaning the data, I developed a robust system that could efficiently process large datasets of product descriptions to identify the presence of allergens and other key information.
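
The final matching step can be illustrated with a simple keyword scan. This is a sketch only: the real pipeline used neural network models, and the allergen list below is invented for demonstration.

```python
# Hypothetical allergen vocabulary; the production system learned these
# from labeled data rather than using a fixed list.
ALLERGENS = {"milk", "egg", "peanut", "soy", "wheat", "fish", "sesame"}

def find_allergens(ingredients_text: str) -> set:
    """Return the allergen keywords present in an ingredients string."""
    text = ingredients_text.lower()
    return {a for a in ALLERGENS if a in text}
```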

Once the pipeline was developed and tested, I deployed it in a REST API using Flask, which allowed users to enter the Amazon product URL and retrieve information on the presence of allergens and other key information related to the product's ingredients and nutritional facts.

Throughout the development process, I worked closely with the product team to ensure the system met their requirements and goals. I also performed testing and monitoring to ensure the system was reliable and secure.

Languages

Python, SQL, XML, Python 3

Frameworks

Scrapy, Flask, Selenium, OAuth 2, Django, Web2py

Libraries/APIs

REST APIs, Pydantic, Pandas, FFmpeg, Garmin API, Fitbit API, Node.js, Vue 2

Tools

PyCharm, Pytest, Celery, Cron, Amazon Cognito, AWS IAM, Amazon Elastic Container Service (Amazon ECS), AWS CloudFormation, Git

Paradigms

Test Automation, API Architecture, Lambda Architecture, Microservices, DevOps, REST

Platforms

Docker, Linux, Amazon Web Services (AWS), AWS Lambda, Amazon EC2, AWS Elastic Beanstalk

Storage

Elasticsearch, MongoDB, PostgreSQL, MySQL, Redis, Databases, PostGIS, Neo4j, Amazon DynamoDB, Amazon S3 (AWS S3)

Other

FastAPI, APIs, API Integration, Data Scraping, Web Scraping, Cloud, Early-stage Startups, Bots, CI/CD Pipelines, Back-end, Serverless, OAuth, Python Dataclasses, API Applications, Lambda Functions, Algorithms, Data Structures, API Gateways, Software Architecture, Technical Leadership, Leadership, Architecture, MVC Frameworks, Telegram Bots, Discord Bots, SDKs, PIP, Software Packaging, Containerization, OCR, Video Streaming, Data Engineering, Amazon API Gateway, Front-end, Caching

2015 - 2019

Bachelor's Degree in Computer Science

Ivane Javakhishvili Tbilisi State University - Tbilisi, Georgia
