Dr. Aleksandre Liluashvili, Developer in Frankfurt, Hessen, Germany

Dr. Aleksandre Liluashvili

Verified Expert in Engineering

Solutions Architect and Developer

Location
Frankfurt, Hessen, Germany
Toptal Member Since
October 12, 2022

Aleksandre completed his PhD in theoretical physics in 2014, dedicating a significant part of his thesis to numerical algorithms and runtime optimization. He then moved into software development, which he found to be his true passion. He has been coding for over five years, working on many projects for major German banks, primarily using Java, SQL, and Python. Aleksandre enjoys challenging tasks involving performance optimization and delivering complex software on time.

Portfolio

Genesco - Main
SQL, Data Engineering, Java, Python, Oracle, JetBrains
Alteryx - Enterprise Applications IT Ops
SQL, Python, ETL, ETL Tools, PostgreSQL, Spring, Minimum Viable Product (MVP)...
IT company in banking industry
Java, SQL, Optimization, Microsoft SQL Server, Spring, Hibernate, JPA, Back-end...

Experience

Availability

Part-time

Preferred Environment

Windows, Linux, IntelliJ IDEA, PostgreSQL, Java, Python, PyCharm, Data, SQL, Amazon Web Services (AWS)

The most amazing...

...thing I've developed is a market data loading (ETL) application that loads terabytes of data into an SQL database daily.

Work Experience

Data Engineer

2024 - 2024
Genesco - Main
  • Automated the booking of purchase order receipts into the ERP system using Python and an Oracle database.
  • Developed and rigorously tested highly complex SQL queries to effectively manage the matching of purchase order data, encompassing both customer and ERP datasets.
  • Extended the Python framework to facilitate bulk loading of large volumes of data into an Oracle database.
Technologies: SQL, Data Engineering, Java, Python, Oracle, JetBrains
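The bulk-loading bullet above boils down to batching inserts instead of issuing one statement per row. A minimal sketch of that pattern, using `executemany()`; the real project targeted Oracle, so the table name, columns, and the sqlite3 stand-in driver here are all illustrative assumptions:

```python
import sqlite3

# Sketch of bulk-loading receipt rows with executemany(); the real project
# used an Oracle database, so sqlite3 here is only a stand-in driver, and
# the po_receipts schema is invented for illustration.
def bulk_load(conn, rows):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS po_receipts (po_number TEXT, qty INTEGER)"
    )
    # One batched call instead of one INSERT round trip per row.
    conn.executemany("INSERT INTO po_receipts VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
bulk_load(conn, [("PO-1", 10), ("PO-2", 5), ("PO-3", 7)])
print(conn.execute("SELECT COUNT(*) FROM po_receipts").fetchone()[0])  # → 3
```

With Oracle's Python driver the call shape is the same; the win comes from cutting per-row round trips, which dominates load time at large volumes.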

SQL Developer

2023 - 2024
Alteryx - Enterprise Applications IT Ops
  • Designed and optimized the database structure to store and efficiently retrieve millions of records daily.
  • Created a Python-based web application and hosted it on AWS so the whole company could easily analyze the existing ETL workflows.
  • Built and extended the CI/CD pipelines in GitLab to make the data management system robust.
  • Improved the performance and quality of the Alteryx designer workflows.
Technologies: SQL, Python, ETL, ETL Tools, PostgreSQL, Spring, Minimum Viable Product (MVP), Amazon EC2, Amazon S3 (AWS S3), Amazon Web Services (AWS), CI/CD Pipelines, Alteryx, GitLab, PyCharm, Pandas, Git, Back-end, Pipelines, REST APIs, Cloud, Snowflake, Liquibase, Scrum, Data Migration, DevOps, JetBrains

Senior Back-end Developer

2023 - 2023
IT company in banking industry
  • Improved overall system performance by restructuring the table indexes in the Microsoft SQL Server database.
  • Tuned the SQL queries issued through the Hibernate interface, making the system significantly more performant.
  • Worked on a Java back-end application for tax declaration.
Technologies: Java, SQL, Optimization, Microsoft SQL Server, Spring, Hibernate, JPA, Back-end, IntelliJ IDEA, JetBrains
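The index-restructuring work described above can be demonstrated with a query plan before and after adding an index. This sketch uses sqlite3 and `EXPLAIN QUERY PLAN` purely as a stand-in for the SQL Server tooling; the table and column names are invented:

```python
import sqlite3

# Illustrative only: how adding an index changes a query plan from a full
# scan to an index lookup. The project used Microsoft SQL Server; sqlite3
# and its EXPLAIN QUERY PLAN output stand in here, with a made-up schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tax_items (declaration_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO tax_items VALUES (?, ?)",
                 [(i % 100, i * 1.0) for i in range(1000)])

def plan(sql):
    # Each plan row's last column is a human-readable step description.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(amount) FROM tax_items WHERE declaration_id = 42"
before = plan(query)   # full table scan
conn.execute("CREATE INDEX idx_decl ON tax_items(declaration_id)")
after = plan(query)    # index lookup
print(before)
print(after)
```

On SQL Server the equivalent check is the execution plan (table scan vs. index seek), but the principle is the same: match indexes to the predicates the hot queries actually filter on.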

Python Developer

2023 - 2023
Global Logistics Company
  • Developed a Python application to load logistics data into an SQLite database.
  • Developed SQL queries and used the pandas DataFrame to perform data analysis.
  • Created reports and used Google Maps API to visualize the data.
Technologies: Python, Reports, CSV, CSV File Processing, Python Boolean, Boolean Search, Visualization, SQL, SQLite, Pandas, NumPy, Google Maps API, JetBrains

Architect

2022 - 2023
Exela Technologies
  • Developed a smart copying tool with a Java back end to load large volumes of data files into the archiving system. Optimized SQL queries and Java Database Connectivity (JDBC) usage to improve performance.
  • Used Python, NumPy, and pandas to analyze the statistics and generate reports for the operating department.
  • Used Git and Subversion (SVN) for version control and Gradle to automate the build process.
  • Used Jira to track all tasks and subtasks across the development team.
Technologies: Java, PostgreSQL, Bash, Jira, Linux, JDBC, Gradle, Data Engineering, Algorithms, SQL, Scrum, Databases, H2 Database, Optimization, IntelliJ IDEA, Windows, SQL Performance, Spring Core, IBM Db2, Microsoft Excel, CSV, JSON, Python, Git, Subversion (SVN), C#, Back-end, Spring Boot, Performance Tuning, Query Optimization, Multithreading, Data Architecture, Data Modeling, Stored Procedure, Python 3, PySpark, Spark, Data Pipelines, ETL, Pipelines, GitHub, Data, PL/SQL, Oracle Database, Data Visualization, Relational Databases, XML, APIs, REST, ETL Tools, SQL DML, Spring, Amazon Web Services (AWS), Amazon S3 (AWS S3), Amazon EC2, AWS Lambda, CI/CD Pipelines, REST APIs, Cloud, JetBrains
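The smart copying tool's core idea, copying many files concurrently while collecting a per-file result for later tracking, can be sketched briefly. The real tool was a Java back end; this Python version, with invented names and throwaway temp files, only illustrates the parallel-copy pattern:

```python
import shutil
import tempfile
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

# Sketch of the parallel-copy idea: copy files concurrently and return one
# result per file so a database could later track each transaction. All
# names and the directory layout are illustrative, not the real tool's.
def copy_all(sources, dest_dir, workers=4):
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)

    def copy_one(src):
        target = dest / Path(src).name
        shutil.copy2(src, target)  # copy2 also preserves timestamps
        return str(target)

    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(copy_one, sources))

# Demo with throwaway temp files.
src_dir = Path(tempfile.mkdtemp())
files = []
for i in range(3):
    p = src_dir / f"doc_{i}.txt"
    p.write_text(f"document {i}")
    files.append(str(p))

copied = copy_all(files, tempfile.mkdtemp())
print(len(copied))  # → 3
```

File copying is I/O-bound, so a thread pool (rather than processes) is usually the right concurrency choice here.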

Senior IT Consultant

2018 - 2022
UCG United Consulting Group
  • Developed a back-end (ETL) application to load vast amounts of market data into a database daily. Optimized the queries and the data loading process to achieve high throughput.
  • Optimized the database structure to store huge amounts of data efficiently.
  • Used Tableau to generate reports for the operations department.
  • Used Jira as a reporting tool to track all the tickets and related tasks.
  • Worked on a document management software project and optimized the Java back-end and SQL queries to speed up the processes successfully.
Technologies: Java, IntelliJ IDEA, PostgreSQL, Oracle, Bash, Linux, JDBC, Gradle, Data Engineering, Algorithms, SQL, Scrum, Databases, Optimization, Windows, SQL Performance, Spring Core, Microsoft Power BI, Microsoft Excel, CSV, JSON, NumPy, Matplotlib, Git, HTML, C#, Back-end, Spring Boot, Performance Tuning, Query Optimization, Multithreading, Data Architecture, Data Modeling, Stored Procedure, Python 3, PySpark, Spark, Data Pipelines, ETL, Pipelines, JavaScript, GitHub, Data, PL/SQL, Oracle Database, Tableau, Data Visualization, Relational Databases, MySQL, Full-stack, XML, APIs, REST, ETL Tools, SQL DML, Spring, CI/CD Pipelines, REST APIs, Cloud, JetBrains

Data Engineer

2020 - 2020
UCG United Consulting Group
  • Created a database model for a training application in which users could register, add questions, take a test, and check their final scores. All user data was saved in the database, and the results were evaluated in reports.
  • Used Jira to track all open tickets and to monitor and evaluate time spent on each ticket.
  • Developed the application in a small team following Scrum principles.
Technologies: SQL, PostgreSQL, Algorithms, Multithreading, APIs, Data Engineering, Databases, Data Architecture, Python 3, PySpark, Spark, Data Pipelines, ETL, Pipelines, JavaScript, GitHub, Data, PL/SQL, Oracle Database, Data Visualization, Relational Databases, XML, ETL Tools, SQL DML, Mathematics, IntelliJ IDEA, JetBrains

Quantitative Analyst

2018 - 2018
PwC Germany
  • Collaborated with other team members on a quantitative risk modeling software. The back end was developed in C# and C++.
  • Used SQL queries to analyze data and create reports for the customer.
  • Utilized Agile methods (Scrum and Kanban) to quickly adjust to the customer's needs.
Technologies: SQL, Statistics, C++, GitHub, Oracle Database, Relational Databases, XML, IntelliJ IDEA

PhD Student

2014 - 2017
German Aerospace Center
  • Developed a theory to describe bacteria movement in a dense environment.
  • Utilized a supercomputer to solve numerical algorithms and generate data for further analysis.
  • Analyzed and visualized terabytes of data using Python with Matplotlib, NumPy, and pandas.
  • Published the results in a renowned physics journal.
Technologies: Python 3, Data, Visualization, ETL, SQL, Matplotlib, NumPy, Pandas

Market Data Loader

A Java-based back-end (ETL) application that loads vast amounts of market data, including data from Bloomberg, into an SQL database.

My contribution to this project involved optimizing and parallelizing the software to improve performance and using Java Database Connectivity (JDBC) to load data into the database. I designed and optimized the database model to efficiently handle huge amounts of data.

I worked on tuning SQL queries to speed up the application and collaborated closely with the client in an agile environment to adjust requirements quickly.
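The loader's parallelization can be outlined as: parse raw market-data chunks concurrently, then write each parsed chunk as one batched insert. The real loader was Java/JDBC against a production database; this Python/sqlite3 sketch, with an invented record format, only illustrates the chunk-parse-batch shape:

```python
import sqlite3
from concurrent.futures import ThreadPoolExecutor

# Sketch only: the real loader was Java/JDBC. The "TICKER,price" record
# format and the quotes table are made up for illustration.
def parse_chunk(lines):
    return [(t, float(p)) for t, p in (ln.split(",") for ln in lines)]

raw = [f"TCK{i},{100 + i}" for i in range(10)]
chunks = [raw[i:i + 5] for i in range(0, len(raw), 5)]

# CPU-side parsing fans out across threads...
with ThreadPoolExecutor() as pool:
    parsed = list(pool.map(parse_chunk, chunks))

# ...while each chunk becomes a single batched insert.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE quotes (ticker TEXT, price REAL)")
for batch in parsed:
    conn.executemany("INSERT INTO quotes VALUES (?, ?)", batch)
conn.commit()
print(conn.execute("SELECT COUNT(*) FROM quotes").fetchone()[0])  # → 10
```

In JDBC the batched write corresponds to `addBatch()`/`executeBatch()`; the pattern of parallel parsing feeding batched writes is what keeps a daily terabyte-scale load within its window.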

Smart Document API

A back-end application to copy huge amounts of data to different types of server systems. The application runs on Linux and Windows systems and utilizes an SQL-based database (PostgreSQL, H2, DB2, Oracle) to track all the transactions.

I acted as the architect and designed and implemented the back-end application and the database model. Further, I optimized queries to generate reports for the operating department.

Document Loader API

A Java-based back-end application that transfers thousands of documents (bills and workflows) into an archiving system. I used an Oracle database for later monitoring of the documents and their metadata. The application is highly parallelized so that it remains responsive at all times.

Data Analysis Tool

During my time as a PhD student in physics, I implemented a Python-based analysis tool to evaluate terabytes of data on bacterial behavior in dense environments. I used NumPy, pandas, and Matplotlib libraries to analyze and visualize the data.

The results were presented at many conferences and published in renowned physics journals.
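A toy version of the kind of trajectory analysis such a tool performs is the mean squared displacement (MSD) of particle positions, a standard observable for bacterial motion. The real tool used NumPy, pandas, and Matplotlib on terabytes of simulation output; this stdlib-only sketch with a hand-made 1-D trajectory is purely illustrative:

```python
import statistics

# Illustrative sketch: mean squared displacement (MSD) at a given time lag,
# a standard measure of how far particles spread. The trajectory below is a
# tiny hand-made sample, not real simulation data.
def msd(positions, lag):
    return statistics.mean(
        (positions[i + lag] - positions[i]) ** 2
        for i in range(len(positions) - lag)
    )

trajectory = [0.0, 1.0, 1.5, 3.0, 2.5, 4.0]
print(round(msd(trajectory, 1), 3))  # → 1.2
```

At scale the same computation is vectorized with NumPy over millions of trajectories, and the MSD-versus-lag curve is what distinguishes diffusive from ballistic motion.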

Logistics Data Automation

A Python-based application to load logistics data into an SQLite database and perform data analysis using the pandas DataFrame and SQL queries. The main purpose of the application was to create reports and use Google Maps API to visualize the data.
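The load-then-report flow described above, rows into SQLite, then a summary via one SQL query, can be sketched in a few lines. The `shipments` schema and sample rows are invented for illustration:

```python
import sqlite3

# Sketch of the load-then-report flow. Column names and the sample rows
# are assumptions made for this example, not the real logistics schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE shipments (destination TEXT, weight_kg REAL)")
conn.executemany(
    "INSERT INTO shipments VALUES (?, ?)",
    [("Frankfurt", 120.0), ("Berlin", 80.5), ("Frankfurt", 45.5)],
)

# One GROUP BY query produces the per-destination report.
report = conn.execute(
    """SELECT destination, COUNT(*) AS shipments, SUM(weight_kg) AS total_kg
       FROM shipments GROUP BY destination ORDER BY destination"""
).fetchall()
print(report)  # → [('Berlin', 1, 80.5), ('Frankfurt', 2, 165.5)]
```

The same rows can be pulled into a pandas DataFrame (`pd.read_sql_query`) when the analysis outgrows plain SQL, which matches the pandas-plus-SQL split described in the project.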

Data Management System

I designed the database structure for a data management platform that stores and retrieves large amounts of data daily. Alteryx Designer was used as the ETL tool. I created a CI/CD process to improve robustness and to catch and log errors before they reach production. I also automated the deployment process.

AWS Back End

I developed the AWS back end to run the data management system.
It included dedicated Linux EC2 instances for the core calculations, Lambda functions, Amazon RDS hosting the database, REST APIs to fetch logging information from the database, a load balancer serving internal web applications, and S3 for storing and backing up evaluated results.
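The "REST APIs to fetch logging information" piece can be sketched as a minimal Lambda-style handler. The event shape, field names, and in-memory sample data are all assumptions; a real handler would query Amazon RDS via a database driver instead:

```python
import json

# Minimal sketch of a Lambda-style handler for a "fetch logging info" REST
# endpoint. Event shape and field names are assumptions; a real handler
# would query Amazon RDS rather than this in-memory sample.
SAMPLE_LOGS = [
    {"job": "daily_load", "status": "OK"},
    {"job": "backfill", "status": "FAILED"},
]

def lambda_handler(event, context=None):
    # Optional ?status= filter, as API Gateway would pass it through.
    status = (event.get("queryStringParameters") or {}).get("status")
    rows = [r for r in SAMPLE_LOGS if status is None or r["status"] == status]
    return {"statusCode": 200, "body": json.dumps(rows)}

resp = lambda_handler({"queryStringParameters": {"status": "OK"}})
print(resp["statusCode"], resp["body"])
```

Keeping the handler a pure function of the event makes it trivially unit-testable outside AWS, which fits the CI/CD setup described for this system.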
Education

2014 - 2017

PhD in Theoretical Physics

German Aerospace Center (DLR) - Cologne, Germany

2012 - 2014

Master's Degree in Physics

Heidelberg University - Heidelberg, Germany

2008 - 2012

Bachelor's Degree in Physics

Heidelberg University - Heidelberg, Germany

Certifications

JULY 2021 - JULY 2024

AWS Certified Solutions Architect

Amazon Web Services

MAY 2021 - PRESENT

MTA: Introduction to Programming Using Python

Microsoft

AUGUST 2018 - PRESENT

Exam 70-483: Programming in C#

Microsoft

AUGUST 2018 - PRESENT

Professional Scrum Master I

Scrum.org

JUNE 2018 - PRESENT

Oracle Database SQL Certified Associate

Oracle

Languages

Python, Java, SQL, Bash, Stored Procedure, Python 3, SQL DML, Snowflake, HTML, C#, C++, JavaScript, XML

Libraries/APIs

JDBC, Pandas, NumPy, Matplotlib, Google Maps API, REST APIs, PySpark, Liquibase

Tools

LaTeX, IntelliJ IDEA, PyCharm, Git, JetBrains, Mathematica, Jira, Gradle, Microsoft Excel, Subversion (SVN), GitHub, GitLab, Microsoft Power BI, Tableau, Amazon CloudWatch, Amazon Simple Queue Service (SQS)

Paradigms

ETL, Scrum, REST, DevOps

Platforms

Windows, Alteryx, Linux, Oracle, Amazon Web Services (AWS), Oracle Database, Amazon EC2, AWS Lambda, Amazon Linux

Storage

PostgreSQL, Databases, Relational Databases, H2 Database, SQL Performance, JSON, PL/SQL, MySQL, SQLite, Amazon S3 (AWS S3), IBM Db2, Data Pipelines, Microsoft SQL Server

Other

Statistics, Calculus, Linear Algebra, Algorithms, Data Engineering, Back-end, Multithreading, Data, CSV File Processing, Mathematics, Numerical Methods, Optimization, Encryption, CSV, APIs, Performance Tuning, Query Optimization, Data Architecture, Data Modeling, Pipelines, Data Visualization, Full-stack, Visualization, ETL Tools, Reports, Python Boolean, Boolean Search, CI/CD Pipelines, Cloud, Data Migration, SFTP, Obfuscation, Data Encryption, Minimum Viable Product (MVP), Amazon RDS

Frameworks

Spring Boot, Spring, Spring Core, Spark, Hibernate, JPA
