Romualdas Randamanskas, Developer in Vilnius, Lithuania

Romualdas Randamanskas

Verified Expert in Engineering

Bio

Romualdas is a senior data engineer with a keen interest in new ETL technologies and data analysis best practices for delivering challenging projects. Experienced in developing efficient data pipelines for varied sources and building interactive analysis tools, he enjoys exploring open datasets to validate experimental data processing and analysis techniques and to make data accessible to everyone. He is a highly skilled professional eager to take on new projects.

Portfolio

RoyaltyRange
Apache Airflow, Python, Natural Language Processing (NLP)...
Danske Bank
Python, Spark, Data Quality, Data Governance, Data Lineage, Big Data...
Platforma LT, UAB
Dask, Python, MongoDB, Azure, GPU Computing, Flask, Plotly, ETL, Data Modeling...

Experience

  • Python - 4 years
  • Data Quality - 3 years
  • CI/CD Pipelines - 3 years
  • Apache Airflow - 3 years
  • Spark - 3 years
  • PostgreSQL - 2 years
  • Google Cloud - 2 years
  • Dask - 2 years

Availability

Part-time

Preferred Environment

Linux, Google Cloud, Amazon Web Services (AWS), Windows, Python, Visual Studio Code (VS Code), Apache Airflow, Spark, Pandas, Dask

The most amazing...

...project I've developed is a B2B event data platform built with Google Cloud Platform services such as Cloud SQL, BigQuery, and Cloud Data Fusion.
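
As a rough illustration, here is a minimal sketch of how event data could be loaded into BigQuery with the google-cloud-bigquery client; the project, dataset, table, and bucket names are hypothetical placeholders, and the actual platform also relied on Cloud SQL and Cloud Data Fusion.

# Minimal sketch: load newline-delimited JSON event files into BigQuery.
# The table ID and GCS path are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # picks up credentials from the environment

table_id = "my-project.b2b_events.raw_events"  # hypothetical
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,                   # infer the schema from the files
    write_disposition="WRITE_APPEND",  # append each batch to the table
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/events/*.json",  # hypothetical GCS path
    table_id,
    job_config=job_config,
)
load_job.result()  # block until the load job finishes
print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")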

Work Experience

Senior Data Engineer

2022 - 2022
RoyaltyRange
  • Maintained and improved data acquisition and storage procedures (see the pipeline sketch after this entry).
  • Designed and developed effective data models, ensuring data quality and clarity.
  • Established various services using DevOps methodology and integrated them into the CI/CD pipeline.
Technologies: Apache Airflow, Python, Generative Pre-trained Transformers (GPT), Natural Language Processing (NLP), Data Quality, Web Scraping, PostgreSQL, Data Engineering, ETL Tools, SQL, Data Modeling, Big Data, Azure, ETL
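
A minimal sketch of the kind of scrape-and-load Airflow DAG implied by this role, assuming Airflow 2.4+ with the TaskFlow API; the source URL, connection ID, table, and field names are hypothetical.

# Minimal sketch of a daily scrape-and-load DAG (Airflow 2.4+, TaskFlow API).
# The URL, connection ID, and table/field names are hypothetical placeholders.
from datetime import datetime

import requests
from airflow.decorators import dag, task
from airflow.providers.postgres.hooks.postgres import PostgresHook


@dag(schedule="@daily", start_date=datetime(2022, 1, 1), catchup=False)
def scrape_and_load():
    @task
    def scrape() -> list:
        # Fetch raw records from a hypothetical source endpoint.
        resp = requests.get("https://example.com/api/records", timeout=30)
        resp.raise_for_status()
        return resp.json()

    @task
    def load(records: list) -> None:
        # Insert the scraped records into a hypothetical staging table.
        hook = PostgresHook(postgres_conn_id="warehouse")  # hypothetical connection
        hook.insert_rows(
            table="staging.scraped_records",
            rows=[(r["id"], r["name"]) for r in records],
            target_fields=["id", "name"],
        )

    load(scrape())


scrape_and_load()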

Senior Data Engineer

2021 - 2022
Danske Bank
  • Deployed machine learning models and delivered end-to-end data science solutions within the commercial, risk, compliance, and fraud domains.
  • Cooperated with data scientists and MLOps engineers in setting the data engineering direction.
  • Built data pipelines for the statistical models using the Hadoop ecosystem, including Spark, NoSQL, and HDFS (see the sketch after this entry).
Technologies: Python, Spark, Data Quality, Data Governance, Data Lineage, Big Data, Data Engineering, ETL Tools, DevOps, CI/CD Pipelines, Apache Airflow, SQL, Amazon Web Services (AWS), Data Modeling, ETL
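
A minimal sketch of a Spark pipeline of the kind described in this role: read raw events from HDFS, aggregate them per customer, and write partitioned Parquet for model training. The paths and column names are hypothetical.

# Minimal sketch of a PySpark feature pipeline for a statistical model.
# HDFS paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("feature-pipeline").getOrCreate()

events = spark.read.parquet("hdfs:///data/raw/transactions")  # hypothetical path

features = (
    events
    .filter(F.col("amount") > 0)
    .groupBy("customer_id", "event_date")
    .agg(
        F.count("*").alias("tx_count"),
        F.sum("amount").alias("tx_total"),
        F.avg("amount").alias("tx_avg"),
    )
)

(features.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("hdfs:///data/features/transactions_daily"))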

Data Engineer

2020 - 2021
Platforma LT, UAB
  • Built fast and efficient data processing pipelines with Python and Dask (see the sketch after this entry).
  • Designed complex data structures with MongoDB and Azure Table storage.
  • Conducted time series data analysis and predictive modeling with Pandas, NumPy, and Scikit-learn.
  • Created interactive web dashboards with Flask and Plotly.
Technologies: Dask, Python, MongoDB, Azure, GPU Computing, Flask, Plotly, ETL, Data Modeling, PostgreSQL
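
A minimal sketch of a Dask batch pipeline like the ones described in this role: read many CSV partitions in parallel, clean them, and compute a per-device daily aggregate. The file pattern and column names are hypothetical.

# Minimal sketch of a Dask batch pipeline; paths and columns are placeholders.
import dask.dataframe as dd

df = dd.read_csv("data/measurements-*.csv", parse_dates=["timestamp"])

daily = (
    df.dropna(subset=["value"])
      .assign(date=df["timestamp"].dt.floor("D"))
      .groupby(["device_id", "date"])["value"]
      .mean()
      .compute()  # triggers the parallel computation
)

print(daily.head())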

Digital Systems Engineer

2019 - 2020
AKKA Benelux
  • Developed data analytics algorithms in Python to predict landing gear system failures by identifying the target aircraft systems and related subcomponents (see the sketch after this entry).
  • Identified potential system failures, directed maintenance efforts, and monitored the system performance over time.
  • Compiled technical reports for stakeholders explaining objectives, KPIs, and the analysis results.
Technologies: Python, Spark, Big Data, Systems Engineering, Predictive Modeling, Technical Reports, SQL
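
A minimal sketch, on hypothetical sensor features, of the kind of failure-prediction model described in this role: a scikit-learn classifier with a simple train/test evaluation.

# Minimal sketch of a failure-prediction model on hypothetical sensor data.
# The CSV path, feature names, and label column are placeholders.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

data = pd.read_csv("landing_gear_sensors.csv")  # hypothetical extract
features = ["oil_pressure", "shock_absorber_travel", "brake_temperature"]
X, y = data[features], data["failure_within_30_days"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

model = GradientBoostingClassifier().fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))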

Data Engineer

2018 - 2019
AKKA Benelux
  • Analyzed data to identify problem statements and deliver potential machine learning solutions.
  • Researched new machine learning frameworks and developed proof-of-concept prototypes.
  • Built data-driven web applications and deployed them on AWS with Python and JavaScript.
Technologies: Python, Big Data, Data Engineering, Machine Learning, Databases, JavaScript, Amazon Web Services (AWS), Data Modeling, SQL

Front-end Developer

2017 - 2018
Kemdu
  • Became proficient in front-end fundamentals and frameworks.
  • Implemented cross-browser testing and handled compatibility issues with custom JavaScript code.
  • Developed custom themes for new and existing WordPress web pages.
Technologies: PHP, WordPress, CSS, HTML, JavaScript, jQuery

Experience

Projects for Deep Learning Nanodegree

https://github.com/romran/DLND-projects
Took part in the PyTorch Scholarship Challenge from Facebook, where only the top 300 students from the initial Challenge Course were selected for the Deep Learning Nanodegree program. A minimal illustrative example follows the syllabus.

Syllabus:
• Neural Networks
• Convolutional Neural Networks
• Recurrent Neural Networks
• Generative Adversarial Networks
• Model deployment with AWS SageMaker.
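
For illustration, here is a minimal PyTorch sketch of the kind of convolutional classifier covered in the CNN part of the syllabus; the input size and class count are arbitrary.

# Minimal sketch of a small convolutional classifier in PyTorch
# (e.g., for 32x32 RGB images with 10 classes).
import torch
from torch import nn


class SmallCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)  # (N, 32, 8, 8) for 32x32 inputs
        return self.classifier(x.flatten(1))


# Smoke test with a random batch of four 32x32 RGB images.
logits = SmallCNN()(torch.randn(4, 3, 32, 32))
print(logits.shape)  # torch.Size([4, 10])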

Projects for Deep Reinforcement Learning Nanodegree

https://github.com/romran/DRLND-projects
I also participated in the Deep Reinforcement Learning Nanodegree program, receiving one of the 200 scholarships awarded in Phase 3 of the Facebook Deep Learning Scholarship program. A minimal illustrative example follows the syllabus.

Syllabus:
• Foundations of Reinforcement Learning
• Value-Based Methods
• Policy-Based Methods
• Multi-Agent Reinforcement Learning
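
For illustration, a minimal sketch of tabular Q-learning, the simplest value-based method in the syllabus. The environment here is a placeholder, assumed to expose reset() returning a state and step(action) returning (next_state, reward, done).

# Minimal sketch of tabular Q-learning with an epsilon-greedy policy.
# `env` is a placeholder with reset() -> state and step(a) -> (state, reward, done).
import random
from collections import defaultdict

ALPHA, GAMMA, EPSILON, N_ACTIONS = 0.1, 0.99, 0.1, 4
Q = defaultdict(lambda: [0.0] * N_ACTIONS)  # Q[state][action]


def epsilon_greedy(state):
    if random.random() < EPSILON:
        return random.randrange(N_ACTIONS)  # explore
    return max(range(N_ACTIONS), key=lambda a: Q[state][a])  # exploit


def train_episode(env):
    state, done = env.reset(), False
    while not done:
        action = epsilon_greedy(state)
        next_state, reward, done = env.step(action)
        # Q-learning update: move Q(s, a) toward the bootstrapped target.
        target = reward + GAMMA * max(Q[next_state]) * (not done)
        Q[state][action] += ALPHA * (target - Q[state][action])
        state = next_state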

Projects for Cloud Native Application Architecture Nanodegree Program

https://github.com/romran/CNAAND-projects
In this program, I learned how to run and manage scalable applications in a cloud-native environment using open source tools and projects such as ArgoCD, gRPC, and Grafana. I also learned how to identify the best application architecture for an organization's needs, design a microservice architecture with cloud-native tools and patterns, implement Kubernetes security best practices, and use dashboards to diagnose, troubleshoot, and improve site reliability.

Syllabus:
• Cloud Native Fundamentals
• Message Passing
• Observability
• Microservices Security

NASA QuakeHunter – Earthquakes Activity Analysis Update

https://2022.spaceappschallenge.org/challenges/2022-challenges/earth-data-analysis-developers-wanted/teams/eqsight/project
Participated in the NASA International Space Apps Challenge as part of the EQsight team, which worked on an update of QuakeHunter, an earthquake activity analysis app.

The EQsight team noticed that the USGS openly provides valuable earthquake catalog API data and decided to use it to upgrade QuakeHunter, an existing Earth data visualization app. We chose the GeoJSON data because it could serve as a point of reference for advanced predictive models, anomaly detection, and distributional analysis. After reviewing the QuakeHunter repository, we focused on refactoring the earthquake activity time-series graphic, updating the algorithm to correctly count daily earthquakes for a given coordinate point and radius. We also added a feature to find the maximum daily magnitude on the Richter scale to test the hypothesis that an increased number of earthquakes could indicate higher-magnitude earthquakes.
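
A minimal sketch of the daily-count logic described above: query the USGS earthquake catalog API in GeoJSON format for events around a coordinate and radius, then count events and take the maximum magnitude per day. The example coordinates and date range are arbitrary.

# Minimal sketch: daily earthquake counts and maximum magnitude from the
# USGS earthquake catalog (GeoJSON) around an arbitrary point and radius.
import pandas as pd
import requests

USGS_URL = "https://earthquake.usgs.gov/fdsnws/event/1/query"
params = {
    "format": "geojson",
    "starttime": "2022-01-01",
    "endtime": "2022-02-01",
    "latitude": 35.0,   # arbitrary example coordinate
    "longitude": 25.0,
    "maxradiuskm": 300,
}

features = requests.get(USGS_URL, params=params, timeout=30).json()["features"]
quakes = pd.DataFrame(
    [
        {
            "time": pd.to_datetime(f["properties"]["time"], unit="ms"),
            "magnitude": f["properties"]["mag"],
        }
        for f in features
    ]
)

daily = quakes.groupby(quakes["time"].dt.date).agg(
    count=("magnitude", "size"),
    max_magnitude=("magnitude", "max"),
)
print(daily)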

Education

2014 - 2015

Bachelor’s Degree (Erasmus) in Computer Science

Vienna University of Technology - Vienna, Austria

2014 - 2014

Bachelor’s Degree (Erasmus) in Computer Science

Faculty of Engineering, University of Porto - Porto, Portugal

Certifications

DECEMBER 2021 - DECEMBER 2024

AWS Certified Cloud Practitioner

Amazon Web Services (AWS)

DECEMBER 2021 - PRESENT

Cloud Native Application Architecture | Nanodegree Program

Udacity

NOVEMBER 2019 - PRESENT

Deep Reinforcement Learning for Enterprise

Udacity

APRIL 2019 - PRESENT

Deep Learning

Udacity

JUNE 2018 - PRESENT

Mobile Web Specialist

Udacity

Skills

Libraries/APIs

Pandas, PyTorch, Dask, jQuery

Tools

Apache Airflow, Grafana, Plotly

Languages

Python, JavaScript, CSS, HTML, C++, PHP, SQL

Frameworks

Spark, gRPC, Flask

Platforms

Jupyter Notebook, Kubernetes, Apache Kafka, Linux, Windows, Visual Studio Code (VS Code), WordPress, Azure, Docker, Web, Amazon Web Services (AWS)

Storage

Google Cloud, PostgreSQL, Databases, MongoDB

Paradigms

Offline-first Development, Responsive Layout, Mobile Web Design, DevOps, ETL

Other

Data Quality, ETL Tools, CI/CD Pipelines, Argo CD, Deep Reinforcement Learning, Deep Learning, Machine Learning, Web Accessibility, AWS Certified Cloud Practitioner, Materials Science, Similarity-based Modeling (SBM), Digital Communication, Coding, Optimization, Computer Graphics, Computer Aided Software Engineering, Building & Construction, Big Data, Data Engineering, Systems Engineering, Predictive Modeling, Technical Reports, GPU Computing, Data Governance, Data Lineage, Natural Language Processing (NLP), Web Scraping, Neural Networks, Data, APIs, Source Code Review, Generative Pre-trained Transformers (GPT), Data Modeling
