Liviu Enescu, Developer in Bucharest, Romania

Liviu Enescu

Verified Expert in Engineering

Data Engineer and Developer

Location
Bucharest, Romania
Toptal Member Since
October 10, 2022

Liviu is a software developer with more than 15 years of experience working in startups and large corporate environments. He focuses on data processing and ETL flows. Liviu specializes in data lakes built on big data Hadoop clusters, as well as in traditional data warehouses in relational databases.

Portfolio

nDimensional, Inc.
Scala, Kubernetes, Akka, Play, Spark, Flink, Apache Kafka...
A Top European Bank
Scala, Apache Spark, Impala, Apache Hive, SQL, Linux, Hadoop, Cloudera, Bash...
Freelance
Scala, Apache Spark, Docker, Python, SQL, ETL, Akka, PostgreSQL, Bash, Unix...

Experience

Availability

Full-time

Preferred Environment

Git, IntelliJ IDEA, Linux

The most amazing...

...project I've worked on is an integrated data platform that can be used for data ingestion, transformation, and analytics.

Work Experience

Scala Developer

2022 - 2023
nDimensional, Inc.
  • Took a central role in a startup environment, focusing on upgrading Scala and related frameworks (Akka, Play, Spark, and Flink) to newer versions for a data analytics product.
  • Contributed actively to architectural meetings on migrating the monolithic application to a microservices architecture.
  • Collaborated closely with the team to understand project requirements and deliverables, actively participating in code reviews and providing valuable insights.
Technologies: Scala, Kubernetes, Akka, Play, Spark, Flink, Apache Kafka, Amazon Web Services (AWS)

Senior Big Data Developer

2022 - 2023
A Top European Bank
  • Built various data processing pipelines capable of handling large volumes of data using Scala and Apache Spark frameworks. The pipelines were deployed in a Cloudera Hadoop distribution.
  • Optimized the existing Apache Hive and Impala infrastructure to increase the platform's stability and enable current processes to handle massive data volumes in limited time frames.
  • Participated actively in architectural meetings, proposing alternative and more efficient solutions.
Technologies: Scala, Apache Spark, Impala, Apache Hive, SQL, Linux, Hadoop, Cloudera, Bash, Shell Scripting, Spark, Oracle, PL/SQL

Scala Developer

2020 - 2022
Freelance
  • Developed the back end module of a data science platform using CQRS and Event Sourcing patterns.
  • Implemented end-to-end testing for various platform components and endpoints.
  • Built a highly configurable data generator to handle and generate large volumes of data needed for platform performance testing.
Technologies: Scala, Apache Spark, Docker, Python, SQL, ETL, Akka, PostgreSQL, Bash, Unix, Microservices, CQRS, Event Sourcing, Tox, E2E Testing, Unit Testing, Integration Testing, Jupyter Notebook, Shapeless, Monix, Web Services, Swagger, JDBC, JSON, APIs

Senior Big Data Developer

2016 - 2022
UniCredit
  • Developed Scala and Apache Spark data processing and ETL pipelines in a big data environment based on a Cloudera Hadoop distribution.
  • Migrated an existing data quality solution from Teradata to big data technologies (Scala and Apache Spark).
  • Extended an existing complex big data application consisting of more than 130 sources used by 30 different flows, more than a thousand KPIs, and 10TB of output data.
  • Developed a machine learning and artificial intelligence MVP.
  • Moved existing big data applications to newer Apache Spark versions from 1.6 to 2.4.
  • Developed a network infrastructure monitoring application based on the Neo4j graph database.
  • Developed a data ingestion framework based on Apache Sqoop and Oracle.
Technologies: Scala, Apache Spark, Big Data, Oracle, Hadoop, Apache Hive, Impala, Apache Kafka, Apache Sqoop, HBase, Python, Machine Learning, Unix, Bash, Shell Scripting, Solr, Web Services, Teradata, Datastage, Data Quality, Docker, ETL, Data Transformation, Data Engineering, Data Pipelines, Pipelines, Neo4j, GraphDB, Cypher, Spark, Play, PL/SQL

Senior Business Intelligence Developer

2015 - 2016
1&1 Internet
  • Extended the existing enterprise data warehouse system built in Sybase IQ (used for the operational data store, the ETL process, and data marts) and MicroStrategy (used as the front-end solution).
  • Migrated an enterprise data warehouse solution from a Sybase IQ columnar database to Microsoft SQL Server.
  • Moved various projects from Oracle to Microsoft SQL Server.
Technologies: SQL, ETL, Microsoft SQL Server, MicroStrategy

Head of IT Reporting and MS Development

2014 - 2015
UNIQA Asigurari
  • Designed and implemented a complete data warehouse system in a financial environment using Microsoft technologies.
  • Built a self-reporting platform based on OLAP cubes and web reports.
  • Developed a new web solution for all internal client requests, migrating and integrating the existing application into the new, more user-friendly tool.
  • Coordinated a small team of SQL, BI, and ASP.NET developers through all software development processes.
Technologies: Microsoft SQL Server, Data Warehousing, ETL, C#.NET, ASP.NET, Business Intelligence (BI), OLAP, SQL, SQL Server Reporting Services (SSRS), SQL Stored Procedures, T-SQL (Transact-SQL), SQL Views, Model View Controller (MVC), Entity Framework, LINQ, IT Project Management, Coaching, OLTP

Head of IT Reporting Service

2011 - 2014
UNIQA Asigurari
  • Coordinated activities of the IT Reporting Service, including analysis of requirements, development, documentation, acceptance, and automation of reports.
  • Designed, implemented, and maintained the reporting system based on Microsoft technologies.
  • Administered the Microsoft SQL Server database instances, monitoring backups and logs and implementing log shipping, database optimization and performance tuning, and an automatic re-indexing strategy.
  • Migrated database instances from Microsoft SQL Server 2005 to Microsoft SQL Server 2012.
  • Moved the reporting web platform from SQL Server Reporting Services 2005 to SQL Server Reporting Services 2012.
  • Developed, modeled, and implemented data warehouses and star schema data marts.
  • Implemented an IBM Db2 to Microsoft SQL Server 2012 solution offering near real-time incremental replication.
  • Created POCs for different BI tools like QlikView and Tableau.
  • Organized the testing and user acceptance process in implementing new products within the company's core system.
  • Built Windows and web applications in C# and ASP.NET MVC.
Technologies: Microsoft SQL Server, Data Warehousing, SQL Server Reporting Services (SSRS), SQL Server Analysis Services (SSAS), T-SQL (Transact-SQL), ETL, Database Administration (DBA), Database Replication, OLAP, Data Marts, Star Schema, SSAS Tabular, Data Analytics

Software Developer

2005 - 2011
UNIQA Asigurari
  • Designed, developed, and maintained various reports in Microsoft SQL Server.
  • Administered Microsoft SQL Server database instances, including backups, restores, log shipping, and database server performance monitoring.
  • Developed an ETL process to migrate the company's entire core system transaction database from Microsoft SQL Server to a new core system based on IBM Db2.
  • Built a set of complex end-of-month reports that became the official data source for actuarial, controlling, and regulatory reporting.
  • Devised and implemented a web reporting platform based on SQL Server Reporting Services.
  • Created various OLAP cubes based on SQL Server Analysis Services.
  • Implemented automatic regulatory reporting via Web Services.
  • Conceived and implemented ETL flows for automatically importing data from different sources, such as the Multicash app, to the core system.
Technologies: Microsoft SQL Server, SQL, ETL, SQL Stored Procedures, Reports, BI Reports, SQL Server Reporting Services (SSRS), SQL Server Analysis Services (SSAS), Web Services, SQL Performance, Database Administration (DBA), OLAP, IBM Db2, Data Analytics

Data Processing Pipelines

Within a big data environment based on a Cloudera Hadoop distribution, I built various Scala and Apache Spark pipelines to process large amounts of data from multiple sources. The pipelines were interdependent, scheduled to run with individual calendars, and used different strategies for source data loading.
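The "different strategies for source data loading" mentioned above (full snapshot vs. incremental) can be sketched as a small strategy pattern. This is an illustrative sketch in plain Scala, not the actual pipeline code: in the real pipelines this logic would operate on Spark DataFrames, and all names here are hypothetical.

```scala
// Hypothetical sketch of pluggable source-load strategies.
// Plain Scala sequences stand in for Spark DataFrames.
case class Row(id: Int, updatedAt: Long)

sealed trait LoadStrategy {
  def load(source: Seq[Row], target: Seq[Row]): Seq[Row]
}

// Full load: the target is replaced with the current source snapshot.
object FullLoad extends LoadStrategy {
  def load(source: Seq[Row], target: Seq[Row]): Seq[Row] = source
}

// Incremental load: only rows newer than the target's
// high-water mark are appended to the target.
object IncrementalLoad extends LoadStrategy {
  def load(source: Seq[Row], target: Seq[Row]): Seq[Row] = {
    val watermark =
      if (target.isEmpty) Long.MinValue else target.map(_.updatedAt).max
    target ++ source.filter(_.updatedAt > watermark)
  }
}
```

Keeping the strategy behind a common trait lets each pipeline pick its loading mode per source without changing the surrounding scheduling code.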

Data Science Platform

I developed a Scala module extension for a data science platform, based on the CQRS and Event Sourcing patterns; it became very successful and is currently used by many banks. The platform's repeatability features enable users to go back in time and reproduce previous results.
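The repeatability described above is the core property of Event Sourcing: state is never stored directly but rebuilt by replaying an append-only event log, so replaying a prefix of the log reproduces any earlier state. A minimal sketch in plain Scala (all names are hypothetical, not the platform's actual API):

```scala
// Hypothetical Event Sourcing sketch: state is a fold over the event log.
sealed trait Event
case class Deposited(amount: BigDecimal) extends Event
case class Withdrawn(amount: BigDecimal) extends Event

case class Account(balance: BigDecimal) {
  // Apply a single event to produce the next state.
  def applyEvent(e: Event): Account = e match {
    case Deposited(a) => copy(balance = balance + a)
    case Withdrawn(a) => copy(balance = balance - a)
  }
}

object EventLog {
  // Replay the first `upTo` events to reconstruct the state
  // as it was at that point in time.
  def replay(events: Seq[Event], upTo: Int): Account =
    events.take(upTo).foldLeft(Account(0))(_.applyEvent(_))
}
```

Because events are immutable facts, "going back in time" is simply replaying fewer of them; the same idea underpins the CQRS read-side projections.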

Apache Hive and Impala Optimization

I optimized several large tables managed in Apache Hive and Impala, with over 300GB of new data added daily. These tables were causing performance problems, Hive metastore crashes, and instability across the entire reporting platform.

Data Warehouse for UNIQA Asigurari

I designed and implemented a data warehouse system in Microsoft SQL Server for transactions, sales, and claims analysis related to the insurance business. The system introduced self-reporting capabilities for company business users, who previously depended on technical users to gather and extract valuable data insights.

Project details:
• Microsoft SQL Server for the data warehouse and back-end part
• OLAP cubes accessed via Excel for analytics
• SQL Server Reporting Services for BI and the front-end part

Web Application for UNIQA Asigurari

I built a new, integrated web solution based on the needs and requirements of different corporate departments, covering sales partner and broker management, time tracking, and the integration of direct debit files received from banks.

I mainly used Microsoft technologies for this project: ASP.NET for the front-end interface, Microsoft SQL Server for the database, and Active Directory for authentication and authorization. Each user was granted access only to specific modules.

Languages

SQL, Scala, T-SQL (Transact-SQL), Python, C#.NET, C#, JavaScript, Stored Procedure, Bash, Cypher

Frameworks

Apache Spark, Spark, ASP.NET, Hadoop, Akka, .NET, ASP.NET MVC, Bootstrap, Swagger, Play

Paradigms

ETL, Business Intelligence (BI), OLAP, CQRS, Event Sourcing, Microservices, E2E Testing, Unit Testing, Model View Controller (MVC)

Tools

Impala, Git, IntelliJ IDEA, Apache Sqoop, Cloudera, Solr, Flink

Platforms

Docker, Linux, Oracle, Apache Kafka, Unix, Jupyter Notebook, Nexus, Kubernetes, Amazon Web Services (AWS)

Storage

Databases, Microsoft SQL Server, Apache Hive, Data Pipelines, SSAS Tabular, HBase, PostgreSQL, SQL Stored Procedures, SQL Views, DB2/400, SQL Functions, SQL Server Reporting Services (SSRS), Teradata, Datastage, SQL Server Analysis Services (SSAS), Database Administration (DBA), Database Replication, SQL Performance, IBM Db2, Neo4j, JSON, PL/SQL, OLTP

Other

Software Development, Programming, Data Engineering, APIs, Algorithms, Software Deployment, IT Project Management, Data Warehousing, MicroStrategy, Big Data, Machine Learning, ELT, Data Modeling, Triggers, Shell Scripting, Tox, Integration Testing, Shapeless, Web Services, Data Quality, Coaching, Data Marts, Star Schema, Reports, BI Reports, Data Transformation, Pipelines, GraphDB, Data Analytics

Libraries/APIs

Entity Framework, LINQ, Monix, JDBC

2011 - 2013

Master's Degree in Project Management Applied to Computer Science

Faculty of Cybernetics, Statistics, and Informatics; Bucharest University of Economic Studies - Bucharest, Romania

1998 - 2003

Bachelor's Degree in Computer Science

Faculty of Cybernetics, Statistics, and Informatics; Bucharest University of Economic Studies - Bucharest, Romania

1997 - 2002

Bachelor's Degree in Electronics and Telecommunications

Politehnica University of Bucharest - Bucharest, Romania
