Laurentiu Diaconu, Developer in Bucharest, Romania

Laurentiu Diaconu

Verified Expert in Engineering

Data Architect and Machine Learning Developer

Location
Bucharest, Romania
Toptal Member Since
October 1, 2020

Laurentiu is passionate about bringing people together through data, technology, and shared experiences to create better outcomes, revenue growth, and operational efficiency. He is the VP of Data Engineering for Verumex, building the future of commercial real estate data analytics. Previously, as the CTO of Matchpoint, Laurentiu used AI to disrupt tennis training. He has also worked with global banks and insurers as the leader of the advanced analytics practice at Softelligence.

Portfolio

Verumex, Inc
Snowflake, Python, SQL, Microsoft Power BI, Visio, Terraform, Data Vaults...
Verumex, Inc
Snowflake, Python, SQL, Microsoft Power BI, Visio, Data Vaults...
Verumex, Inc
Snowflake, SQL, Python, Amazon Web Services (AWS), Microsoft Power BI...

Experience

Availability

Part-time

Preferred Environment

Microsoft Power BI, SQL Server Management Studio (SSMS), Microsoft Visio, Linux, Windows, MacOS, Visual Studio Code (VS Code), DataGrip, Poetry, Snowflake

The most amazing...

...experience was creating a Data Center of Excellence from the ground up in 2021 with 12 very talented engineers to innovate commercial real estate analytics.

Work Experience

VP Data Engineering

2021 - PRESENT
Verumex, Inc
  • Implemented next-generation, declarative continuous integration and delivery for data warehouses, using Terraform to manage more than 800 artifacts such as tables, views, stored procedures, streams, functions, tasks, stages, and privileges.
  • Developed an enterprise data model to manage data generated from multiple microservices that efficiently assess data quality in the data warehouse for multiple intra-day downstream analyses.
  • Implemented a strategy of using Power BI data models as a common semantic layer for data quality analysis and for serving dashboards across multiple customer endpoints.
  • Designed and implemented a framework to generate dynamic SQL scripts based on conventions and lineage, executed in parallel to ELT data from more than 60 sources into more than 200 Data Vault 2.0-compliant destinations (see the sketch below).
  • Designed a microservice architecture to dynamically execute transformation steps on source data from a growing array of customers (2+) and source data providers (80+) using dbt and Snowflake.
  • Managed and grew multiple engineering teams and professionals: Data Engineering with 3+ Senior Data Engineers, Data Visualization with 2+ Data Visualization Developers, Data Integration with 2+ Data Integration Engineers, and Quality Assurance with 1+ Quality Assurance Engineers.
  • Ran 10+ data architecture workshops from business design to physical design, involving 20+ people in multiple domains, such as leasing, acquiring properties, disposing of properties, modeling investment returns, sourcing leverage, and managing debt.
  • Managed the data platform's enterprise data architecture, consisting of an enterprise data warehouse (layered, data vault, and Kimball dimensional models), an ELT generator, multiple executors, and a dynamic transformation service, leveraging dbt.
Technologies: Snowflake, Python, SQL, Microsoft Power BI, Terraform, Data Vaults, Kimball Methodology, Data Warehouse Design, Amazon Web Services (AWS), Azure, Microsoft Visio, Miro, Microservices, Data Quality Management
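
The sketch below illustrates the kind of convention-driven SQL generation referenced in this role: given a source table, a destination hub, and a business key, it renders a Data Vault 2.0-style hub load. The table names, MD5 hashing choice, and the generate_hub_load helper are illustrative assumptions, not the production framework.

```python
# Illustrative, convention-driven generation of a Data Vault 2.0 hub-load
# statement. Table names, column conventions, and the MD5 hashing choice are
# assumptions for demonstration; the production framework differs.
from textwrap import dedent


def generate_hub_load(source_table: str, hub_table: str, business_key: str) -> str:
    """Render a hub-load statement from naming conventions and lineage metadata."""
    hash_key = f"hk_{hub_table}"
    return dedent(f"""
        INSERT INTO {hub_table} ({hash_key}, {business_key}, load_dts, record_source)
        SELECT MD5(UPPER(TRIM(src.{business_key}))),
               src.{business_key},
               CURRENT_TIMESTAMP(),
               '{source_table}'
        FROM {source_table} src
        LEFT JOIN {hub_table} hub
               ON hub.{hash_key} = MD5(UPPER(TRIM(src.{business_key})))
        WHERE hub.{hash_key} IS NULL;
    """).strip()


if __name__ == "__main__":
    # One statement per (source, destination) pair, derived from lineage
    # metadata and executed in parallel against the warehouse.
    print(generate_hub_load("stg_source__property", "hub_property", "property_code"))
```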

Data Architect | Technical Lead

2021 - 2021
Verumex, Inc
  • Developed data vault and dimensional data models to store, standardize, and analyze commercial real estate data for subjects like property acquisitions, financial modeling, and accounting, covering more than 900 properties (see the hash-diff sketch below).
  • Directly managed a team of three senior data engineers from the perspective of goal setting, performance appraisal, and code reviews.
  • Oversaw the implementation of continuous integration and delivery using Sqitch for more than 600 data warehouse artifacts such as tables, views, stored procedures, streams, and functions.
Technologies: Snowflake, Python, SQL, Microsoft Power BI, Data Vaults, Kimball Methodology, Data Warehouse Design, Amazon Web Services (AWS), Azure, Microsoft Visio
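
A minimal sketch of the hash-diff technique commonly used in Data Vault 2.0 satellite loads like the modeling work described above; the column names, delimiter, and hash_diff helper are hypothetical, not the project's actual code.

```python
# Hedged sketch of a Data Vault 2.0-style hash diff used to detect attribute
# changes before inserting a new satellite row; names are illustrative only.
import hashlib


def hash_diff(record: dict, descriptive_columns: list[str], delimiter: str = "||") -> str:
    """Normalize and concatenate descriptive attributes, then hash them, so a
    satellite load inserts a row only when the hash differs from the latest one."""
    normalized = [str(record.get(col, "")).strip().upper() for col in descriptive_columns]
    return hashlib.md5(delimiter.join(normalized).encode("utf-8")).hexdigest()


if __name__ == "__main__":
    row = {"property_name": "Main Street Plaza", "city": "Bucharest", "sq_ft": 125000}
    print(hash_diff(row, ["property_name", "city", "sq_ft"]))
```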

Senior Data Engineer

2020 - 2021
Verumex, Inc
  • Developed a dimensional data model to analyze data related to leasing and charging tenants in commercial real estate properties from more than 21 source systems.
  • Developed ingestion pipelines and steps to acquire, transform, and store data from over 50 source systems (see the ingestion sketch below).
  • Developed datasets and dashboards in Power BI to present standardized rent roll reports for more than 80 properties.
Technologies: Snowflake, SQL, Python, Amazon Web Services (AWS), Microsoft Power BI, Microsoft Visio, Kimball Methodology, Data Vaults
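
The snippet below is a minimal sketch of one acquire-transform-store ingestion step of the kind listed above; the rent-roll file layout, the snake_case normalization, and the Parquet staging location are assumptions for illustration, not the production pipeline.

```python
# Minimal acquire-transform-store step; file layout, column normalization, and
# staging location are illustrative assumptions, not the production pipeline.
from pathlib import Path

import pandas as pd


def ingest_rent_roll(raw_path: Path, staging_dir: Path) -> Path:
    """Read one source extract, normalize it to a common schema, and land it as Parquet."""
    df = pd.read_csv(raw_path)
    # Standardize source-specific headers to snake_case names shared by all sources.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df["ingested_at"] = pd.Timestamp.now(tz="UTC")

    staging_dir.mkdir(parents=True, exist_ok=True)
    out_path = staging_dir / f"{raw_path.stem}.parquet"
    df.to_parquet(out_path, index=False)  # ready to COPY INTO a Snowflake stage
    return out_path
```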

CTO

2019 - 2021
Matchpoint. (innAIte technologies)
  • Developed a state-of-the-art object detection model and greatly optimized it for mobile inference on iOS, resulting in a mobile app running inference at 60 FPS (see the export sketch below).
  • Created a mobile application that uses near-real-time insights from an optimized object detection model on the edge to determine tennis player placement and stroke patterns, used to extract valuable tennis training insights.
  • Developed a mobile application that uses near-real-time insights from an optimized object detection model on the edge to determine movement patterns of up to 200 instances of 80 different object types in high-traffic areas.
Technologies: Mobile App Development, Swift, iOS, Artificial Intelligence (AI), Python, Computer Vision, PyTorch
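
The sketch below shows a typical PyTorch-to-CoreML export path consistent with the on-device inference work described above; the model, 640x640 input size, 1/255 scaling, and the export_to_coreml helper are illustrative assumptions rather than the project's exact code.

```python
# Hedged sketch of exporting a PyTorch detector to CoreML for on-device
# inference; the model, 640x640 input, and 1/255 scaling are assumptions.
import coremltools as ct
import torch


def export_to_coreml(model: torch.nn.Module, input_size: int = 640) -> ct.models.MLModel:
    """Trace the detector and convert it so iOS can run it through CoreML/Vision."""
    model.eval()
    example = torch.zeros(1, 3, input_size, input_size)
    traced = torch.jit.trace(model, example)
    return ct.convert(
        traced,
        inputs=[ct.ImageType(name="image", shape=tuple(example.shape), scale=1 / 255.0)],
    )


# Example usage (assuming a loaded detector module):
# mlmodel = export_to_coreml(detector)
# mlmodel.save("Detector.mlmodel")
```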

Team Lead, Advanced Analytics

2019 - 2020
Softelligence
  • Led the team that developed an ML solution to greatly optimize the referral process for a leading London market insurance organization, saving up to 900,000 minutes of operational time and over 200,000 GBP in salary costs.
  • Designed a big data analytics solution around Apache Spark for a leading London market insurance organization that transformed how claims are managed and analyzed; the solution handles over 500 GB of data and produces near-real-time analytics.
  • Created an ML solution that enables insurance organizations to detect fraudulent claims early in the process for lines of business like health, property, and motor, making claims processing 28% faster because handlers no longer spend time screening for fraud manually (see the classifier sketch below).
  • Provided consulting to one of the leading leasing companies on the Romanian market, envisioning an optimized loan application process that uses ML to produce a competitive offer for car products 80% faster.
Technologies: Big Data Architecture, Big Data, ELT, Data Lake Design, Data Warehouse Design, Scikit-learn, Pandas, NumPy, SQL, Data Lakes, Microsoft Power BI, Delta Lake, Databricks, Apache Spark, Azure, Data Engineering, Database Architecture, Artificial Intelligence (AI), Machine Learning, R, Python
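
A simplified sketch of the kind of early fraud-triage classifier described above, using gradient boosted trees on synthetic data; the features, class balance, and scoring flow are stand-ins for illustration, not client material.

```python
# Simplified fraud-triage classifier on synthetic claims data; features and
# class balance are made up for illustration, not drawn from client data.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic claims: a handful of numeric features with a rare "fraud" label.
X, y = make_classification(n_samples=5000, n_features=12, weights=[0.95], random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=42)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

# High-scoring claims are routed for early review, so the remaining claims
# skip the manual fraud check and settle faster.
scores = model.predict_proba(X_test)[:, 1]
print(f"ROC AUC: {roc_auc_score(y_test, scores):.3f}")
```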

Data Enablement Consultant

2016 - 2019
Softelligence
  • Developed an IFRS9 reporting solution for a leading unsecured loan company in the UK that uses advanced statistical analysis to estimate losses for over 7 million contracts across five markets each week (see the expected-credit-loss sketch below).
  • Acted as team lead and data architect for a statutory reporting solution for a leading unsecured loan company in Romania that uses advanced statistical analysis to estimate losses for over 300,000 contracts every week.
  • Led the development of an ISDA management and reporting solution for a leading Canadian and global investment bank that uses statistical analysis to estimate the optimal price at which to trade complex investment vehicles for over 5,000 clients.
  • Delivered three hands-on ML workshops for the Bucharest community, with over 50 attendees per session, covering applied use cases for banks and insurance companies.
Technologies: Loss Modeling, Workshop Facilitation, Dimensional Modeling, ETL, Data Warehouse Design, Machine Learning, Database Design, Data Analytics, Microsoft SQL Server, SQL, Python
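
A worked sketch of the expected-credit-loss arithmetic at the core of an IFRS9 estimate (ECL = PD x LGD x EAD); the contract figures below are invented for illustration only.

```python
# Worked example of expected credit loss, ECL = PD * LGD * EAD, per contract;
# all figures are invented for illustration.
import pandas as pd

contracts = pd.DataFrame(
    {
        "contract_id": ["C-001", "C-002", "C-003"],
        "pd_12m": [0.02, 0.15, 0.45],   # 12-month probability of default
        "lgd": [0.60, 0.55, 0.70],      # loss given default
        "ead": [1200.0, 800.0, 450.0],  # exposure at default
    }
)

contracts["ecl"] = contracts["pd_12m"] * contracts["lgd"] * contracts["ead"]
print(contracts[["contract_id", "ecl"]])
print(f"Portfolio ECL: {contracts['ecl'].sum():.2f}")
```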

YOLOV5 Network Decoder Optimized for CoreML

https://1drv.ms/u/s!Ahvhw3awWX1ngo10DO_s9fdUimMTCg?e=AevyrA
A Python solution for CoreML that consumes the raw predictions of a trained YOLOv5 model (https://github.com/ultralytics/yolov5) converted to CoreML format and efficiently decodes them, including non-maximum suppression (NMS).

The solution is fully compatible with the iOS Vision framework, so models converted this way can be swapped in and out of a Swift mobile application with drag-and-drop ease.
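
The NumPy sketch below illustrates the post-processing the decoder performs on raw predictions: greedy non-maximum suppression over score-sorted boxes. The box format, thresholds, and nms helper are simplified stand-ins; the linked project targets CoreML and the Vision framework rather than NumPy.

```python
# Simplified greedy NMS over [x1, y1, x2, y2] boxes; a stand-in for the
# CoreML-side decoding in the linked project, not its actual implementation.
import numpy as np


def nms(boxes: np.ndarray, scores: np.ndarray, iou_threshold: float = 0.45) -> list[int]:
    """Return indices of boxes kept after greedy non-maximum suppression."""
    order = scores.argsort()[::-1]  # highest-confidence boxes first
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        if order.size == 1:
            break
        rest = order[1:]
        # Intersection-over-union between the best box and the remaining boxes.
        x1 = np.maximum(boxes[i, 0], boxes[rest, 0])
        y1 = np.maximum(boxes[i, 1], boxes[rest, 1])
        x2 = np.minimum(boxes[i, 2], boxes[rest, 2])
        y2 = np.minimum(boxes[i, 3], boxes[rest, 3])
        inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        area_r = (boxes[rest, 2] - boxes[rest, 0]) * (boxes[rest, 3] - boxes[rest, 1])
        iou = inter / (area_i + area_r - inter)
        order = rest[iou <= iou_threshold]  # drop boxes overlapping the kept one
    return keep
```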

Matchpoint.

A Swift-based application that uses highly efficient CoreML computer vision models (converted from PyTorch) to track players and tennis balls while they practice wall tennis drills at home. The application lets users start new at-home tennis training challenges and records their best results on a worldwide leaderboard.

Languages

Python, SQL, R, Swift, Visual Basic .NET (VB.NET), Snowflake

Tools

Microsoft Visio, Microsoft Power BI, PyCharm, TensorBoard, Azure Machine Learning, Tableau, Spark SQL, DataGrip, Terraform, Miro

Paradigms

Data Science, Database Design, Siamese Neural Networks, ETL, Dimensional Modeling, ETL Implementation & Design, Kimball Methodology, Microservices

Platforms

Windows, Visual Studio Code (VS Code), Linux, Jupyter Notebook, Azure, Databricks, iOS, MacOS, Amazon Web Services (AWS)

Storage

SQL Server Management Studio (SSMS), Relational Databases, Database Architecture, Microsoft SQL Server, Data Lakes, Data Lake Design

Other

Data Mining, Machine Learning, Data Engineering, Neural Networks, Deep Learning, Data Mashup, Time Series Analysis, Data Visualization, Decision Modeling, Decision Support Systems, Statistics, Economics, Artificial Intelligence (AI), Delta Lake, Data Analytics, Convolutional Neural Networks (CNN), Computer Vision, Deep Neural Networks, Hyperparameters, Regularization, Mobile App Development, Metrics, Recurrent Neural Networks (RNNs), Big Data, RevoScaleR, Data Analysis, Exploratory Data Analysis, Bayesian Statistics, Gradient Descent, Gradient Boosted Trees, Analytics, Econometrics, Financial Analysis, Valuation, Optimization, Data Warehouse Design, Workshop Facilitation, Loss Modeling, ELT, Big Data Architecture, ETL Development, Streaming Data, Parquet, Row-Level Security, Administration, Job Schedulers, Poetry, Data Vaults, Data Quality Management

Frameworks

Apache Spark, Hadoop, Core ML, Spark Structured Streaming

Libraries/APIs

TensorFlow, PyTorch, NumPy, Pandas, Scikit-learn, PySpark

Industry Expertise

Accounting

Education

2018 - 2020

Master of Science Degree in Applied Statistics and Data Science

Bucharest University of Economic Studies - Bucharest, Romania

2015 - 2018

Bachelor of Science Degree in Accounting and Management Information Systems

Bucharest University of Economic Studies - Bucharest, Romania

Certifications

NOVEMBER 2020 - PRESENT

ETL Part 3: Production

Databricks

NOVEMBER 2020 - PRESENT

ETL Part 2: Transformations and Loads

Databricks

OCTOBER 2020 - PRESENT

Structured Streaming

Databricks

OCTOBER 2020 - PRESENT

ETL Part 1: Data Extraction

Databricks

MARCH 2020 - PRESENT

Convolutional Neural Networks by deeplearning.ai (Deep Learning Specialization)

Coursera

DECEMBER 2019 - PRESENT

Structuring Machine Learning Projects by deeplearning.ai (Deep Learning Specialization)

Coursera

DECEMBER 2019 - PRESENT

Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization

Coursera

OCTOBER 2019 - PRESENT

Neural Networks and Deep Learning by deeplearning.ai (Deep Learning Specialization)

Coursera

MAY 2019 - PRESENT

MCSA: Machine Learning

Microsoft

APRIL 2019 - PRESENT

Intermediate R

DataCamp
