Marius Condescu, Developer in Hamburg, Germany

Marius Condescu

Verified Expert in Engineering

Bio

Marius has over a decade of experience across IT domains, ranging from scripting in Shell and Perl to building full .NET desktop applications backed by MySQL and SQL Server databases. He is currently a cloud data engineer working with Databricks and Azure, using SQL, Spark, Azure Data Factory, and other Azure data technologies.

Portfolio

Beiersdorf
Apache Spark, Azure, Azure Data Factory (ADF), Azure Data Lake, Databricks, ETL...
Finastra
Databricks, Spark, Scala, Python 3, SQL, Azure Synapse, Azure Data Lake...
Endava
Microsoft Power BI, SQL Server Reporting Services (SSRS), SSAS, C#...

Experience

  • SQL - 8 years
  • ETL Implementation & Design - 5 years
  • Agile - 3 years
  • Scrum - 3 years
  • SSIS Custom Components - 2 years
  • Python - 2 years
  • PySpark - 1 year
  • Apache Spark - 1 year

Availability

Part-time

Preferred Environment

Git, Azure, Azure Data Factory (ADF), Azure Data Lake, Azure Synapse, Apache Spark, Databricks, Data Engineering, Scala, Python

The most amazing...

...thing I've made is an Azure cloud data solution that helps the business track sales trends and ad campaign performance.

Work Experience

Senior Data Engineer

2021 - PRESENT
Beiersdorf
  • Built a data solution with Azure technologies for users to track and analyze ad campaign performance and sales information.
  • Worked with Jira, Azure DevOps, and Terraform to build and manage projects.
  • Led developers as a technical lead to deliver high-quality products.
Technologies: Apache Spark, Azure, Azure Data Factory (ADF), Azure Data Lake, Databricks, ETL, Data Warehousing, Data Warehouse Design, Cloud, Windows, SQL, Agile, Scrum, Visual Studio 2016, Visual Studio, SQL Server Management Studio (SSMS), ETL Implementation & Design, PySpark, Warehouses, Data Pipelines, Data Modeling, Microsoft Excel, T-SQL (Transact-SQL)
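The campaign and sales tracking described above can be illustrated with a toy aggregation. This is a plain-Python sketch standing in for the actual PySpark/ADF pipeline, and the column names (campaign, spend, sales) are hypothetical:

```python
from collections import defaultdict

# Toy stand-in for the Spark aggregation step: roll up ad-campaign
# rows into per-campaign spend and attributed sales totals.
# Column names are illustrative, not from the real solution.
def summarize_campaigns(rows):
    totals = defaultdict(lambda: {"spend": 0.0, "sales": 0.0})
    for row in rows:
        bucket = totals[row["campaign"]]
        bucket["spend"] += row["spend"]
        bucket["sales"] += row["sales"]
    # Return sales per unit of ad spend (a simple ROI-style metric).
    return {
        name: round(t["sales"] / t["spend"], 2) if t["spend"] else 0.0
        for name, t in totals.items()
    }

rows = [
    {"campaign": "spring", "spend": 100.0, "sales": 250.0},
    {"campaign": "spring", "spend": 50.0, "sales": 100.0},
    {"campaign": "summer", "spend": 80.0, "sales": 80.0},
]
print(summarize_campaigns(rows))  # {'spring': 2.33, 'summer': 1.0}
```

In the real solution this kind of roll-up would run as a Spark job on Databricks, with Azure Data Factory scheduling it over data landed in the data lake.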

Data Engineer

2021 - 2021
Finastra
  • Developed a data pipeline that gave the business visibility into mainframe system data, using Databricks, Azure Data Factory, and Azure Synapse.
  • Wrote data transformations in Scala Spark using a Databricks framework.
  • Orchestrated execution and dependencies with Azure Data Factory.
Technologies: Databricks, Spark, Scala, Python 3, SQL, Azure Synapse, Azure Data Lake, Azure Data Factory (ADF), ETL, Data Warehousing, Azure, Data Warehouse Design, Cloud, Windows, Agile, Scrum, Visual Studio 2016, Visual Studio, SQL Server Management Studio (SSMS), ETL Implementation & Design, PySpark, Data Pipelines, Data Modeling, Microsoft Excel, T-SQL (Transact-SQL)
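Dependency-driven orchestration of the kind Azure Data Factory performs can be sketched as a topological ordering of pipeline activities. The activity names below are hypothetical, and this stdlib-Python sketch only mirrors the ordering logic, not ADF itself:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline activities and their upstream dependencies,
# mirroring how ADF starts an activity only once its predecessors
# have succeeded.
dependencies = {
    "ingest_mainframe": set(),
    "transform_databricks": {"ingest_mainframe"},
    "load_synapse": {"transform_databricks"},
    "refresh_reports": {"load_synapse"},
}

order = list(TopologicalSorter(dependencies).static_order())
print(order)
# ['ingest_mainframe', 'transform_databricks', 'load_synapse', 'refresh_reports']
```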

Senior Data Engineer

2018 - 2021
Endava
  • Developed an SSIS framework to support the ETL process.
  • Implemented SQL queries, including dynamic SQL and aggregations, to support the ETL process.
  • Built SSIS custom components using SSIS and C# to enhance functionality.
Technologies: Microsoft Power BI, SQL Server Reporting Services (SSRS), SSAS, C#, SQL Server Integration Services (SSIS), Microsoft SQL Server, ETL, Data Warehousing, Data Warehouse Design, Windows, SQL, Agile, Scrum, Visual Studio 2016, Visual Studio, SQL Server Management Studio (SSMS), ETL Implementation & Design, PySpark, Warehouses, Data Pipelines, Data Modeling, Microsoft Excel, T-SQL (Transact-SQL)

BI Developer

2017 - 2018
EXE Software SRL
  • Added automation, logging, and auditing to an existing ETL solution.
  • Created a BI solution using Excel files as a source; the client had been storing and analyzing data in Excel files and quickly outgrew Excel's capabilities.
  • Developed custom SSIS components in C# to extend SSIS's standard capabilities.
  • Built a cloud BI solution that lets users map their Excel files to the relevant columns and uses that mapping to load any source file, regardless of column order and names.
  • Designed a data warehouse database for storing large volumes of data extracted from source systems, using partitioning and a columnstore index.
Technologies: Microsoft Power BI, SQL Server Reporting Services (SSRS), SSAS, C#, SQL Server Integration Services (SSIS), Microsoft SQL Server, ETL, Data Warehousing, Data Warehouse Design, Windows, SQL, Visual Studio 2016, Visual Studio, SQL Server Management Studio (SSMS), ETL Implementation & Design, SSIS Custom Components, TFS, Warehouses, Data Pipelines, Data Modeling, Microsoft Excel, T-SQL (Transact-SQL)
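The column-mapping idea behind the cloud BI solution above, loading source files regardless of column order and names, can be sketched as a simple header remap. All column names here are hypothetical:

```python
# User-defined mapping from canonical warehouse columns to whatever
# headers appear in a given client's Excel export (names are
# illustrative, not from the real solution).
mapping = {"order_id": "Order No.", "amount": "Total (EUR)"}

def remap_row(raw_row, mapping):
    """Rename a source row's columns to canonical names, ignoring
    header order and any extra columns."""
    return {canon: raw_row[src] for canon, src in mapping.items()}

raw = {"Total (EUR)": 19.99, "Order No.": "A-17", "Comment": "rush"}
print(remap_row(raw, mapping))  # {'order_id': 'A-17', 'amount': 19.99}
```

Once the mapping is stored per client, the same loader can ingest any of their files without code changes; in the SSIS version this logic would live in a custom component rather than Python.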

BI Developer

2016 - 2017
Elefant SA
  • Developed a business intelligence (BI) solution: an ETL process using SQL Server stored procedures and SSIS, plus a multidimensional SSAS cube.
  • Built OLTP reports; source data was loaded from Excel files into staging tables, transformed, matched against MySQL database information, and loaded into a SQL Server database, with the report delivered as an ASP.NET page built on the DevExpress framework.
  • Wrote warehouse workload reports based on SQL Server data, with bar charts visualizing the workload in the company's warehouse.
  • Created an automated system to download Google Analytics data about the company's website and load it into a SQL Server database.
  • Wrote Power BI reports covering the company's sales and stock information.
Technologies: DevExpress, ASP.NET, SSAS, SQL Server Integration Services (SSIS), Microsoft SQL Server, ETL, Data Warehousing, Data Warehouse Design, Windows, SQL, Visual Studio, SQL Server Management Studio (SSMS), ETL Implementation & Design, SSIS Custom Components, TFS, Warehouses, Data Pipelines, Microsoft Excel, T-SQL (Transact-SQL)

SQL Developer

2015 - 2016
Freelance Work
  • Created and maintained the application's database.
  • Implemented database and application changes as business requirements evolved.
  • Developed the desktop application interface using C# and DevExpress.
  • Created the SQL stored procedures to support the application's ORM.
Technologies: DevExpress, C#, Microsoft SQL Server, Windows, SQL, Visual Studio 2016, Visual Studio, SQL Server Management Studio (SSMS), ETL Implementation & Design, SSIS Custom Components, Warehouses, Data Pipelines, Microsoft Excel, T-SQL (Transact-SQL)

Application Support

2012 - 2013
Cargus International SRL
  • Developed intranet sales reports, using MySQL and Perl with JavaScript.
  • Maintained and developed the local MySQL database.
  • Automated the mail distribution of daily/weekly/monthly reports.
  • Helped migrate the intranet to a new CentOS server.
Technologies: MySQL, JavaScript, PHP, Perl, Windows, Linux, Bash, Microsoft Excel, T-SQL (Transact-SQL)

Education

2018 - 2019

Master's Degree in Computer Programming

Spiru Haret University - Bucharest, Romania

2001 - 2005

Specialization in Computer Science (Programming)

IC Vissarion High School - Titu, Romania

Certifications

SEPTEMBER 2018 - PRESENT

MCSA: SQL 2016 Business Intelligence Development - Certified 2018

Microsoft

SEPTEMBER 2018 - PRESENT

70-768: Developing SQL Data Models

Microsoft

JULY 2018 - PRESENT

70-767: Implementing a Data Warehouse

Microsoft

NOVEMBER 2017 - PRESENT

70-461: Querying Microsoft SQL Server 2012/2014

Microsoft

JULY 2017 - PRESENT

70-463: Implementing a Data Warehouse with Microsoft SQL Server 2012/2014

Microsoft

Skills

Libraries/APIs

PySpark

Tools

Microsoft Excel, Visual Studio, Git, Microsoft Power BI, TFS, SSAS

Languages

T-SQL (Transact-SQL), SQL, PHP, JavaScript, C#, Perl, Bash, Scala, Python 3, Python

Paradigms

ETL, ETL Implementation & Design, Agile, Scrum

Storage

Microsoft SQL Server, SQL Server Integration Services (SSIS), SQL Server Data Tools (SSDT), Data Pipelines, SQL Server Management Studio (SSMS), SQL Server Reporting Services (SSRS), MySQL, PostgreSQL, Oracle RDBMS

Platforms

Windows, Visual Studio 2016, Azure, Linux, macOS, Oracle, Databricks, Azure Synapse

Frameworks

ASP.NET, Spark, Apache Spark

Other

Data Warehousing, Cloud, Warehouses, Data Modeling, SSIS Custom Components, Data Warehouse Design, DevExpress, Azure Data Lake, Azure Data Factory (ADF), Data Engineering
