
Martin Ignacio Alonso

Verified Expert in Engineering

Data Engineer and Developer

Buenos Aires, Argentina

Toptal member since January 6, 2022

Bio

Martin is a data engineer with 15 years of experience in business intelligence and software development. After focusing on SAP business intelligence solutions for seven years at IBM and Capgemini, he transitioned to the Microsoft BI stack at Avaya and a smaller BI services firm. Martin's industry experience is backed by a bachelor's degree in computer science.

Portfolio

Clear-BI Consulting
C#, Microsoft SQL Server, SQL Server Integration Services (SSIS)...
Avaya
SQL Server DBA, Microsoft Power BI, Windows PowerShell, SharePoint 2013...
Capgemini
SAP Business Intelligence (BI), ETL, Business Intelligence (BI)

Experience

  • Microsoft SQL Server - 10 years
  • TFS - 8 years
  • Data Engineering - 7 years
  • SQL Server Analysis Services (SSAS) - 7 years
  • SQL Server Integration Services (SSIS) - 7 years
  • C# - 5 years
  • Einstein Analytics - 4 years
  • SSIS Custom Components - 4 years

Availability

Part-time

Preferred Environment

Windows, SQL Server Integration Services (SSIS), C#, SQL Server Analysis Services (SSAS)

The most amazing...

...thing I've developed is a notification system to integrate asynchronous loads from multiple OLAP sources into my own set of tabular data models.

Work Experience

Data Engineer

2021 - PRESENT
Clear-BI Consulting
  • Improved the performance of ETLs and the orchestration of SSIS packages.
  • Managed a team of SSAS consultants with various levels of expertise.
  • Interacted and developed good relationships with clients to set and manage expectations.
Technologies: C#, Microsoft SQL Server, SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), ETL, Team Management, Data Engineering

Microsoft BI Specialist

2014 - 2021
Avaya
  • Developed and maintained SSIS ETLs to support multiple BI solutions plus interfaces and feeds.
  • Built and maintained Microsoft SQL Server databases, SSAS Tabular cubes, and Power BI and SSRS reports.
  • Configured DevOps processes to develop reporting packages, including SSAS Tabular cubes, SSRS reports, ETLs, and more.
  • Maintained development and production environments and ensured role separation.
  • Developed tools in Python, C#, and JavaScript to support process automation in my area of responsibility.
  • Installed and maintained an on-premise SQL Server 2012, including SSAS, SSIS, SSRS, and DBMS.
  • Installed and maintained an on-premise Windows 2008 R2 Server and Windows Server 2012.
  • Installed and maintained an on-premise, three-tier SharePoint 2013 server, supporting SSRS, PowerPivot, Power View, and Search Services.
  • Installed and maintained an on-premise Team Foundation Server (TFS) 2012.
  • Installed and maintained a Microsoft Power BI gateway to connect cloud and internal servers.
Technologies: SQL Server DBA, Microsoft Power BI, Windows PowerShell, SharePoint 2013, Power Pivot, Einstein Analytics, SQL Server Analysis Services (SSAS), SSIS Custom Components, SQL Server Integration Services (SSIS), C#, ETL, DevOps, Microsoft SQL Server, Business Intelligence (BI), Databases, SSAS Tabular, SSRS Reports, Python, JavaScript, Process Automation, Database Management Systems (DBMS), SQL Server 2012, TFS

Senior SAP BI Consultant

2009 - 2014
Capgemini
  • Participated in multiple end-to-end SAP implementation projects.
  • Served as a team leader for the SAP BI module on multiple projects.
  • Developed multiple ETLs and InfoCubes to meet business demands.
Technologies: SAP Business Intelligence (BI), ETL, Business Intelligence (BI)

SAP BI Consultant

2006 - 2009
IBM
  • Analyzed companies' KPIs from a BI perspective and assisted in creating new ones.
  • Activated business content to accelerate project advancement.
  • Assisted in multiple go-lives for different projects.
Technologies: SAP Business Intelligence (BI), Business Intelligence (BI), Key Performance Indicators (KPIs)

Project Experience

Mobile Data Load Monitoring

A live-connection Power BI report and a series of scripts that grant an Active Directory service account least-privileged access to the Microsoft SQL Server job tables. I also created several database views that join everything together to feed a mobile-friendly view in Power BI. This project allows us to monitor all our loads in real time, organized by priority, without logging into the server.

Multidimensional Batch Loading System

https://github.com/LeMarto/Taiki
The main use case that motivated this library was extracting information from a Microsoft SSAS multidimensional cube on a server configured with very short timeout windows. Since I needed to extract a lot of data, I realized that instead of pulling one big chunk, I had to extract multiple smaller batches. Creating these batches by hand was tedious and error-prone, so I wrote this library to generate the MDX query for each batch automatically. The library also helps with creating single-batch MDX.
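The batching idea can be sketched in a few lines of Python (the actual library is C#; `batch_mdx`, the `[Sales]` cube, and the member names below are illustrative, not part of Taiki's API):

```python
def batch_mdx(measures, members, batch_size):
    """Split a large member list into fixed-size batches and emit one
    MDX query per batch, so each query stays under the server timeout."""
    queries = []
    for i in range(0, len(members), batch_size):
        chunk = members[i:i + batch_size]
        queries.append(
            "SELECT {" + ", ".join(measures) + "} ON COLUMNS, "
            "{" + ", ".join(chunk) + "} ON ROWS "
            "FROM [Sales]"
        )
    return queries

# 10 customer members with a batch size of 4 yield 3 queries.
queries = batch_mdx(
    ["[Measures].[Sales Amount]"],
    [f"[Customer].[Customer].&[{i}]" for i in range(10)],
    4,
)
```

Running the resulting queries sequentially retrieves the same data as one large query, without any single request hitting the timeout.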

Compensation System

Created a compensation calculation system with the following considerations:
• SOX compliance
• Separation of roles
• Microsoft SQL Server back end
• Power BI paginated reports for compensation owners to verify the numbers
• C# interface to input parameters and retrigger compensation calculation jobs

SSIS Custom Component to Trigger Informatica Linear Taskflows

https://github.com/LeMarto/ILTTSSISTask
One of the many ETLs I created uploads information to Einstein Analytics. I enabled this by leveraging the existing Informatica installation: this small SSIS component triggers the Informatica linear taskflow that loads the data from our database into Einstein Analytics.

Einstein Analytics Dataflow Overhaul

One of the main issues with the out-of-the-box dataflow generators that Einstein Analytics uses to start up a new instance is that they create big spaghetti dataflows that are extremely difficult to maintain. In addition, because of the nature of our configuration, security predicates needed to be configured on all datasets.

To address this, I split our Einstein Analytics dataflows into tiers:

Tier 1: Generates master data objects, such as opportunity, account, and partner. The main rule is that there are no augments based on EdgeMarts.

Tier 2: Feeds only from EdgeMarts; that way, you don't have to reinvent the wheel every time you need to augment information from a standard object.

File Watcher System

https://github.com/LeMarto/FileWatcher
One of the many types of data sources we handle is flat files (CSV). Our flows usually extract the data from the CSV and insert it into a table with the correct data types. Power users access this table in a centralized way to create multiple reports. Some users upload these files into a shared Windows folder.

Originally, these files were picked up by a SQL Agent job that would constantly check if a folder contained the file. The problem with this approach was that we would end up with a job log stating that a step took two days to run, which was technically true, but the actual processing time was minutes.

This discrepancy led me to create a C# Windows service that, based on a JSON configuration file, listens to a set of folders and triggers a SQL Agent job whenever a new file is detected. With this solution, the reported execution times are accurate.
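The core detection logic can be sketched in Python (the real service is C# and uses filesystem events rather than polling snapshots; `jobs_to_trigger` and the config keys are illustrative, not the service's actual names):

```python
import json
import os

def load_config(path):
    """Read the folder-to-job mapping, e.g. [{"folder": "...", "job": "..."}]."""
    with open(path) as f:
        return json.load(f)

def jobs_to_trigger(config, snapshots):
    """Compare each watched folder against its last snapshot and return
    the SQL Agent jobs whose folders received new files. The real
    service would then start each returned job."""
    triggered = []
    for entry in config:
        current = set(os.listdir(entry["folder"]))
        if current - snapshots.get(entry["folder"], set()):
            triggered.append(entry["job"])
        snapshots[entry["folder"]] = current
    return triggered
```

The event-driven C# version avoids even this lightweight polling loop, which is why the SQL Agent job only runs, and only logs time, when there is actually a file to process.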

SSAS Trace System

https://github.com/LeMarto/Trace
A C# Windows service that traces SSAS Tabular 2012 transactions, based on the 2012 version of ASTrace (https://github.com/microsoft/Analysis-Services/tree/master/AsTrace). My version performs the database inserts through a more secure stored procedure. We don't need the text of the query being executed, so it isn't stored, but this can be changed easily.

Data Source Completion Notification System

Many of the tabular data models we create feed from multidimensional cubes developed by other areas. These cubes refresh daily, but at a time that varies greatly from day to day. Because of this, a scheduled refresh of our data models was out of the question, as it would often run before the source system had refreshed.

This restriction led me to build a database solution and an SSIS ETL that queries the last refresh date of each source every ten minutes. Whenever an update to a specific data source is detected, a meta job containing all the SQL Agent jobs that depend on that data source is triggered.

This system also includes a series of stored procedures used to create jobs programmatically and hook regular jobs up to the meta job of each relevant data source. The result is a system that loads everything in the correct order without human intervention.
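The polling decision itself is simple; a minimal Python sketch (the production version lives in SQL and SSIS, and the source names, timestamp format, and `meta_jobs_to_trigger` are illustrative):

```python
def meta_jobs_to_trigger(previous, current, dependencies):
    """Return the jobs to start: one batch of dependent SQL Agent jobs
    per data source whose last-refresh timestamp advanced since the
    previous ten-minute poll."""
    to_trigger = []
    for source, refreshed_at in current.items():
        if refreshed_at > previous.get(source, ""):
            to_trigger.extend(dependencies.get(source, []))
    return to_trigger

# One source cube refreshed since the last poll, so both of its
# dependent model refreshes are queued.
jobs = meta_jobs_to_trigger(
    {"FinanceCube": "2021-05-01T06:10"},
    {"FinanceCube": "2021-05-02T07:45"},
    {"FinanceCube": ["Refresh Sales Model", "Refresh Margin Model"]},
)
```

Each poll then stores the current timestamps as the new baseline, so a source triggers its meta job exactly once per refresh.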

SSAS Tabular Enablement

I installed five different servers to enable my area to generate SSAS Tabular models:
1. SSAS 2012 in tabular mode with Power BI Gateway (production)
2. SSAS 2012 in tabular mode with Power BI Gateway (development)
3. SQL Server 2012 with SSIS (production)
4. SQL Server 2012 with SSIS (development)
5. TFS 2012

All the service accounts were configured with Active Directory Users that had SPNs configured to enable Kerberos delegation. The SQL Servers are used as a back end to the SSAS servers. This allows us to create more complex cubes. I also implemented a small DevOps strategy to deploy data models into production by leveraging the TFS server. Finally, the Power BI Gateway enables us to create reports against on-premise cubes.

SharePoint 2013 Business Intelligence Implementation

Installed a three-tier SharePoint 2013 farm (database, application, and front end), leveraging multiple technologies to create a portal that centralizes all our business intelligence needs. This enabled my team to transform multiple Excel reports, originally refreshed manually on a daily basis, into an automated reporting system, allowing our reporting output to scale significantly.

Technologies:
• PowerPivot
• Power View
• Reporting Services
• Search services
• Kerberos delegation

Education

2003 - 2008

Bachelor's Degree in Computer Science

Universidad Argentina de la Empresa (UADE) - Buenos Aires, Argentina

Skills

Tools

Microsoft Power BI, Power Pivot, TFS, Power View

Paradigms

OLAP, DevOps, ETL, Business Intelligence (BI)

Platforms

Windows, SharePoint 2013

Storage

SQL Server Integration Services (SSIS), Databases, SQL Server Analysis Services (SSAS), SQL Server DBA, Microsoft SQL Server, SSAS Tabular, Database Management Systems (DBMS), SQL Server 2012, JSON, SQL Server Agent

Languages

C#, Python, JavaScript

Frameworks

Windows PowerShell

Other

Data Engineering, Programming, Computer Science, SAP Business Intelligence (BI), Einstein Analytics, SSIS Custom Components, Software Design, Kerberos, Team Management, SSRS Reports, Process Automation, Key Performance Indicators (KPIs), Multidimensional Expressions (MDX), Excel 365
