Carlos Muñoz, Developer in Aguascalientes, Mexico

Carlos Muñoz

Verified Expert in Engineering

Bio

Carlos is a data engineer and analyst with nine years of experience working mainly with the Microsoft technology stack; his expertise covers SQL Server, SSIS/ADF, advanced T-SQL, and the Azure platform. He has designed, developed, and maintained numerous custom data solutions, from infrastructure to final reporting, for Fortune 500 companies in the IT, finance, and sales industries. On every project, Carlos focuses on understanding the client's data and delivering business value with his solutions.

Portfolio

eSage Group
Azure SQL, Azure Data Lake, Azure Data Factory, Databricks, Azure Key Vault...
Clearco
Python 3, Python, Pandas, GitHub, Google Cloud, Google Cloud Platform (GCP)...
eSage Group
T-SQL (Transact-SQL), SharePoint, C#, Windows PowerShell, Azure...

Experience

Availability

Part-time

Preferred Environment

SQL, ETL Tools, Azure, Data Warehousing, Data, Databases

The most amazing...

...solution I've developed is a scanning-and-profiling tool for benchmarking Microsoft security competitors.

Work Experience

Senior Data Engineer

2020 - PRESENT
eSage Group
  • Developed a warehouse solution using the Microsoft tech stack for a scanning-and-profiling tool that benchmarks Microsoft security competitors.
  • Designed a weekly process to ingest scanning data, compare week-over-week results, and manage automatic alerts.
  • Analyzed data to provide insights into customer wins and losses and to find patterns for training a future machine learning predictive model.
  • Worked on the discovery, design, and development of a Microsoft Azure-based reporting-and-analytics data platform serving the dashboard needs of an international chocolate manufacturer.
  • Developed Azure Data Factory pipelines that transform external data using data flows and Databricks notebooks, plus an auditing procedure with custom logs and automatic mail notifications on Azure Logic Apps.
Technologies: Azure SQL, Azure Data Lake, Azure Data Factory, Databricks, Azure Key Vault, Azure Logic Apps, Microsoft Power Apps, Microsoft Flow, SQL, DAX, Database Design, JSON, ETL Tools, Azure, Data Warehousing, Data, Databases, Programming, SQL Server BI, SQL Server Integration Services (SSIS), APIs, T-SQL (Transact-SQL), SQL Stored Procedures, SQL Server Agent, Data Warehouse Design, Azure Virtual Machines, Microsoft Power BI, Stored Procedure, ETL, Data Modeling, DB, Microsoft SQL Server, Data Architecture, Data Analytics, Data Visualization, Data Cleaning, Data Engineering, Database Modeling, Pipelines

Data Integration Specialist

2021 - 2022
Clearco
  • Developed Python scripts to integrate various third-party HR products and automate Okta user creation. Used the pandas library for data manipulation and consumed REST APIs to ingest and submit data to the systems.
  • Deployed the scripts on a CentOS virtual machine in the company's GCP environment. Managed all the cloud configuration, service account creation, shell scripting, cron jobs, and logs.
  • Migrated some of the scripts to a serverless solution using Google Cloud Functions and GitHub Actions to implement a CI/CD workflow.
Technologies: Python 3, Python, Pandas, GitHub, Google Cloud, Google Cloud Platform (GCP), CentOS, Google Cloud Functions, Okta, APIs, SFTP, GitHub Actions, Dayforce HCM, Data Cleaning, Data Engineering

Project Manager | Business Intelligence Engineer

2017 - 2020
eSage Group
  • Planned, designed, and deployed the data architecture and strategy for the Microsoft US BSO team based on the fiscal year customer requirements and aligned to the rhythm of the business and the business review cycles.
  • Oversaw the development team's implementation of the SQL Server data warehouse, the SSIS/ADF ETL solution, and the various reporting end-user solutions deployed on Excel, PowerApps, Power BI, and custom C# MVC web applications.
  • Developed a target cascade model that impacted seller bonuses and the geographic territory performance of sales units. Using the largest remainder method in T-SQL, the model calculated and distributed customer-win projections based on n custom criteria (a simplified sketch follows this role).
  • Managed project activities and scope changes in the reporting and data model work pipeline, and implemented actions that brought value to the client and optimized resources.
  • Drove, as a personal initiative, the gradual migration of all on-premises infrastructure to Azure to improve performance and reduce costs, since most upstream sources already lived in Azure Data Lake and other Azure resources.
Technologies: T-SQL (Transact-SQL), SharePoint, C#, Windows PowerShell, Azure, Azure Virtual Machines, Azure Data Lake, SQL Server Integration Services (SSIS), Microsoft Power BI, Microsoft SQL Server, DB, Stored Procedure, Data Engineering, Database Administration (DBA), Data Visualization, Data Architecture, Data Analytics, Data Cleaning, Spreadsheets, Database Modeling, Pipelines
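
The target cascade bullet above refers to the largest remainder method; the following T-SQL is a minimal sketch of that apportionment under assumed, illustrative names (@Units, SalesUnit, Weight), not the project's actual code. It floors each unit's proportional share of an integer target and hands the leftover units to the rows with the largest fractional remainders, so the allocations always sum to the target.

    -- Hypothetical input: one row per sales unit with its weighting criterion.
    DECLARE @Target int = 1000;                        -- total customer wins to distribute
    DECLARE @Units TABLE (SalesUnit varchar(20), Weight decimal(18,6));
    INSERT INTO @Units VALUES ('East', 3.2), ('West', 1.1), ('Central', 2.7);

    WITH Shares AS (
        SELECT SalesUnit,
               RawShare   = @Target * Weight / SUM(Weight) OVER (),  -- exact proportional share
               FloorShare = FLOOR(@Target * Weight / SUM(Weight) OVER ())
        FROM @Units
    ),
    Ranked AS (
        SELECT SalesUnit, RawShare, FloorShare,
               Rnk = ROW_NUMBER() OVER (ORDER BY RawShare - FloorShare DESC)  -- largest remainders first
        FROM Shares
    )
    SELECT SalesUnit,
           Allocation = FloorShare
                      + CASE WHEN Rnk <= @Target - (SELECT SUM(FloorShare) FROM Ranked)
                             THEN 1 ELSE 0 END         -- leftover units go to the largest remainders
    FROM Ranked;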

Business Intelligence Engineer

2015 - 2017
eSage Group
  • Maintained and developed ETL processes and containers on SSIS, which included monitoring, troubleshooting, and facilitating escalations of data source issues.
  • Composed T-SQL scripts to recreate dimension and fact tables for the data warehouse (a simplified example follows this role).
  • Implemented a process to refresh, publish, and audit preliminary and final month-end MBR tools, KPI reports, executive dashboards, forecast tools, and management reports for the Microsoft US business operations and solutions team.
  • Developed and implemented a web application tool used in the MBR process to gather quarterly forecasts to generate KPIs and dashboards at different segment levels.
  • Built and maintained Excel-based reports and data models using Power Pivot and VBA Macros. The models ask for user input, process data based on business rules through the macros, and send the data to a final relational database table.
Technologies: T-SQL (Transact-SQL), SQL Stored Procedures, SQL Server Agent, IIS, SQL Server Analysis Services (SSAS), SQL Server Integration Services (SSIS), Data Warehouse Design, Power Pivot, Database Administration (DBA), Data Architecture, Data Analytics, Data Visualization, Excel VBA, Excel Macros, Excel 365, Visual Basic, Database Modeling
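
As a simplified, hypothetical example of the dimension and fact rebuild scripts mentioned above (the dw and staging schemas, DimCustomer, and FactSales are assumed names, and the real scripts handled far more tables and logic):

    -- Reload one dimension and one fact table from staging; assumes foreign keys
    -- are not enforced in the warehouse, as is common for this reload pattern.
    TRUNCATE TABLE dw.FactSales;
    TRUNCATE TABLE dw.DimCustomer;

    INSERT INTO dw.DimCustomer (CustomerKey, CustomerName, Segment)
    SELECT CustomerID, CustomerName, Segment
    FROM staging.Customers;

    INSERT INTO dw.FactSales (DateKey, CustomerKey, Revenue)
    SELECT CONVERT(int, CONVERT(char(8), s.OrderDate, 112)),   -- yyyymmdd date key
           s.CustomerID,
           s.Revenue
    FROM staging.Sales AS s;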

Development Team Lead

2012 - 2015
Softtek
  • Project-managed, developed, and maintained two web application portals on .NET MVC technology and SQL Server. The apps centralized security-related information from several tools and services while displaying dashboards for security leaders.
  • Developed the back-end data solution based on on-premise SQL Server and SSIS ETL tasks; it ingested data from various security services and tools into a database and then sent it to a reporting database for the web app and external connections.
  • Implemented the Agile methodology based on Scrum to support the quick adoption of new security tools, data ingestion, and turnaround of the final dashboards based on biweekly feedback from the security leaders.
Technologies: .NET, Web MVC, SQL Server BI, SQL Server Integration Services (SSIS), APIs, Data Warehousing

Lead Security Analyst

2009 - 2012
Softtek
  • Served as a project manager for the team and implemented the scanning and automatic testing of the applications; this included mentoring junior auditors and coordinating new initiatives and research to give additional value to customer services.
  • Actively participated in penetration testing engagements for external environments of GE Capital Europe which included the banking business, focusing on application vulnerability detection.
  • Led security assessments of web, client/server, and mainframe applications of diverse technologies and frameworks with gray-box and black-box techniques.
Technologies: Penetration Testing, Project Management, QualysGuard, Python, HP WebInspect

Security Auditor

2007 - 2009
Softtek
  • Performed vulnerability assessments focusing on detecting critical and high issues for GE's internal web applications.
  • Researched vulnerability assessment techniques and security tools for use in the internal process.
  • Developed a web application tool to manage the vulnerability assessment findings and generate the final report for the client.
Technologies: Ethical Hacking, Application Security, Vulnerability Assessment

Security Competitor Portal

I developed a back-end solution using the Microsoft tech stack for a scanning-and-profiling tool that benchmarks Microsoft security competitors. The work included a weekly process of automatic scans that were processed in AWS and transferred to Azure Data Lake using Azure Data Factory. After the data was cleaned and transformed, it was ingested into a SQL Server database with a star-schema data warehouse model.

We then used stored procedures with complex logic to run a week-over-week comparison, add new domains and accounts, and flag alerts that triggered an Azure Logic Apps API to send email notifications. Finally, the data was pushed to an Azure database for a custom web application to consume, producing dashboards that let sellers and account executives input additional profiling information.
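
The week-over-week comparison lived in stored procedures; the following is a condensed, hypothetical sketch of that shape (the dw schema, FactScan, ScanAlert, and usp_FlagWeeklyAlerts names are illustrative, not the project's actual objects):

    -- Compare this week's scan results against last week's per domain and
    -- record alerts; a downstream step reads dw.ScanAlert and calls the
    -- Azure Logic Apps endpoint that sends the email notifications.
    CREATE PROCEDURE dw.usp_FlagWeeklyAlerts
        @CurrentWeek date
    AS
    BEGIN
        WITH CurrentWeek AS (
            SELECT Domain, SecurityScore
            FROM dw.FactScan
            WHERE WeekStart = @CurrentWeek
        ),
        PriorWeek AS (
            SELECT Domain, SecurityScore
            FROM dw.FactScan
            WHERE WeekStart = DATEADD(week, -1, @CurrentWeek)
        )
        INSERT INTO dw.ScanAlert (Domain, ScoreDelta, AlertType)
        SELECT c.Domain,
               c.SecurityScore - ISNULL(p.SecurityScore, 0),
               CASE WHEN p.Domain IS NULL THEN 'NEW_DOMAIN'   -- first week this domain appears
                    ELSE 'SCORE_DROP'                         -- week-over-week regression
               END
        FROM CurrentWeek AS c
        LEFT JOIN PriorWeek AS p ON p.Domain = c.Domain
        WHERE p.Domain IS NULL OR c.SecurityScore < p.SecurityScore;
    END;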

Data Platform for an International Company

I was part of the design and development of a modern Azure data warehouse solution for an international chocolate manufacturer.

The platform provides a scalable, maintainable way to begin migrating independent subsidiaries' data workloads from various on-premises solutions and external tools.

I also contributed to the data architecture proposal, based on Azure Data Lake Storage Gen2 with a raw, clean, and reporting data layer model. I used Azure Data Factory for the ingestion, transformation, and orchestration of data between layers using the data flow component. We then built a final star-schema relational database model for live reporting on Azure SQL databases, allowing connections from Power BI dashboards and ad hoc views for data analyst requirements.

In addition, I developed the pipelines for third-party external market share sources delivered as exported CSV files. Besides the ETL logic and process, I created the validation and auditing procedure, with custom logs stored in an Azure SQL database and automatic mail notifications triggered by ADF through Azure Logic Apps.
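
The custom logging might look roughly like the following T-SQL, shown only as a hedged sketch; the audit schema, PipelineLog table, and usp_LogPipelineRun procedure are assumed names rather than the platform's actual objects:

    -- Custom log table populated by the pipelines.
    CREATE TABLE audit.PipelineLog (
        LogId        int IDENTITY(1,1) PRIMARY KEY,
        PipelineName varchar(200)  NOT NULL,
        RunId        varchar(100)  NOT NULL,      -- ADF pipeline run ID passed in as a parameter
        Status       varchar(20)   NOT NULL,      -- e.g., 'Succeeded', 'Failed', 'ValidationError'
        RowsLoaded   int           NULL,
        Message      nvarchar(max) NULL,
        LoggedAt     datetime2     NOT NULL DEFAULT SYSUTCDATETIME()
    );
    GO

    -- Called from an ADF Stored Procedure activity at the end of each run;
    -- a failure status is what triggers the Logic Apps mail notification downstream.
    CREATE PROCEDURE audit.usp_LogPipelineRun
        @PipelineName varchar(200),
        @RunId        varchar(100),
        @Status       varchar(20),
        @RowsLoaded   int = NULL,
        @Message      nvarchar(max) = NULL
    AS
    BEGIN
        INSERT INTO audit.PipelineLog (PipelineName, RunId, Status, RowsLoaded, Message)
        VALUES (@PipelineName, @RunId, @Status, @RowsLoaded, @Message);
    END;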

Scorecard Reporting and Forecast Input Tool

The tool is a custom .NET MVC C# web application designed to be a one-stop-shop portal for managing the US monthly business review process around the financial scorecard KPIs of an international technology company. I was the project manager and technical lead of a four-person team that designed, developed, and maintained the solution.

My main activity was designing the back-end architecture to handle the monthly scorecard data, which came from multiple data sources and was processed by an ETL solution using a hybrid approach of Integration Services and Azure Data Factory pipelines.

I created the SQL Server algorithm that handles the forecast inputs submitted through the web application. The program gathered forecasts at three different levels of granularity, rolled the numbers up across all levels, and updated different measures depending on the specific logic of each KPI.
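
As a rough illustration of that roll-up, the query below aggregates hypothetical forecast inputs from the lowest granularity (sales unit) up to segment and company level with GROUPING SETS; the forecast schema and column names are assumptions, and the real logic applied KPI-specific rules rather than a simple sum:

    SELECT
        FiscalMonth,
        KpiName,
        Segment   = COALESCE(Segment, 'All segments'),
        SalesUnit = COALESCE(SalesUnit, 'All sales units'),
        Forecast  = SUM(ForecastValue)
    FROM forecast.ForecastInput
    GROUP BY GROUPING SETS (
        (FiscalMonth, KpiName, Segment, SalesUnit),   -- as entered
        (FiscalMonth, KpiName, Segment),              -- rolled up to segment
        (FiscalMonth, KpiName)                        -- rolled up to company level
    );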

I was also involved in the design and creation of Power BI reports that were later embedded into the web portal to provide drill-down capability for specific metrics and sales-unit details.

Education

2005 - 2009

Bachelor's Degree in Information Technologies

ITESM Campus Aguascalientes - Aguascalientes, Mexico

Certifications

AUGUST 2018 - PRESENT

Microsoft Partner Competency: Data Analytics

Microsoft

OCTOBER 2010 - OCTOBER 2012

Certified Ethical Hacker

EC-Council

Libraries/APIs

Pandas

Tools

SQL Server BI, Power Pivot, Microsoft Excel, Microsoft Power BI, Azure Logic Apps, Microsoft Power Apps, Microsoft Flow, HP WebInspect, Azure Key Vault, Azure HDInsight, GitHub, Spreadsheets

Languages

SQL, T-SQL (Transact-SQL), Stored Procedure, C#, Excel VBA, Visual Basic, Java, C, Python, Python 3

Paradigms

Database Design, ETL, ETL Implementation & Design, Business Intelligence (BI), Penetration Testing

Platforms

Azure, SharePoint, QualysGuard, Databricks, Google Cloud Platform (GCP), CentOS

Storage

Databases, SQL Server Integration Services (SSIS), SQL Stored Procedures, SQL Server Agent, SQL Server Analysis Services (SSAS), Azure SQL, Microsoft SQL Server, DB, JSON, Data Pipelines, SQL Server DBA, MySQL, PostgreSQL, Database Administration (DBA), Database Performance, Data Lakes, Google Cloud, Database Modeling

Industry Expertise

Project Management

Frameworks

.NET, Windows PowerShell

Other

ETL Tools, Data Warehousing, Data, Programming, APIs, Data Warehouse Design, Azure Data Factory, Data Engineering, Data Reporting, ETL Development, Data Modeling, Excel 365, Pipelines, IT Security, Application Security, Vulnerability Assessment, IIS, Azure Virtual Machines, Azure Data Lake, DAX, Certified Ethical Hacker (CEH), Dashboards, Data Architecture, Data Analytics, Data Visualization, Excel Macros, Networks, Cisco, Web Development, Ethical Hacking, Web MVC, Google Cloud Functions, Okta, SFTP, GitHub Actions, Dayforce HCM, Data Cleaning
