Verified Expert in Engineering
Data Engineer and Developer
Carlos is a data engineer and analyst with nine years of experience, working mainly with the Microsoft technology stack; he has expertise in SQL Server, SSIS/ADF, advanced T-SQL, and the Azure platform. He has designed, developed, and maintained numerous custom data solutions, from infrastructure to final reporting, for Fortune 500 companies in the IT, finance, and sales industries. On every project, Carlos focuses on understanding the client's data and delivering business value with his solutions.
SQL, ETL Tools, Azure, Data Warehousing, Data, Databases
The most amazing...
...solution I've developed is a scanning-and-profiling tool for benchmarking Microsoft security competitors.
Senior Data Engineer
- Developed a warehouse solution using the Microsoft tech stack for a scanning-and-profiling tool that benchmarks Microsoft security competitors.
- Designed a weekly process to ingest scanning data, compare week-over-week results, and manage automatic alerts.
- Analyzed data to provide insights into customer wins and losses and to find patterns for training a future machine-learning predictive model.
- Worked on the discovery, design, and development of a Microsoft Azure-based reporting-and-analytics data platform serving the dashboard needs of an international chocolate manufacturer.
- Developed Azure Data Factory pipelines that transform external data using data flows and Databricks notebooks, along with an auditing procedure with custom logs and automatic email notifications via Azure Logic Apps.
Data Integration Specialist
- Developed Python scripts to integrate third-party HR products and Okta user creation, using the pandas library for data manipulation and REST APIs to ingest and submit data into the systems.
- Deployed the scripts on a CentOS virtual machine in the company's GCP environment, managing all cloud configuration, service account creation, shell scripting, cron jobs, and logging.
- Migrated some of the scripts to a serverless solution using Google Cloud Functions and GitHub Actions to implement a CI/CD workflow.
Project Manager | Business Intelligence Engineer
- Planned, designed, and deployed the data architecture and strategy for the Microsoft US BSO team based on the fiscal year customer requirements and aligned to the rhythm of the business and the business review cycles.
- Oversaw the development team's implementation of the SQL Server data warehouse, the SSIS/ADF ETL solution, and the various end-user reporting solutions deployed on Excel, PowerApps, Power BI, and custom C# MVC web applications.
- Developed a target cascade model that influenced seller bonuses and the geographic territory performance of sales units. Using the largest remainder method in T-SQL, the model calculated and distributed customer win projections based on multiple custom criteria.
- Managed project activities and the work pipeline scope changes on the reporting and data models and implemented actions that brought value to the client and optimized resources.
- As a personal initiative, operationalized the gradual migration of all on-premises infrastructure to Azure to improve performance and reduce costs, since most upstream sources live in Azure Data Lake and other Azure resources.
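The largest remainder method behind the target cascade model can be sketched as follows. This is a minimal, hypothetical Python illustration (the actual implementation was in T-SQL), and the weights and target total are invented for the example:

```python
from math import floor

def largest_remainder(weights, total):
    """Distribute an integer total across units proportionally to weights,
    assigning leftover units to the largest fractional remainders."""
    # Exact (fractional) share for each unit
    exact = [w * total / sum(weights) for w in weights]
    # Start from the floor of each share
    alloc = [floor(x) for x in exact]
    # Hand out the remaining units to the largest remainders
    leftover = total - sum(alloc)
    order = sorted(range(len(weights)),
                   key=lambda i: exact[i] - alloc[i], reverse=True)
    for i in order[:leftover]:
        alloc[i] += 1
    return alloc

# Example: distribute a projection of 100 wins across three territories
print(largest_remainder([5, 3, 1], 100))  # -> [56, 33, 11]
```

The advantage of this method is that the allocations always sum exactly to the target, with rounding error spread to the units that are closest to the next whole unit.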
Business Intelligence Engineer
- Maintained and developed ETL processes and containers on SSIS, which included monitoring, troubleshooting, and facilitating escalations of data source issues.
- Composed T-SQL scripts to recreate dimension and fact tables for the data warehouse.
- Implemented a process to refresh, publish, and audit preliminary and final month-end MBR tools, KPI reports, executive dashboards, forecast tools, and management reports for the Microsoft US business operations and solutions team.
- Developed and implemented a web application tool used in the MBR process to gather quarterly forecasts to generate KPIs and dashboards at different segment levels.
- Built and maintained Excel-based reports and data models using Power Pivot and VBA macros. The models prompt for user input, process the data through the macros according to business rules, and send the results to a final relational database table.
Development Team Lead
- Project-managed, developed, and maintained two web application portals on .NET MVC technology and SQL Server. The apps centralized security-related information from several tools and services while displaying dashboards for security leaders.
- Developed the back-end data solution based on on-premises SQL Server and SSIS ETL tasks; it ingested data from various security services and tools into a database and then fed a reporting database for the web app and external connections.
- Implemented the Agile methodology based on Scrum to support the quick adoption of new security tools, data ingestion, and turnaround of the final dashboards based on biweekly feedback from the security leaders.
Lead Security Analyst
- Served as project manager for the team and implemented application scanning and automated testing; this included mentoring junior auditors and coordinating new initiatives and research to add value to client services.
- Actively participated in penetration testing engagements for external environments of GE Capital Europe, including its banking business, focusing on application vulnerability detection.
- Led security assessments of web, client/server, and mainframe applications of diverse technologies and frameworks with gray-box and black-box techniques.
- Performed vulnerability assessments focusing on detecting critical and high issues for GE's internal web applications.
- Researched vulnerability assessment techniques and security tools for use in the internal process.
- Developed a web application tool to manage the vulnerability assessment findings and generate the final report for the client.
Security Competitor Portal
We used stored procedures with complex logic to perform week-over-week comparisons, add new domains and accounts, and flag alerts that trigger an Azure Logic Apps API to send email notifications. Finally, the data was sent to an Azure database consumed by a custom web application, which produces dashboards and lets sellers and account executives input additional profiling information.
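In spirit, the week-over-week comparison works like the Python sketch below. The field names, scores, and alert threshold are hypothetical; the production logic lives in T-SQL stored procedures:

```python
def compare_weeks(previous, current, alert_threshold=0.2):
    """Compare two weekly scan snapshots keyed by domain.
    Returns newly seen domains and domains whose score moved enough to alert."""
    new_domains = [d for d in current if d not in previous]
    alerts = []
    for domain, score in current.items():
        if domain in previous and previous[domain] > 0:
            change = abs(score - previous[domain]) / previous[domain]
            if change >= alert_threshold:
                alerts.append(domain)  # would trigger the Logic Apps email API
    return new_domains, alerts

prev = {"contoso.com": 80, "fabrikam.com": 60}
curr = {"contoso.com": 100, "fabrikam.com": 62, "adatum.com": 70}
print(compare_weeks(prev, curr))  # -> (['adatum.com'], ['contoso.com'])
```

In the real pipeline, the "new domains" set is inserted into the warehouse and the "alerts" set drives the automatic email notifications.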
Data Platform for an International Company
The platform provides a scalable, maintainable way to begin migrating the independent subsidiaries' data workloads from disparate on-premises solutions and external tools.
I also contributed to the data architecture proposal, based on Azure Data Lake Storage Gen2 with raw, clean, and reporting data layers. I used Azure Data Factory for ingestion, transformation, and orchestration of data between layers via the data flow component. We then built a final star-schema relational database model for live reporting on Azure SQL databases, allowing connections from Power BI dashboards and ad hoc views for data analyst requirements.
In addition, I developed the pipelines for third-party external market share sources delivered as exported CSV files. Beyond the ETL logic, I created a validation-and-auditing procedure with custom logs stored in an Azure SQL database and automatic email notifications triggered by ADF through Azure Logic Apps.
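The validate-then-log pattern can be illustrated with the hypothetical Python sketch below. Column names, statuses, and the in-memory log are invented; the real procedure writes to an Azure SQL audit table and ADF triggers the Logic Apps email when a run needs review:

```python
import csv
import datetime
import io

def validate_and_log(csv_text, required_columns, log):
    """Validate a third-party CSV export and append one audit record per run.
    Rows missing any required field are counted as failures."""
    reader = csv.DictReader(io.StringIO(csv_text))
    ok, failed = 0, 0
    for row in reader:
        if all(row.get(c) not in (None, "") for c in required_columns):
            ok += 1
        else:
            failed += 1
    log.append({
        "run_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "rows_ok": ok,
        "rows_failed": failed,
        # a NEEDS_REVIEW status is what would trigger the email notification
        "status": "SUCCESS" if failed == 0 else "NEEDS_REVIEW",
    })
    return ok, failed

audit_log = []
sample = "market,share\nEMEA,0.31\nAPAC,\n"
print(validate_and_log(sample, ["market", "share"], audit_log))  # -> (1, 1)
```

Keeping one audit row per run makes week-over-week data-quality trends queryable alongside the data itself.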
Scorecard Reporting and Forecast Input Tool
My main activity was designing the back-end architecture to handle the monthly scorecard data coming from multiple sources, processed by a hybrid ETL solution built on Integration Services and Azure Data Factory pipelines.
I created the SQL Server algorithm that handles the forecast inputs submitted through the web application. The program could gather forecasts at three levels of granularity, roll the numbers up across all levels, and update different measures depending on each KPI's specific logic.
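The roll-up idea can be sketched in a few lines of Python. This is a simplified, hypothetical illustration (the production version was a T-SQL algorithm), and the segment hierarchy and numbers are invented:

```python
from collections import defaultdict

def roll_up(forecasts, hierarchy):
    """Roll leaf-level forecast inputs up a segment hierarchy.
    forecasts: {leaf_segment: value}; hierarchy: {child: parent}."""
    totals = defaultdict(float)
    for segment, value in forecasts.items():
        totals[segment] += value
        # Walk up the hierarchy, adding the leaf value at every ancestor level
        parent = hierarchy.get(segment)
        while parent is not None:
            totals[parent] += value
            parent = hierarchy.get(parent)
    return dict(totals)

hierarchy = {"West": "US", "East": "US", "US": "Worldwide"}
leaf_inputs = {"West": 120.0, "East": 80.0}
print(roll_up(leaf_inputs, hierarchy))
```

Because every level is derived from the leaf inputs, a forecast submitted at any granularity stays consistent with the totals shown at the levels above it.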
I was also involved in designing and creating Power BI reports, which were later embedded into the web portal to provide drill-down capability for specific metrics and sales-unit details.
SQL, T-SQL (Transact-SQL), Stored Procedure, C#, Excel VBA, Visual Basic, Java, C, Python, Python 3
SQL Server BI, Power Pivot, Microsoft Excel, Microsoft Power BI, Azure Logic Apps, Microsoft Power Apps, Microsoft Flow, HP WebInspect, Azure Key Vault, Azure HDInsight, GitHub, Spreadsheets
Database Design, ETL, ETL Implementation & Design, Business Intelligence (BI), Penetration Testing
Azure, SharePoint, QualysGuard, Databricks, Google Cloud Platform (GCP), CentOS
Databases, SQL Server Integration Services (SSIS), SQL Stored Procedures, SQL Server Agent, SQL Server Analysis Services (SSAS), Azure SQL, Microsoft SQL Server, DB, JSON, Data Pipelines, SQL Server DBA, MySQL, PostgreSQL, Database Administration (DBA), Database Performance, Data Lakes, Google Cloud, Database Modeling
ETL Tools, Data Warehousing, Data, Programming, APIs, Data Warehouse Design, Azure Data Factory, Data Engineering, Data Reporting, ETL Development, Data Modeling, Excel 365, Pipelines, IT Security, Application Security, Vulnerability Assessment, IIS, Azure Virtual Machines, Azure Data Lake, DAX, Certified Ethical Hacker (CEH), Dashboards, Data Architecture, Data Analytics, Data Visualization, Excel Macros, Networks, Cisco, Web Development, Ethical Hacking, Web MVC, Google Cloud Functions, Okta, SFTP, GitHub Actions, Dayforce HCM, Data Cleaning
.NET, Windows PowerShell
Bachelor's Degree in Information Technologies
ITESM Campus Aguascalientes - Aguascalientes, Mexico
Microsoft Partner Competency: Data Analytics
Certified Ethical Hacker