Pablo Mombru, Developer in Buenos Aires, Argentina

Pablo Mombru

Verified Expert in Engineering

ETL Tools Developer

Location
Buenos Aires, Argentina
Toptal Member Since
June 18, 2020

Pablo has extensive experience in database programming and data management. He specializes in implementing complete data migration processes, from strategy design and data mapping through ETL process development, testing, and execution. He also has deep experience reviewing, diagnosing, and resolving database performance issues. He has worked in teams of all sizes.

Portfolio

Inkspace Analytics, LLC - B.S.D. Capital, Inc. dba Lendistry (via Toptal)
KNIME, KNIME Node Development, Data Analysis, Data Migration, ETL Tools...
Duckpin
Database Modeling, ETL, Data Cleansing, Data Migration Testing, Data Migration...
Qvantel
Oracle PL/SQL, Database Modeling, ETL Tools, Database Programming...

Experience

Availability

Part-time

Preferred Environment

Windows

The most amazing...

...thing I've implemented is a complete data migration of almost 20 million customers' data with no errors, in the expected timeframe.

Work Experience

SQL Expert/Data Analyst

2021 - PRESENT
Inkspace Analytics, LLC - B.S.D. Capital, Inc. dba Lendistry (via Toptal)
  • Created, adapted, tested, and executed ETL processes using the KNIME Analytics Platform to continuously migrate financial data from various sources, including a MySQL database, Salesforce CRM, and US Government APIs, into a PostgreSQL database hosted on AWS infrastructure.
  • Wrote complex queries to gather information from multiple sources, including MySQL, PostgreSQL, and Redshift databases hosted on AWS infrastructure.
  • Created and configured complex Data Build Tool (dbt) models to migrate data from relational databases like SQL Server and PostgreSQL to an AWS Redshift database, optimizing and fine-tuning dbt processes.
  • Created and implemented complex Apache Airflow pipelines to run dbt execution and testing tasks as part of the data migration processes (a rough sketch follows this entry).
  • Developed Python scripts to generate business reports in Excel format, extracting data from diverse data sources, and automated access to production databases.
  • Designed and developed complex Tableau workbooks tailored to business requirements, utilizing various data sources such as database tables, views, and Excel spreadsheets.
  • Managed and configured AWS tools such as Redshift database clusters, CloudWatch, S3 buckets, EC2 instances, and Secrets Manager.
  • Analyzed, diagnosed, and resolved production issues using tools such as Jira, Confluence, AWS CloudWatch, New Relic, and Chrome's browser inspection tools.
  • Implemented and executed Salesforce CRM outbound APIs to retrieve data, and configured and executed the Salesforce Bulk API for data upload tasks.
  • Executed SOQL commands in the Salesforce Developer Console and the Salesforce Workbench to pull data from instances.
Technologies: KNIME, KNIME Node Development, Data Analysis, Data Migration, ETL Tools, Salesforce API, SQL, MySQLdb, PostgreSQL, Amazon Web Services (AWS), Data Modeling, Salesforce SOQL/SOSL, Salesforce Bulk API, JSON, Python 3, Tableau, Apache Airflow, Data Build Tool (dbt), Data Engineering, Redshift, Issue Tracking, Excel 365, Cloud Applications
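As a rough illustration of the Airflow-orchestrated dbt runs mentioned in this entry, the sketch below chains a dbt run task and a dbt test task with BashOperator; the DAG ID, schedule, and dbt project path are hypothetical placeholders rather than the actual production configuration.

```python
# Minimal sketch of an Airflow DAG that runs dbt models and then dbt tests.
# The dag_id, schedule, and project path below are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/opt/dbt/migration_project"  # hypothetical dbt project location

with DAG(
    dag_id="dbt_data_migration",  # hypothetical name
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"cd {DBT_DIR} && dbt run --profiles-dir {DBT_DIR}",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"cd {DBT_DIR} && dbt test --profiles-dir {DBT_DIR}",
    )

    dbt_run >> dbt_test  # build the models first, then execute dbt tests
```

Chaining the test task after the run task means a failed model stops the pipeline before tests execute, which keeps failure alerts close to the step that caused them.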

Data Migration Specialist

2020 - 2021
Duckpin
  • Analyzed and identified source data for migration from diverse databases, including MySQL DB, MS Access DB, and CSV files.
  • Designed and modeled a PostgreSQL database as the destination database for the data migration based on customer requirements and source database structure.
  • Created a data mapping document to map source database data to new destination database tables and fields and developed a data migration strategy and plan to achieve the migration.
  • Exported data from both source databases using each database's export tool (MySQL and MS Access) and Pentaho PDI (Kettle) processes.
  • Imported the source data into staging tables in the PostgreSQL destination database through the database import tool and Pentaho PDI (Kettle) processes.
  • Created PL/pgSQL functions in the PostgreSQL destination database to adapt and import the staging data into the final core tables (a rough sketch follows this entry).
  • Collaborated on cleaning and normalizing the imported data as a post-migration process through PL/pgSQL functions created in the PostgreSQL destination database.
  • Documented the whole data migration process and strategy.
Technologies: Database Modeling, ETL, Data Cleansing, Data Migration Testing, Data Migration, Database Design, SQL, Microsoft Access, PostgreSQL, Data Engineering, Excel 365
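The Pentaho PDI transformations and PL/pgSQL functions themselves are not reproduced here; purely as an illustration of the staging-then-transform idea, the Python sketch below bulk-loads an exported CSV into a hypothetical staging table with psycopg2 and then calls a hypothetical PL/pgSQL function that populates the core tables.

```python
# Illustrative sketch of the staging-then-transform pattern described above.
# Connection details, staging.customers, and core.load_customers() are hypothetical.
import psycopg2

conn = psycopg2.connect(
    host="localhost", dbname="duckpin", user="etl", password="secret"  # placeholders
)

with conn, conn.cursor() as cur:
    # Bulk-load the exported CSV into a staging table.
    with open("customers_export.csv") as f:
        cur.copy_expert(
            "COPY staging.customers FROM STDIN WITH (FORMAT csv, HEADER true)", f
        )

    # Invoke a PL/pgSQL function that adapts staging rows into the final core tables.
    cur.execute("SELECT core.load_customers();")

conn.close()
```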

Data Migration Analyst

2017 - 2018
Qvantel
  • Created a data mapping document between the customer's legacy in-house system and Qvantel's BSS API-based destination system.
  • Created data profiling queries on the Oracle DB legacy data for quality checks using Talend profiling rules.
  • Designed and implemented a complete ETL process using Pentaho. The process extracted data from the legacy staging area, converted it to NoSQL destination structures, and generated JSON files as its outcome (a rough sketch follows this entry).
  • Conducted data migration requirements identification workshops with the customer.
  • Created Python programs for data reconciliation based on data migration results; as output, Excel files were generated with the totals of migrated and extracted records.
Technologies: Oracle PL/SQL, Database Modeling, ETL Tools, Database Programming, Data Migration, ETL, Oracle RDBMS, SQL Functions, SQL Stored Procedures, Database Architecture, SQL, Database Design, Python, Data Engineering, Python 3
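The actual extraction and conversion ran as Pentaho PDI transformations; purely as an illustration of the row-to-JSON idea, the Python sketch below reads rows from a hypothetical legacy staging table and writes one JSON document per customer (the table, columns, and output layout are made up).

```python
# Illustrative only: convert legacy staging rows into JSON documents.
# The staging table, columns, and output file are hypothetical; the real
# conversion was implemented as a Pentaho PDI transformation.
import json

import cx_Oracle  # assumes the Oracle client libraries are installed

conn = cx_Oracle.connect("etl_user", "secret", "legacy-db:1521/LEGACY")  # placeholder DSN
cur = conn.cursor()
cur.execute("SELECT customer_id, name, plan_code FROM stg_customers")

with open("customers.json", "w") as out:
    for customer_id, name, plan_code in cur:
        doc = {
            "customerId": customer_id,
            "name": name,
            "subscription": {"planCode": plan_code},
        }
        out.write(json.dumps(doc) + "\n")  # one JSON document per line

cur.close()
conn.close()
```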

Senior Principal Consultant

2014 - 2017
Oracle Chile
  • Conducted functional requirements-gathering workshops with the customer for the accounts, contacts, and order management Siebel CRM modules.
  • Designed business and technical solutions for accounts, contacts, and order management Siebel CRM functional modules based on gathered functional requirements.
  • Created inbound and outbound online interfaces based on SOAP web service WSDL definitions for the accounts and contacts functional modules.
  • Configured automatic account/representative assignments using the Siebel Assignment Manager tool, creating assignment rules based on rep availability, rep workload, and particular account attributes such as channel.
  • Configured business processes automation using Siebel Workflow processes for accounts and contacts creation functionality.
Technologies: Oracle PL/SQL, Database Modeling, ETL Tools, Database Programming, Data Migration, ETL, Oracle RDBMS, SQL Functions, SQL Stored Procedures, Database Architecture, SQL, Database Design, JavaScript, HTML, CSS, Siebel CRM

CRM Architect

2013 - 2014
Alliansys Argentina
  • Supported the ongoing operation of the Siebel CRM application's assets, contacts, and order management modules.
  • Executed Siebel administration tasks to keep the development, testing, and production repositories synchronized.
  • Handled performance issues caused by the number of concurrent open Siebel sessions by tuning Siebel Server parameters in the production environment.
  • Managed the effort estimation plan for Siebel CRM application maintenance, based on newly identified requirements and corrections to existing functionality.
Technologies: Oracle PL/SQL, Database Modeling, Database Programming, Oracle RDBMS, SQL, Database Design, Siebel CRM

IT Specialist

2012 - 2013
IBM Argentina
  • Configured Siebel PRM/IBM LDAP authentication interface by creating a connection profile in Siebel Server and corresponding binding users.
  • Resolved Siebel Partner cloud application performance issues by creating and modifying DB2 table indexes and tuning Siebel Server concurrency parameters for both the server and the database.
  • Handled Siebel Partner cloud application web service configuration issues caused by the application's poor performance.
  • Conducted data migration requirements gathering workshops with the customer.
  • Defined a Siebel CRM data migration strategy that covered the source and destination data systems and the best approach to export, convert, and adapt source data to Siebel CRM structures and load it into the destination structures.
Technologies: Oracle PL/SQL, Database Modeling, Database Programming, Data Migration, ETL, Oracle RDBMS, SQL Stored Procedures, Database Architecture, SQL, Database Design, Siebel CRM

CRM Architect

2011 - 2012
Capgemini Argentina
  • Handled technical design for Siebel's accounts, service requests, and order management business modules.
  • Adapted Siebel CRM order management's OOTB workflow processes in order to meet the customer's business requirements.
  • Handled online SOAP web services interface design and configuration based on WSDL definitions.
  • Improved performance by adjusting Siebel Server parameters related to concurrent server sessions.
  • Managed Siebel's patches installation on the Unix environment without affecting the regular operation of Siebel CRM.
Technologies: Oracle PL/SQL, Database Programming, Oracle RDBMS, SQL, Siebel CRM

CRM Architect

2010 - 2011
Cognizant Argentina
  • Handled the configuration of Siebel CRM Cloud production instances for different countries for the accounts, contacts, and assets modules.
  • Executed data import and export batch interfaces for Siebel CRM Cloud production instances.
  • Created programs in C# to query, insert, and update data in Siebel CRM Cloud production instances by calling Siebel's exposed web services through proxy objects created and used in the programs.
Technologies: Oracle PL/SQL, Database Modeling, Database Programming, Data Migration, Oracle RDBMS, SQL, Siebel CRM

Principal Consultant

2006 - 2010
Oracle Argentina
  • Conducted Siebel CRM functional requirements gathering workshops with customers for accounts, contacts, and revenue modules.
  • Created complex Siebel CRM automated processes for data management.
  • Handled the design and configuration of online SOAP web service interfaces for the Siebel CRM Accounts and Contacts modules; WSDL files were used to implement the interfaces that update data.
  • Defined a Siebel CRM data migration strategy that covered the source and destination data systems and the best approach to export, convert, and adapt source data to Siebel CRM structures and load it into the destination structures.
  • Executed a complete data migration of 20 million customers into Siebel CRM. The data included accounts data, related contacts, related addresses, and corresponding billing and service profiles.
Technologies: Oracle PL/SQL, Database Modeling, ETL Tools, Database Programming, Data Migration, ETL, Oracle RDBMS, SQL Functions, SQL Stored Procedures, Database Architecture, SQL, Database Design, JavaScript, HTML, CSS, Siebel CRM

Senior Consultant

2000 - 2006
Siebel Systems Argentina
  • Handled technical design for opportunities and service requests modules of Siebel CRM.
  • Created workflow processes and business services for Siebel CRM application implementation for opportunities and service requests modules functionality.
  • Configured different Siebel CRM user interface presentations for each of the four different bank brands by modifying Siebel's internal CSS, HTML, and JavaScript files.
  • Defined a Siebel CRM data migration strategy that covered the source and destination data systems and the best approach to export, convert, and adapt source data to Siebel CRM structures and load it into the destination structures.
  • Executed a complete data migration of 10 million customers' data into Siebel CRM. The migrated data included accounts data, related contacts, assets, orders, order items, and inventory information.
  • Handled Siebel CRM version upgrades on the Windows platform.
Technologies: Oracle PL/SQL, Database Modeling, ETL Tools, Database Programming, Data Migration, ETL, Oracle RDBMS, SQL Functions, SQL, JavaScript, HTML, CSS, Siebel CRM

ETL Processes for Financial Institution

I was in charge of creating, maintaining, testing, and executing several ETL processes for continuous financial data migration into a PostgreSQL data warehouse from different sources, including a MySQL database, Salesforce CRM, and US Government APIs.
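As a rough sketch of pulling source records from Salesforce for this kind of pipeline, the snippet below issues a SOQL query through the simple-salesforce Python client; the credentials, object, and field names are placeholders, and the production processes actually ran in KNIME.

```python
# Illustrative sketch: query Salesforce records with SOQL via simple-salesforce.
# Credentials, object, and field names are placeholders.
from simple_salesforce import Salesforce

sf = Salesforce(
    username="user@example.com",
    password="secret",
    security_token="token",
)

# query_all pages through the results automatically and returns a dict with 'records'.
result = sf.query_all(
    "SELECT Id, Name, CreatedDate FROM Account WHERE CreatedDate = LAST_N_DAYS:7"
)

for record in result["records"]:
    print(record["Id"], record["Name"])
```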

Database Developer for Cleansing, Transforming, and Migrating Data

https://www.rollwithduckpin.com/
I was in charge of:
• Analyzing and identifying the source data to be migrated from different source DBs such as MySQL DB, MS Access DB, and CSV files.
• Designing and modeling a PostgreSQL database as the destination database for the data migration based on customer requirements and source DB structure.
• Creating a data mapping document to map source DBs data to new destination DB tables and fields and creating a data migration strategy and plan to achieve the migration.
• Exporting data from the source DBs using each DB's export tool (MySQL and MS Access) and Pentaho PDI (Kettle) processes.
• Importing the source data into staging tables in the PostgreSQL destination DB through the DB import tool and Pentaho PDI (Kettle) processes.
• Creating PL/pgSQL functions in the PostgreSQL destination DB to adapt and import the staging data into the final core tables.
• Working on cleaning and normalizing the imported data as a post-migration process through PL/pgSQL functions created in the PostgreSQL destination DB.
• Documenting the whole data migration process and strategy.

Data Migration Analyst (via R.P.I.)

https://www.qvantel.com/company
I imported customers' data into a non-relational Cassandra DB through Kafka loaders. The information to load was organized into JSON files created by the Pentaho ETL process from the legacy system's staging Oracle DB.
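A minimal sketch of the loading idea with the kafka-python client follows; the broker address, topic name, and record shape are hypothetical, and the actual loaders were Qvantel's Kafka-based tooling rather than this script.

```python
# Illustrative sketch: publish JSON customer documents to a Kafka topic so that
# downstream loaders can persist them into Cassandra. All names are hypothetical.
import json

from kafka import KafkaProducer  # kafka-python package

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",                     # placeholder broker
    value_serializer=lambda doc: json.dumps(doc).encode(),  # serialize dicts as JSON bytes
)

with open("customers.json") as f:
    for line in f:  # one JSON document per line
        producer.send("customer-load", value=json.loads(line))  # hypothetical topic

producer.flush()  # make sure all buffered records are delivered
producer.close()
```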

Telecom Argentina (via Oracle Argentina)

https://www.telecom.com.ar/
I migrated data for 20 million customers into the Siebel CRM database without errors and on the expected date. The migrated information included customer data and related contacts, addresses, and billing/service profiles.

Merck (via Cognizant)

https://www.merckgroup.com/ar-es
I created programs in C# to access (query, insert, and update) data in Siebel CRM Cloud production instances by calling Siebel's exposed web services. The web services were called through proxy objects whose instances were created and managed in the C# programs.

Banco de Chile (via Siebel Argentina)

https://portales.bancochile.cl/
I configured different Siebel CRM user interfaces for the four different bank brands by modifying internal CSS, HTML, and JavaScript files. A different user interface was created for each brand, with different background colors and font types.

Tigo Bolivia (via Qvantel and R.P.I.)

https://www.tigo.com.bo/
So the customer could quickly check the outcome of the data migration, I developed a Python data reconciliation program that generated an automatic Excel report, based on data from the Oracle databases, with total record counts.

This program (a rough sketch follows this list):
• Obtained the information to be migrated from the legacy Oracle DB.
• Obtained the converted and adapted data from the intermediate Oracle DB.
• Generated pre- and post-migration record totals for each migrated entity.
• Created an Excel spreadsheet reporting the total records to be migrated and the totals converted.
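A rough Python sketch of that reconciliation report follows, assuming pandas with cx_Oracle connections; the connection strings, entity list, and table names are hypothetical.

```python
# Illustrative sketch: compare record counts between the legacy and intermediate
# Oracle databases and write the totals to an Excel report. Connection strings,
# entities, and table names are hypothetical.
import cx_Oracle
import pandas as pd

ENTITIES = ["accounts", "contacts", "addresses"]  # hypothetical entity list

legacy = cx_Oracle.connect("etl", "secret", "legacy-db:1521/LEGACY")
intermediate = cx_Oracle.connect("etl", "secret", "mid-db:1521/STAGE")

rows = []
for entity in ENTITIES:
    to_migrate = pd.read_sql(f"SELECT COUNT(*) FROM src_{entity}", legacy).iloc[0, 0]
    converted = pd.read_sql(f"SELECT COUNT(*) FROM stg_{entity}", intermediate).iloc[0, 0]
    rows.append(
        {
            "entity": entity,
            "to_migrate": to_migrate,
            "converted": converted,
            "difference": to_migrate - converted,
        }
    )

report = pd.DataFrame(rows)
report.to_excel("reconciliation_report.xlsx", index=False)  # requires openpyxl
```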

Languages

SQL, Python 3, Python, C++, HTML, CSS, JavaScript

Paradigms

Database Design, ETL

Storage

Oracle PL/SQL, Database Programming, Database Architecture, Database Modeling, SQL Stored Procedures, SQL Functions, Oracle RDBMS, SQL Performance, JSON, PostgreSQL, MySQLdb, Redshift

Other

Data Migration, ETL Tools, Data Migration Testing, Data Cleansing, KNIME Node Development, Data Analysis, Data Modeling, Cloud Applications, Data Build Tool (dbt), Data Engineering, Excel 365, Issue Tracking

Tools

Siebel CRM, Tableau, Microsoft Access, GitHub, Apache Airflow

Platforms

KNIME, Amazon Web Services (AWS), Salesforce SOQL/SOSL

Libraries/APIs

Salesforce API, Salesforce Bulk API

Education

1990 - 1995

Bachelor's Degree in Computer Sciences

Catholic University of Argentina - Buenos Aires, Argentina

Certifications

APRIL 2012 - PRESENT

Siebel 8.1 Certified Consultant

EXO
