Jorge Guillermo Oquendo Barriga, Developer in Arequipa, Peru

Jorge Guillermo Oquendo Barriga

Verified Expert in Engineering

Data Analysis Developer

Location
Arequipa, Peru
Toptal Member Since
June 18, 2020

Jorge is a data enthusiast and engineer with an MSc in information technology management. Over his career, he has worked in a range of data-related roles, as a developer, DBA, analyst, and data engineer, gaining knowledge and experience across technologies such as SQL, MySQL, Oracle, PostgreSQL, MongoDB, R, and Python. He likes to ask how data can best be stored, organized, and exploited.

Availability

Part-time

Preferred Environment

SQL, SQLyog, Visual Studio Code (VS Code), GitHub, MacOS

The most amazing...

...project I've worked on was a data migration from MySQL and MongoDB, merging that data into other live production databases.

Work Experience

Data Engineer

2015 - 2018
LenddoEFL (former Entrepreneurial Finance Lab)
  • Processed and standardized legacy data in a database so it could be used for reports and product development.
  • Migrated production data from MySQL and MongoDB databases to live production databases before decommissioning.
  • Developed scripts to pre-process data in the production environment for consumption by the application.
  • Developed database views, procedures, and queries to generate status and sales reports.
  • Created a dashboard that pulls data from different databases to summarize volume and performance.
  • Created alert threshold triggers based on database data to monitor errors and issues.
  • Created scripts to process external data and adapt it to the system, as needed to develop new products.
  • Worked on a system to import and report performance data.
Technologies: Slack, Jira, GitHub, Microsoft Power BI, R, MongoDB, MySQL, SQL

Commercial Planning Analyst

2012 - 2015
Corporación Lindley S.A.
  • Developed a set of reports to track KPIs daily, monthly, and per sales period.
  • Created spreadsheets and charts to monitor the performance of commercial activities.
  • Assisted managers with up-to-date, accurate data about promotions and campaigns.
  • Estimated sales volume and coordinated production requirements for special campaigns.
Technologies: SAP, Macros, Visual Basic for Applications (VBA), SQL

Database Administrator

2010 - 2011
Servicio Nacional de Capacitación para la Industria de la Construcción
  • Deployed and configured testing and development database environments.
  • Set the database backup policy and configured the processes needed to implement it in production.
  • Defined and implemented the database security controls, audit, and roles.
  • Developed procedures, functions, and views along with the development team.
Technologies: SQL, RMAN, Oracle

Responsible for Information Systems

2008 - 2009
Inmobiliaria Parque de Paz
  • Managed and supported a 24x7 Server and mail system for the company.
  • Developed a commission calculation and payment system, reducing the error rate and complaints.
  • Directed the process of cleaning and digitally correcting contracts, improving the confidentiality of the database.
  • Created SQL reports about performance, sales, commissions, and others that assisted managers and administrators.
Technologies: Microsoft Exchange Server, Microsoft, Java, SQL, Microsoft SQL Server

Co-founder, Project Manager, Analyst and Developer

2003 - 2007
LOGOS Ingenieros de Sistemas
  • Built a commercial and administration system for mid-size enterprises, enabling businesses to automate their processes.
  • Installed and configured MS Server and MS SQL Server in clients' infrastructure so they could work with our products on a continuous basis.
  • Implemented a complete set of reports and queries to assist our clients in exploiting their information.
  • Developed functionality to automate the database backup and restore process, giving clients the autonomy to manage their own systems.
Technologies: Delphi, Microsoft SQL Server

Performance Data Manager

In my role as a data engineer, I was tasked with building a program to facilitate the validation of data quality and the process of importing it into our databases.

The scope of the PERFORMAN project was to automate the review, standardization, formatting, requirement-compliance evaluation, and import of financial data from flat-file sources into databases. The process ended with quality data imported and status reports sent to stakeholders.

The importance of this project lies in how the validated information is used: to evaluate the quality and performance of the company's products.

I'm proud of the capacity and skills of the people who worked on this project across its two phases.

It was developed with Python, with the data imported into MySQL.
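A validate-then-import flow like the one described could be sketched as below. This is a minimal, hypothetical illustration: the field names, validation rules, and `process` helper are invented for the example, not the real PERFORMAN schema.

```python
# Hypothetical sketch of a validate-then-import pipeline.
# Field names and rules are illustrative only.

REQUIRED_FIELDS = {"account_id", "amount", "period"}

def validate_row(row):
    """Return a list of problems found in one raw record."""
    problems = []
    missing = REQUIRED_FIELDS - row.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    try:
        float(row.get("amount", ""))
    except ValueError:
        problems.append("amount is not numeric")
    return problems

def process(rows):
    """Split raw rows into importable records and a rejection report."""
    accepted, rejected = [], []
    for i, row in enumerate(rows):
        problems = validate_row(row)
        if problems:
            rejected.append({"line": i, "problems": problems})
        else:
            accepted.append({**row, "amount": float(row["amount"])})
    return accepted, rejected

rows = [
    {"account_id": "A1", "amount": "10.5", "period": "2017-01"},
    {"account_id": "A2", "amount": "oops", "period": "2017-01"},
]
accepted, rejected = process(rows)
print(len(accepted), len(rejected))  # 1 1
```

In the real system, the accepted records would go to MySQL and the rejection report to stakeholders.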

Volume Information Report and Dashboard

The Finance and Administration department, along with Product Management, required a monthly report showing volume status per partner, product, and channel.

This was an interesting project due to the origin of the information: two MySQL databases in different countries.

I used Power BI to connect to the sources and extract the data; behind the scenes, it calls database views designed to protect and pre-process the information in each database. With Power BI, I aggregated the data and created the final charts and reports.

This project eased the gathering of data for the reports.
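The core aggregation step, combining rows from two regional sources and totaling volume per partner, product, and channel, can be sketched in plain Python. The data, field names, and `aggregate` helper are invented for illustration; the real report was built in Power BI over database views.

```python
# Illustrative sketch: merge rows from two sources and total volume
# per (partner, product, channel), as the Power BI report did.
from collections import defaultdict

def aggregate(*sources):
    totals = defaultdict(float)
    for rows in sources:
        for r in rows:
            key = (r["partner"], r["product"], r["channel"])
            totals[key] += r["volume"]
    return dict(totals)

country_a = [{"partner": "P1", "product": "X", "channel": "web", "volume": 100.0}]
country_b = [{"partner": "P1", "product": "X", "channel": "web", "volume": 50.0}]
print(aggregate(country_a, country_b))  # {('P1', 'X', 'web'): 150.0}
```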

Database Migration

Eventually, we needed to shut down one of our databases in a remote data center; this database served some clients under that country's particular regulations.

Part of the data belonged to the client and could not be moved, but most of it was generated by us. My assignment was to import our data into our live production databases. The data was stored in MySQL and MongoDB.

After analyzing and understanding the integrity and relationships of the entities, I exported the data and created scripts to import it, taking the links and special keys into account. I planned a schedule to import the data without stopping the services, then executed it.

All of it was done with SQL and the MongoDB Aggregation Framework.
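The shape of such a MongoDB export step can be sketched as an aggregation pipeline. The collection, field names, and ownership filter below are hypothetical, shown here only to illustrate the kind of match/project/out pipeline the Aggregation Framework supports.

```python
# Hedged sketch: an aggregation pipeline that keeps only our own
# records and reshapes keys into a staging collection for import.
# Collection and field names are invented.
export_pipeline = [
    {"$match": {"owner": "us"}},           # exclude client-owned data
    {"$project": {"_id": 0,                # drop the old ObjectId
                  "legacy_id": "$_id",     # but keep it as a reference key
                  "client": 1,
                  "payload": 1}},
    {"$out": "export_staging"},            # write to a staging collection
]
# Against a live server this would run as:
#   db.records.aggregate(export_pipeline)
print([next(iter(stage)) for stage in export_pipeline])  # ['$match', '$project', '$out']
```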

Processing Non-system Data

One of our most important clients wanted us to create a customized product; to do that, we received a significant amount of historical data. The data was not in the format or the usual standard we work with, and it was also encoded.

We needed to find the relationships between the different sources of that data and make our system logic work with it.

I developed a set of scripts in R to process the data phase by phase, allowing us to implement our business logic.
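Phase-by-phase processing of coded external data can be sketched as a chain of small transforms. The sketch below is in Python for illustration (the real scripts were written in R), and the phases, code table, and field names are invented.

```python
# Minimal sketch of phased processing: each phase takes the output
# of the previous one. The code table and fields are hypothetical.
CODE_TABLE = {"01": "approved", "02": "rejected"}

def decode(rows):
    """Phase 1: translate coded values into our standard vocabulary."""
    return [{**r, "status": CODE_TABLE.get(r["status"], "unknown")} for r in rows]

def standardize(rows):
    """Phase 2: normalize formats so business logic can run on them."""
    return [{**r, "client": r["client"].strip().upper()} for r in rows]

raw = [{"client": " acme ", "status": "01"}]
print(standardize(decode(raw)))  # [{'client': 'ACME', 'status': 'approved'}]
```

Keeping each phase independent makes it easy to rerun or correct a single step without reprocessing everything.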

In the end, we processed more than two years of data on time.

Commissions Program

We had a sales staff whose commissions were paid weekly. This manual process involved different parameters and performance rules to calculate the correct amounts, and these parameters could change frequently.

We had to analyze, design, and develop a program to perform these calculations. Among the requirements, the software had to allow configuring its parameters, making corrections when needed, adding new rules, and keeping a record of payments.

It ensured the integrity of the process, kept records for audit, and substantially accelerated calculation and payment.

It was developed with Java, SQL, and MS SQL Server.
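The idea of commission rules driven by configurable parameters can be sketched as follows. The sketch is in Python (the original system used Java and SQL Server), and the rates, threshold, and `commission` function are invented for illustration.

```python
# Minimal sketch: commission rules as configurable parameters,
# so rates can change without changing the code. Values are invented.
def commission(sales_amount, params):
    """Apply a base rate plus a bonus rate above a configurable threshold."""
    base = sales_amount * params["base_rate"]
    extra = max(0.0, sales_amount - params["bonus_threshold"]) * params["bonus_rate"]
    return round(base + extra, 2)

params = {"base_rate": 0.05, "bonus_threshold": 10_000.0, "bonus_rate": 0.02}
print(commission(12_000.0, params))  # 640.0
```

Storing `params` in a table rather than in code is what let the real system absorb frequent rule changes.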

Languages

SQL DDL, Data Manipulation Language (DML), SQL, R, Visual Basic for Applications (VBA), Delphi, Python 3, Java

Storage

Oracle 10g, MySQL, Database Modeling, Microsoft SQL Server, Microsoft Exchange Server, PostgreSQL, MongoDB, Oracle SQL Developer

Other

Data Analysis, ERM, Data Engineering, Data Cleansing, Macros, SAP, Certified ScrumMaster (CSM), PMI, Big Data, Unstructured Data Analysis, Data Warehouse Design, Google BigQuery

Tools

SQLyog, RMAN, Microsoft Power BI, Jira, Slack, Spyder, GitHub, Google Cloud Dataproc, MySQL Workbench

Paradigms

Business Intelligence (BI), Object-oriented Programming (OOP)

Libraries/APIs

NumPy, Pandas

Platforms

MacOS, Oracle, Microsoft, Linux, Google Cloud Platform (GCP), Jupyter Notebook, RStudio, Visual Studio Code (VS Code)

2014 - 2015

Diploma in Project Management

Instituto para la Calidad PUCP - Peru

2009 - 2011

Master of Science Degree in Management of Information Technology

La Salle BCN - Spain

2009 - 2011

Master of Science Degree in Information Technology Management

ESAN University - Peru

1992 - 1997

Bachelor of Engineering Degree in Systems Engineering

Universidad Católica Santa María - Peru

SEPTEMBER 2019 - PRESENT

Data Engineering, Big Data, and Machine Learning on GCP

Coursera

AUGUST 2019 - PRESENT

Python for Data Science

EDX

JUNE 2019 - PRESENT

Mathematics for Machine Learning

Coursera

APRIL 2019 - APRIL 2021

Certified Scrum Master

Scrum Alliance

NOVEMBER 2018 - PRESENT

Business Intelligence Concepts, Tools and Applications

Coursera

NOVEMBER 2018 - PRESENT

Big Data Analysis: Hive, Spark SQL, DataFrames and GraphFrames

Coursera

OCTOBER 2018 - PRESENT

Relational Database Support for Data Warehouses

Coursera

OCTOBER 2018 - PRESENT

Big Data Essentials: HDFS, MapReduce and Spark RDD

Coursera

AUGUST 2018 - PRESENT

M121: The MongoDB Aggregation Framework

MongoDB

JUNE 2018 - PRESENT

M101P: MongoDB for Developers

MongoDB

OCTOBER 2016 - PRESENT

The R Programming Environment

Coursera

JANUARY 2016 - PRESENT

Introduction to Big Data

Coursera

NOVEMBER 2015 - PRESENT

Reproducible Research

Coursera

OCTOBER 2015 - PRESENT

Exploratory Data Analysis

Coursera

AUGUST 2015 - PRESENT

R Programming

Coursera

AUGUST 2015 - PRESENT

Getting and Cleaning Data

Coursera

JUNE 2015 - PRESENT

The Data Scientist's Toolbox

Coursera

FEBRUARY 2015 - PRESENT

Tackling the Challenges of Big Data

EDX

FEBRUARY 2015 - PRESENT

Foundations of Data Analysis

EDX

OCTOBER 2014 - PRESENT

Explore Statistics with R

EDX
