
Pablo Díaz Ogni

Verified Expert  in Engineering

Back-end Developer

Buenos Aires, Argentina

Toptal member since August 1, 2018

Bio

Pablo is a Python developer with more than a decade of experience in programming. He's been involved in software since high school, where he learned to code and think like a programmer. During his early career, he began developing with Python and Linux, and now, he considers them his strongest skills. Pablo has recently worked on projects involving APIs, stream-processing platforms, real-time bidding systems, scrapers, and more.

Portfolio

NewCars.com
Python, Amazon EC2, SaltStack, Pyramid, Pytest, APIs, PostgreSQL, DevOps...
Freedom Robotics
Python, Linux, REST APIs, AWS Lambda, Amazon Web Services (AWS)...
Sqreen
V8, PHP, Docker, Python, Test-driven Development (TDD), Bash, Linux, Git...

Experience

  • Back-end - 13 years
  • Test-driven Development (TDD) - 13 years
  • Python - 13 years
  • Bash - 13 years
  • Vim Text Editor - 12 years
  • Unit Testing - 12 years
  • Pytest - 6 years
  • Docker - 5 years

Availability

Part-time

Preferred Environment

Chrome, Vim Text Editor, Bash, Ubuntu, Linux, Python, Test-driven Development (TDD), Git

The most amazing...

...project I've done was developing and implementing Jampp's data platform with Python and AWS services.

Work Experience

Senior Back-end Developer

2023 - PRESENT
NewCars.com
  • Maintained systems deployed through SaltStack, EC2, Terragrunt, and Terraform.
  • Contributed to different Python systems using the Pyramid framework on the NewCars.com site, which collects data to generate car buyer leads.
  • Refactored ad hoc recurrent tasks into tasks handled by the Airflow system.
Technologies: Python, Amazon EC2, SaltStack, Pyramid, Pytest, APIs, PostgreSQL, DevOps, AWS DevOps, Vagrant, Apache Airflow, Back-end, REST APIs, Object-relational Mapping (ORM), SQL

Python Back-end Developer

2019 - 2023
Freedom Robotics
  • Contributed to a Python server application that connected data from robots to our API using ROS. The application also handled commands clients sent through WebRTC to manipulate the device.
  • Maintained and improved the Python API hosted on AWS Lambda, implemented the Chalice framework and CloudFormation, and migrated the initial data warehouse from DynamoDB and S3 to Apache Kafka and Cassandra.
  • Created a statistics system that ran every minute to collect data from devices, apply custom rules defined by clients, and compute statistics such as downtime, missions accomplished, and operating time.
Technologies: Python, Linux, REST APIs, AWS Lambda, Amazon Web Services (AWS), Amazon DynamoDB, Apache Kafka, Git, Chalice, AWS CloudFormation, Pytest, Redis, PostgreSQL, Apache Cassandra, Elasticsearch, Robot Operating System (ROS), Amazon Simple Queue Service (SQS), CircleCI, Back-end, Object-relational Mapping (ORM), SQL, Lambda Functions
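The per-minute statistics job described above can be sketched as a gap-based downtime calculation. This is purely illustrative; the function name, the heartbeat model, and the `max_gap` rule are assumptions, not Freedom Robotics' actual code:

```python
from datetime import datetime, timedelta

def compute_downtime(heartbeats, window_start, window_end,
                     max_gap=timedelta(seconds=60)):
    """Sum the time within [window_start, window_end] during which the gap
    between consecutive heartbeats exceeds max_gap (device offline)."""
    downtime = timedelta()
    previous = window_start
    for ts in sorted(heartbeats):
        gap = ts - previous
        if gap > max_gap:
            downtime += gap - max_gap
        previous = ts
    # Account for silence between the last heartbeat and the window end.
    tail = window_end - previous
    if tail > max_gap:
        downtime += tail - max_gap
    return downtime
```

A client rule would then set `max_gap` per device type and the job would store the result per minute.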

Python Developer (via Toptal)

2018 - 2018
Sqreen
  • Analyzed and optimized a Python daemon that intercepts requests from a web server and applies rules to decide whether each request should be blocked.
Technologies: V8, PHP, Docker, Python, Test-driven Development (TDD), Bash, Linux, Git, Back-end
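A minimal sketch of that rule-matching idea, assuming a request is a dict and rules are simple named predicates; the rules below are illustrative only, not Sqreen's actual engine:

```python
import re

# Each rule is a (name, predicate) pair; a request is blocked when any
# predicate matches. These example rules are hypothetical.
RULES = [
    ("sql_injection",
     lambda req: re.search(r"(?i)\bunion\s+select\b", req.get("query", "")) is not None),
    ("path_traversal",
     lambda req: "../" in req.get("path", "")),
]

def should_block(request, rules=RULES):
    """Return the name of the first matching rule, or None to let the request pass."""
    for name, predicate in rules:
        if predicate(request):
            return name
    return None
```

Keeping the predicates cheap matters here, since the daemon sits on the request hot path.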

Python Developer

2014 - 2018
Jampp
  • Developed a high-performance, real-time bidding system that needed to respond in less than 100 milliseconds; used Python, Tornado, Cython (for performance improvements), PostgreSQL, and ZeroMQ.
  • Built a fraud detection reporting system that performed daily queries and sent emails to campaign managers about possible fraudulent activities.
  • Developed the API used for generating multiple reports.
  • Fully developed a message-streaming library used to send messages from the bidder system to many others, such as a machine learning predictor and a stream-processing platform; used Python and ZeroMQ.
  • Developed and implemented the current stream-processing platform, which reads messages from the bidder and sends them to Kinesis; the messages are stored raw in S3 and also aggregated into PostgreSQL.
Technologies: Memcached, Redis, MySQL, PostgreSQL, AWS Lambda, Amazon Kinesis, Cython, Tornado, ZeroMQ, Amazon Web Services (AWS), Python, Test-driven Development (TDD), Bash, Linux, Git, Jenkins, Scrum, SQLAlchemy, Back-end, REST APIs, Object-relational Mapping (ORM), SQL, Lambda Functions

Developer

2011 - 2014
OLX
  • Created automation tests with Groovy and Selenium that ran on a custom system built in Java.
  • Developed a model layer API (written in Python with Tornado) that was created to migrate the main web page database access to service calls; also tested the API using Nosetests and Lettuce.
  • Contributed to the development of the main page that was written in PHP and to multiple custom frameworks; implemented unit testing with PHPUnit.
  • Created and reused multiple tools for migrating data when the product team wanted to make substantial changes; the tools created new category trees and recategorized all the items.
  • Started working in an agile environment (Scrum) with daily stand-ups, planning, and retrospectives, all led by a Scrum master.
Technologies: MySQL, Java, PHP, Tornado, Python, Test-driven Development (TDD), Bash, Linux, Git, Jenkins, Scrum, Back-end, REST APIs, SQL

Developer

2010 - 2011
AMAFRA Sistemas
  • Created, maintained, and added features to a large system with multiple modules used by clients across various fields: textiles and paint, markets, recycling plants, sports, libraries, chemical supplies, ironwork, etc.
  • Provided client support via telephone and on-site.
  • Created a different system for racquetball match management.
Technologies: Visual Basic 6 (VB6)

Experience

Real-time Bidding System

https://jampp.com/
During my first years with Jampp, I developed a real-time bidding application that needed to respond to requests in under 100 milliseconds.

Jampp's main business involves buying programmatic ads for clients who are trying to reach a receptive audience that will then install and use the app.

This system receives requests from ad exchanges such as MoPub or Google AdX. For each request, it has to quickly decide whether to bid and which of its clients' ads to use. It is written in Python with Tornado and optimized with Cython. Every database access must be cached, so the system doesn't bid on incoming requests until all the data it needs is in the cache.

My tasks involved adding features, profiling the system to find easy performance wins, writing tests, deploying, monitoring, and handling incidents.
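The cache-warm gating described above can be sketched roughly like this. Class and field names are invented for illustration; the real system runs on Tornado with Cython optimizations, which are not shown here:

```python
import time

class BidCache:
    """In-memory cache of campaign data. The bidder never touches the
    database on the hot path and refuses to bid until the cache is warm."""
    def __init__(self, required_keys):
        self.required_keys = set(required_keys)
        self.data = {}

    def warm(self):
        # Warm only once every key the bid path needs has been loaded.
        return self.required_keys <= self.data.keys()

def decide_bid(request, cache, budget_ms=100):
    """Return a bid dict, or None to pass on the request."""
    start = time.monotonic()
    if not cache.warm():
        return None  # no bidding until all needed data is cached
    campaign = cache.data.get(request["app_category"])
    elapsed_ms = (time.monotonic() - start) * 1000
    if campaign is None or elapsed_ms > budget_ms:
        return None  # never exceed the exchange's latency budget
    return {"campaign": campaign, "price": request["floor"] * 1.1}
```

The key property is that `decide_bid` only ever reads memory, so the latency budget is spent on decision logic rather than I/O.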

Stream-processing Library

This library was created to send messages from the bidders' instances to another system, which creates predictive models using machine learning to optimize bidding.

The library consists of three parts:
• Proxy/Main: This is the connection point for the publisher and the subscribers. It sends the client-requested filters to the publishers and forwards messages from all the publishers to all the connected subscribers.
• Publishers: A publisher is installed on each system that generates messages. It applies the filters and sends matching messages through the proxy to the correct clients.
• Clients: A client subscribes with specific filters and receives the matching messages.

The library is intended for stream-processing platforms. Because it uses a pub/sub messaging model, messages can occasionally be missed (in practice, this doesn't happen), but it is designed for high traffic volumes, where occasional misses make no real difference.
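A toy, in-process sketch of the three parts above; the real library speaks ZeroMQ between processes, so `Proxy`, `Publisher`, and the prefix-based filters here are simplifications, not the actual API:

```python
class Proxy:
    """Connection point: holds subscriber filters and fans published
    messages out to every subscriber whose filter matches."""
    def __init__(self):
        self.subscribers = []  # (topic_prefix, callback) pairs

    def subscribe(self, topic_prefix, callback):
        self.subscribers.append((topic_prefix, callback))

    def publish(self, topic, message):
        # Forward the message to every matching subscriber.
        for prefix, callback in self.subscribers:
            if topic.startswith(prefix):
                callback(topic, message)

class Publisher:
    """Installed on the message-generating system (e.g., the bidder)."""
    def __init__(self, proxy):
        self.proxy = proxy

    def send(self, topic, message):
        # In the real library this crosses process boundaries over ZeroMQ;
        # here it is a direct call for illustration.
        self.proxy.publish(topic, message)
```

Prefix filtering mirrors how ZeroMQ SUB sockets subscribe by topic prefix, which is why the filter lives at the edge rather than in the consumer.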

Jampp's Data Architecture

https://aws.amazon.com/solutions/case-studies/jampp/
Designed, developed, and implemented the new scalable data architecture that is used by all of Jampp's applications.

We migrated from PostgreSQL and MySQL databases to a new architecture that can be scaled up or down depending on business needs. It was implemented using Python and ZeroMQ to receive messages and forward them to Amazon Kinesis. Using AWS Lambda, Redis, and DynamoDB, related messages were joined into transactions and stored in Amazon S3; another consumer sent the raw data to S3 to be queried later in an Amazon EMR cluster using Presto.

The second part of the project was the aggregation, reading directly from the Kinesis streams instead of from PostgreSQL's relational tables. All of the custom applications we built in the middle of the chain were written in Python and (almost) fully unit tested.
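The aggregation step can be sketched as a simple fold over raw events into hourly per-campaign rollups. Field names and the hour-bucket scheme are assumptions for illustration; the real pipeline read from Kinesis and wrote the aggregates to PostgreSQL:

```python
from collections import defaultdict

def aggregate(events):
    """Fold a stream of raw bid events into per-campaign hourly rollups,
    the kind of rows previously computed from relational tables."""
    rollup = defaultdict(lambda: {"bids": 0, "spend": 0.0})
    for event in events:
        # "2024-01-01T10:05:00"[:13] -> "2024-01-01T10" (hour bucket)
        key = (event["campaign"], event["timestamp"][:13])
        rollup[key]["bids"] += 1
        rollup[key]["spend"] += event["price"]
    return dict(rollup)
```

Consuming the stream directly means the aggregates stay current without re-querying the raw store, at the cost of making the aggregation code itself responsible for correctness, hence the emphasis on unit testing.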

Apartment Administration System

Created an administration system for a resort with around 40 apartment units in Las Gaviotas, Buenos Aires.

The system manages the rental dates, calculates the fees for the agency, and generates reports. I used Java, Swing, and MySQL and created a small PHP-based login site for apartment owners.

Education

2004 - 2009

Technical High School Diploma in Computer Techniques

ETN3 | María Sánchez de Thompson - Buenos Aires, Argentina

Certifications

DECEMBER 2011 - PRESENT

Scrum Certification

Kleer

Skills

Libraries/APIs

SQLAlchemy, ZeroMQ, REST APIs

Tools

Vim Text Editor, Pytest, Jenkins, Git, V8, Lettuce, AWS CloudFormation, Amazon Simple Queue Service (SQS), CircleCI, SaltStack, Vagrant, Apache Airflow

Languages

Python, Bash, SQL, Visual Basic 6 (VB6), PHP, Java, C

Paradigms

Unit Testing, Test-driven Development (TDD), Scrum, Object-relational Mapping (ORM), DevOps

Platforms

AWS Lambda, Amazon Web Services (AWS), Docker, Linux, Amazon EC2, Ubuntu, Apache Kafka

Frameworks

Chalice, Chrome, Nose, Swing, Pyramid

Storage

Redis, Amazon DynamoDB, PostgreSQL, MySQL, Amazon S3 (AWS S3), Memcached, Elasticsearch

Other

Back-end, Robot Operating System (ROS), Tornado, Amazon Kinesis, Lambda Functions, Cython, Programming, Algorithms, Apache Cassandra, APIs, AWS DevOps
