Mikio Oba, Developer in Lugano, Switzerland

Mikio Oba

Verified Expert in Engineering

Data Engineer and Software Developer

Location
Lugano, Switzerland
Toptal Member Since
January 11, 2022

Mikio is an independent data engineer who enjoys solving clients' problems with an analytical, rational approach grounded in 20+ years of industry experience. A lifelong learner, he continuously pursues knowledge for personal and professional growth and combines his passion for new technologies with strong software engineering skills to keep delivering value to clients.

Portfolio

PepsiCo Global - Main
Python, SQL, Data Engineering, Apache Airflow, Snowflake...
UBS
Python, PySpark, Linux, Shell Scripting, Ansible, Jira, GitLab CI/CD, Pytest...
UBS
SQL, Linux, Shell Scripting, Jira, Data Engineering, Data Warehousing, ETL...

Experience

Availability

Part-time

Preferred Environment

Linux, Python, SQL, Amazon Web Services (AWS), Python 3

The most amazing...

...feedback I've received from my coworkers is that my work was impeccable as always.

Work Experience

Data Engineer

2022 - 2023
PepsiCo Global - Main
  • Architected the refactoring of several complex data ingestion pipelines using Airflow, dbt, and Snowflake, improving efficiency and maintainability for a more robust and scalable data infrastructure.
  • Collaborated cross-functionally with a team of data scientists to develop and deploy data pipelines for a new market segment, strengthening the data-driven decision-making framework for strategic market positioning.
  • Initiated and executed the integration of data observability features into the Airflow data pipelines using Monte Carlo, improving the reliability and accuracy of data processing operations.
Technologies: Python, SQL, Data Engineering, Apache Airflow, Snowflake, Amazon Web Services (AWS), Python 3, Data Build Tool (dbt), Monte Carlo, Data Pipelines, ETL, ELT, GitHub, Amazon S3 (AWS S3), AWS Lambda, Git, Unit Testing, Pandas, Agile, System Testing, Databases, Relational Databases, Continuous Delivery (CD), Pytest, Jira, Linux
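The Airflow/dbt/Snowflake work above follows a load-then-transform (ELT) pattern: land raw data in a staging table, then build analytics models with SQL. A minimal, self-contained sketch of that pattern, using the standard library's sqlite3 in place of Snowflake and plain functions in place of Airflow tasks (all table and function names are hypothetical):

```python
import sqlite3

def extract():
    # Raw records as they might arrive from a source system.
    return [("2023-01-01", "EU", 120.0), ("2023-01-01", "US", 310.0),
            ("2023-01-02", "EU", 95.5)]

def load(conn, rows):
    # Land raw data unchanged in a staging table (the "L" in ELT).
    conn.execute("CREATE TABLE stg_sales (day TEXT, region TEXT, amount REAL)")
    conn.executemany("INSERT INTO stg_sales VALUES (?, ?, ?)", rows)

def transform(conn):
    # dbt-style SQL transform: derive an analytics model from staging.
    conn.execute("""
        CREATE TABLE sales_by_region AS
        SELECT region, SUM(amount) AS total
        FROM stg_sales GROUP BY region ORDER BY region
    """)

conn = sqlite3.connect(":memory:")
load(conn, extract())
transform(conn)
result = conn.execute("SELECT region, total FROM sales_by_region").fetchall()
# result == [("EU", 215.5), ("US", 310.0)]
```

In a real deployment, each function would be an Airflow task and the transform step a dbt model, but the staging-then-model structure is the same.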

Python Software Engineer

2020 - 2022
UBS
  • Developed and maintained Apache Spark and Python (PySpark) applications that process Avro messages between operational and analytical systems used by data scientists.
  • Single-handedly built an integration test framework in Python for Spark applications.
  • Managed the quality assurance process according to the company's software development lifecycle as a test manager.
  • Resolved various production incidents through analytical, methodical troubleshooting.
  • Created and shared technical tutorials on the basics and advanced usage of Pytest.
Technologies: Python, PySpark, Linux, Shell Scripting, Ansible, Jira, GitLab CI/CD, Pytest, Apache Kafka, Data Engineering, Pandas, Agile, Big Data, Git, Data Pipelines, Unit Testing, System Testing, Databases, ETL, Relational Databases, Continuous Delivery (CD), DevOps, SQL, Python 3
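The integration test framework mentioned above can be illustrated with the common given/when/then pattern for data transformations: feed a fixed input, run the transformation, and compare full output rows. A stdlib-only sketch (the `enrich` transformation and its fields are hypothetical; the real framework targeted PySpark jobs processing Avro messages):

```python
def enrich(records):
    # Transformation under test: tag each record with a risk band,
    # standing in for a Spark job between operational and analytical systems.
    return [dict(r, band="high" if r["amount"] >= 1000 else "low")
            for r in records]

def run_case(given, expected):
    # Given/when/then: fixed input in, full output rows compared.
    actual = enrich(given)
    assert actual == expected, f"mismatch: {actual!r} != {expected!r}"

run_case(
    given=[{"id": 1, "amount": 1500.0}, {"id": 2, "amount": 40.0}],
    expected=[{"id": 1, "amount": 1500.0, "band": "high"},
              {"id": 2, "amount": 40.0, "band": "low"}],
)
```

With PySpark, `run_case` would collect the output DataFrame and compare sorted rows, but the test shape is identical.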

SQL Data Warehouse Developer

2019 - 2021
UBS
  • Developed SQL-based ETL applications for compliance, security, and marketing purposes in the company’s petabyte-sized data warehouse.
  • Contributed to the project to build a shareholder reporting system as a primary software developer.
  • Resolved various production incidents through analytical, methodical troubleshooting.
Technologies: SQL, Linux, Shell Scripting, Jira, Data Engineering, Data Warehousing, ETL, Databases, Agile, ETL Development, Relational Databases, Git, Data Pipelines, Unit Testing, Continuous Delivery (CD), DevOps, Python 3

Data Warehouse Engineer

2013 - 2018
John Lewis Partnership
  • Built the company’s extensive customer data warehouse system from scratch.
  • Optimized various analytical queries and batch jobs to reduce the execution time by an order of magnitude.
  • Provided technical support to users and developers regarding Db2 issues.
  • Developed a few dozen SQL procedures for ETL applications and database administration.
  • Implemented disaster recovery plans for the data warehouse system.
Technologies: SQL, Linux, Shell Scripting, SQL Stored Procedures, Data Engineering, Data Warehousing, ETL, Databases, ETL Development, Python, Git, Unit Testing, Relational Databases, Continuous Delivery (CD)
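Order-of-magnitude query speedups like those above typically come from indexing and changing the access path from a full table scan to an index search. A small illustration using the standard library's sqlite3 (Db2 was the actual warehouse; table and index names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(i % 1000, float(i)) for i in range(10_000)])

def plan(sql):
    # Detail column of the first query-plan row, e.g. "SCAN orders"
    # or "SEARCH orders USING INDEX ...".
    return conn.execute("EXPLAIN QUERY PLAN " + sql).fetchone()[3]

query = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"
before = plan(query)  # full table scan: every row examined
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(query)   # index search: only matching rows touched
```

Inspecting the plan before and after, as here, is the same workflow used to tune warehouse queries, just with the database's own explain tooling.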
Education

2005 - 2006

Master's Degree in Information Systems

The University of Sheffield - Sheffield, United Kingdom

1997 - 1999

Master's Degree in Aeronautics and Astronautics

The University of Tokyo - Tokyo, Japan

1995 - 1997

Bachelor's Degree in Aeronautics and Astronautics

The University of Tokyo - Tokyo, Japan

Certifications

APRIL 2022 - APRIL 2024

SnowPro Core Certification

Snowflake

APRIL 2022 - APRIL 2024

Certified Terraform Associate

HashiCorp

MARCH 2022 - MARCH 2025

AWS Certified Developer – Associate

Amazon Web Services

FEBRUARY 2022 - FEBRUARY 2025

AWS Certified Solutions Architect – Associate

Amazon Web Services Training and Certification

JANUARY 2022 - PRESENT

Data Structures and Algorithms Nanodegree

Udacity

Skills

Libraries/APIs

PySpark, Pandas, REST APIs

Tools

Apache Airflow, Terraform, Git, Ansible, Jira, GitLab CI/CD, Pytest, Amazon Athena, AWS Glue, AWS Step Functions, GitHub

Storage

SQL Stored Procedures, Relational Databases, Databases, PostgreSQL, Data Pipelines, MySQL, Amazon S3 (AWS S3)

Languages

SQL, Python, Snowflake, JavaScript, Python 3

Frameworks

Spark, Hadoop

Platforms

Linux, Apache Kafka, Amazon Web Services (AWS), AWS Lambda

Paradigms

Agile Software Development, Continuous Delivery (CD), DevOps, ITIL, ETL, Unit Testing, Agile, UX Design

Other

Data Warehousing, Data Engineering, ETL Development, Data Build Tool (dbt), Management Information Systems (MIS), Programming Languages, Shell Scripting, Big Data, Machine Learning, Data Structures, Algorithms, Exploratory Data Analysis, Data Analysis, AWS Cloud Architecture, AWS DevOps, DevOps Engineer, Cloud, Infrastructure as Code (IaC), Data Cleansing, Data Virtualization, System Testing, Amazon RDS, Technical Writing, Tutorials, Translation, Website Translation, Monte Carlo, ELT
