Rachel Park, Developer in Los Angeles, CA, United States

Rachel Park

Verified Expert in Engineering

Bio

Rachel is a big data professional experienced in various domains, including robotics, biotech R&D, entertainment/media, and healthcare. With 10+ years of experience in data mining and machine learning technologies, she's a proactive leader with strengths in communication and collaboration. Drawing on her expertise in the ML ecosystem, both on-prem and cloud-based, Rachel champions automated data solutions and manages concurrent objectives to drive efficiency and deliver positive outcomes.

Portfolio

Elevance Health
Python, SQL, ETL, PySpark, Data Analytics, Data Modeling, Data Profiling, Spark...
Culmen International
Technical Leadership, JavaScript, Python, Visualization Tools, Technical Hiring...
Hart Inc.
Apache Airflow, Scala, Python, Spark, Hadoop, PySpark, Azure, Authentication...

Experience

Availability

Part-time

Preferred Environment

Anaconda, Visual Studio Code (VS Code), Vector Databases, APIs, Amazon Web Services (AWS), Google Cloud Platform (GCP), Apache Airflow, Spark ML, Kubernetes

The most amazing...

...experience I've had at Toptal was when I led cross-disciplinary teams to deploy predictive ML models for Navy-funded projects, securing extended funding.

Work Experience

Senior Machine Learning Engineer | Team Lead

2020 - PRESENT
Elevance Health
  • Led an ML engineering component team overseeing the entire product cycle, from designing and engineering solutions to configuring and deploying services, while supporting software in cloud (AWS, GCP) and on-prem environments.
  • Owned GPU-based computing APIs that support data scientists' model development. Dockerized multiple code bases and built end-to-end production pipelines in Airflow (see the sketch after this entry).
  • Provided mentorship and technical guidance to team members, fostering a collaborative and innovative work environment (code reviews, literature reviews, coding best practices, design/architecture solutions, etc.).
  • Implemented automated releases in GitLab CI/CD for testing and deployment. Developed Kubernetes applications. Collaborated with stakeholders to define project goals, requirements, and timelines, ensuring alignment with business objectives.
Technologies: Python, SQL, ETL, PySpark, Data Analytics, Data Modeling, Data Profiling, Spark, Machine Learning, Deep Learning, Kubernetes, Architecture, Technical Leadership, Python 3, Back-end, Flask, Full-stack, Chatbots, OpenAI, Software Architecture, PostgreSQL, AWS Cloud Architecture, Pandas, Data Science
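
As an illustration of the Airflow-based end-to-end pipelines mentioned above, here is a minimal, hypothetical DAG that chains Dockerized extract, train, and deploy steps. The DAG id, image names, commands, and schedule are placeholders rather than details of the actual Elevance project; it assumes Airflow 2.4+ with the Docker provider installed.

```python
# Hypothetical sketch of an Airflow DAG wiring Dockerized steps into an
# end-to-end pipeline; task names, images, and schedule are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.providers.docker.operators.docker import DockerOperator

with DAG(
    dag_id="ml_training_pipeline",      # hypothetical DAG name
    start_date=datetime(2023, 1, 1),
    schedule="@daily",                  # Airflow 2.4+ scheduling argument
    catchup=False,
) as dag:
    extract = DockerOperator(
        task_id="extract_features",
        image="registry.example.com/feature-extractor:latest",  # placeholder image
        command="python extract.py --date {{ ds }}",
    )
    train = DockerOperator(
        task_id="train_model",
        image="registry.example.com/trainer-gpu:latest",        # placeholder image
        command="python train.py --date {{ ds }}",
    )
    deploy = DockerOperator(
        task_id="deploy_model",
        image="registry.example.com/deployer:latest",           # placeholder image
        command="python deploy.py --date {{ ds }}",
    )

    # Linear dependency: feature extraction -> training -> deployment.
    extract >> train >> deploy
```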

Tech Lead

2021 - 2022
Culmen International
  • Collaborated closely with government data scientists and domain experts to understand project objectives and requirements, facilitating seamless communication between technical and non-technical stakeholders.
  • Spearheaded prototyping of a predictive model for estimating the longevity of CAD/PAD devices, leveraging machine learning techniques and domain-specific knowledge (an illustrative modeling sketch follows this entry).
  • Crafted a government proposal outlining a comprehensive architecture for deploying a machine learning (ML) application in the cloud.
Technologies: Technical Leadership, JavaScript, Python, Visualization Tools, Technical Hiring, Machine Learning, Kubernetes, Architecture, Python 3, Back-end, Flask, Full-stack, Software Architecture, AWS Cloud Architecture, Pandas, Data Science
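
The profile does not detail the modeling approach used for the CAD/PAD longevity prototype, so the sketch below is a purely illustrative stand-in: it assumes a tabular service-history dataset and a scikit-learn regressor. The file path, feature columns, and model choice are hypothetical.

```python
# Purely illustrative device-longevity regression; the data file, feature
# columns, and model choice are hypothetical stand-ins, not the approach
# actually used on the CAD/PAD project.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Hypothetical service-history table: one row per device with storage
# conditions, age, and the observed remaining service life in years.
records = pd.read_csv("cad_pad_service_history.csv")  # placeholder path

X = records[["storage_temp_c", "humidity_pct", "age_years"]]  # hypothetical features
y = records["remaining_life_years"]                           # hypothetical target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
print("Holdout MAE (years):", mean_absolute_error(y_test, model.predict(X_test)))
```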

Data Engineer

2019 - 2020
Hart Inc.
  • Designed and developed an ML-based health data search engine via automated schema prediction.
  • Modified existing databases to meet unique needs and goals determined during the initial evaluation and planning process.
  • Wrote scripts and processes for data integration and bug fixes in Python, Scala, and Java.
  • Planned, engineered, configured, and deployed ML tooling and big data solutions while supporting software in a Hadoop-Spark ecosystem.
Technologies: Apache Airflow, Scala, Python, Spark, Hadoop, PySpark, Azure, Authentication, Scraping, Web Scraping, MySQL, Data Scraping, Machine Learning, Python 3, Back-end, Flask, Architecture, Software Architecture, PostgreSQL, AWS Cloud Architecture, Pandas, Data Science

TechOps Engineer

2018 - 2019
Telescope Inc.
  • Built business logic for voting applications, directly supporting the world's largest live shows, such as The Voice, American Idol, and Dancing with the Stars.
  • Advised on and provided versatile big data solutions using AWS (DynamoDB, S3, EC2, etc.) to meet clients' needs.
  • Wrote unit tests in Python to automate product validation.
  • Researched and developed the integration of smart home devices to the current platform, provided prototypes as a proof of concept to executives/clients, and improved profit margins by launching new add-on projects for existing clients.
Technologies: Amazon Web Services (AWS), Apache Kafka, Flume, RESTful Development, REST APIs, XML, SQL, JavaScript, Spark, Python, Vector Databases, Authentication, APIs, Scraping, Web Scraping, MySQL, Data Scraping, Selenium, Python 3, Back-end, Flask, PostgreSQL, AWS Cloud Architecture, Pandas

Development Engineer/Engineering Consultant

2016 - 2018
UCLA, Various Startups (Vortex Biosciences Inc., Ferrologix Inc., etc.)
  • Wrote code to automate statistical analysis on vision data (live/recorded microscopic images/videos) using data science and computer vision tools in Python, MATLAB, and R.
  • Remotely trained employees on using the aforementioned code.
  • Addressed R&D issues from a data mining perspective while developing microfluidics platforms for medical applications.
  • Authored publications in peer-reviewed scientific journals.
  • Consulted for start-up companies and assisted the director with project management.
Technologies: Amazon Web Services (AWS), R, MATLAB, Linux, Tableau, SQL, Python, Robotics, Python 3, Computer Vision, Image Processing, Image Generation

Robotics Researcher

2014 - 2016
UCLA
  • Developed algorithms to be tested on custom humanoid platforms using Simulink, Python, C++, Lua, ROS, LabView, and COMSOL.
  • Maintained robot platforms using CAD, 3D rapid prototyping, and a CNC mill.
  • Competed in the DARPA Robotics Challenge Finals as Team THOR (USA, June 2015).
  • Competed in RoboCup as Team THORwIn (China, July 2015), winning 1st place in the adult-sized humanoid open platform league.
  • Worked as a robotics education outreach activity coordinator.
Technologies: Amazon Web Services (AWS), Java, Robot Operating System (ROS), C++, R, MATLAB, Python, Robotics, Arduino, Raspberry Pi, Python 3, Computer Vision, Image Processing, Image Generation

Data Type Predictor

Utilizing Spark MLlib, this tool predicts the data types of fields in a given database. It is useful when the inferred data type is inaccurate or insufficient, and it also helps identify primary key and foreign key relationships.
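
A minimal sketch of how such a predictor could be built with Spark MLlib: profile each database column into numeric features and train a multiclass classifier over labeled column types. The feature names, file path, and model choice below are illustrative assumptions, not details of the actual tool.

```python
# Minimal sketch of a data type predictor in Spark MLlib: featurize each
# column's value profile (hypothetical features) and classify its true type.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.classification import RandomForestClassifier
from pyspark.ml.feature import StringIndexer, VectorAssembler

spark = SparkSession.builder.appName("data-type-predictor").getOrCreate()

# One row per database column: hand-crafted profile features plus a labeled type.
# "column_profiles.parquet" is a placeholder path; the feature columns are
# hypothetical (e.g., share of numeric-looking values, average string length).
profiles = spark.read.parquet("column_profiles.parquet")

indexer = StringIndexer(inputCol="label_type", outputCol="label")
assembler = VectorAssembler(
    inputCols=["numeric_ratio", "date_like_ratio", "avg_length", "distinct_ratio"],
    outputCol="features",
)
rf = RandomForestClassifier(labelCol="label", featuresCol="features")

model = Pipeline(stages=[indexer, assembler, rf]).fit(profiles)
predictions = model.transform(profiles)  # predicted type per column profile
```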

Semantic Type Predictor

https://sherlock.media.mit.edu
A PySpark implementation of the deep learning tool Sherlock. It predicts the semantic type of fields in a given database, which is useful for data cleaning and schema matching.
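
A hedged sketch of how inference for a Sherlock-style model could be distributed with PySpark: score sampled column values in parallel through a pandas UDF. The model loader and stub below are hypothetical placeholders, not Sherlock's actual API; see the project link above for the real tool.

```python
# Hedged sketch of distributing semantic type inference with PySpark; the
# model loader below is a hypothetical placeholder, not Sherlock's real API.
import pandas as pd
from pyspark.sql import SparkSession
from pyspark.sql.functions import pandas_udf
from pyspark.sql.types import StringType

spark = SparkSession.builder.appName("semantic-type-predictor").getOrCreate()


class _StubSemanticModel:
    """Placeholder standing in for a pretrained Sherlock-style model."""

    def predict(self, snippets):
        # A real model would map each column's sampled values to a semantic
        # type such as "address" or "birth date"; the stub returns "unknown".
        return ["unknown"] * len(snippets)


def load_semantic_model():
    # Hypothetical loader; a real implementation would restore trained weights.
    return _StubSemanticModel()


@pandas_udf(StringType())
def predict_semantic_type(sample_values: pd.Series) -> pd.Series:
    # Each input row holds a sampled snippet of one database column's values.
    model = load_semantic_model()
    return pd.Series(model.predict(sample_values.tolist()))


columns_df = spark.read.parquet("column_samples.parquet")  # placeholder path
result = columns_df.withColumn("semantic_type", predict_semantic_type("sample_values"))
```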

Single Cell Image Identifier

Given blood samples provided by terminal cancer patients, the goal is to identify circulating tumor cells among other cells (mostly white blood cells) and debris in the blood by analyzing the cells' morphology and applying machine learning techniques.
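
As a purely illustrative sketch of the morphology-based approach, the snippet below segments cells in a microscope image with scikit-image, extracts simple shape features, and classifies each cell with scikit-learn. The thresholding step, feature set, file paths, and model are assumptions, not the actual pipeline used in the project.

```python
# Illustrative morphology-based classification: segment cells, extract shape
# features, and classify each cell; all paths and choices are hypothetical.
import numpy as np
from skimage import filters, io, measure
from sklearn.ensemble import RandomForestClassifier


def cell_features(image_path):
    """Return one (area, eccentricity, perimeter) row per segmented cell."""
    gray = io.imread(image_path, as_gray=True)
    mask = gray > filters.threshold_otsu(gray)  # simple global threshold
    labels = measure.label(mask)
    return np.array([
        [region.area, region.eccentricity, region.perimeter]
        for region in measure.regionprops(labels)
    ])


# Hypothetical labeled training set built from annotated images:
# 1 = circulating tumor cell, 0 = white blood cell / debris.
X_train = np.load("cell_features.npy")  # placeholder path
y_train = np.load("cell_labels.npy")    # placeholder path

clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
predictions = clf.predict(cell_features("patient_sample.tif"))  # placeholder image
```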

Education

2014 - 2016

Master of Science Degree in Mechanical Engineering

University of California, Los Angeles - Los Angeles, CA

2009 - 2013

Bachelor of Science Degree in Biomedical Engineering

Johns Hopkins University - Baltimore, MD

Certifications

JANUARY 2020 - PRESENT

Big Data Hadoop Certification

Edureka

Libraries/APIs

Spark ML, PySpark, Pandas, REST APIs

Tools

Apache Airflow, Spark SQL, MATLAB, PyCharm, Atom, Sublime Text, Flume, Tableau

Languages

Python 3, SQL, Python, XML, C++, Scala, Java, JavaScript, R, C#

Frameworks

Spark, Selenium, Flask, Hadoop, Django

Paradigms

ETL, RESTful Development

Platforms

Amazon Web Services (AWS), Docker, Kubernetes, Visual Studio Code (VS Code), Google Cloud Platform (GCP), Linux, Databricks, Azure, Arduino, Raspberry Pi, Anaconda, Apache Kafka

Storage

PostgreSQL, MySQL

Other

Machine Learning, Deep Learning, Scraping, Web Scraping, Data Scraping, Technical Leadership, Architecture, Back-end, Chatbots, Software Architecture, AWS Cloud Architecture, Data Science, Vector Databases, Authentication, APIs, Robotics, Full-stack, OpenAI, Computer Vision, Image Processing, Image Generation, Robot Operating System (ROS), Visualization Tools, Technical Hiring, Data Analytics, Data Modeling, Data Profiling, Big Data
