
Storme Briscoe

Verified Expert in Engineering

Back-end Developer

Thornton, CO, United States

Toptal member since April 1, 2022

Bio

Storme is a senior back-end engineer with over eight years of experience building highly scalable APIs and apps. He specializes in Java and Go, works with databases such as PostgreSQL and MySQL, and implements event-driven architectures that rely heavily on Kafka and similar tools.

Portfolio

Uber
Go, MySQL, Prometheus, Grafana, Apache Kafka, Docker, Apache Cassandra, Slack...
USAA
Java, Go, Amazon Web Services (AWS), MySQL, Apache Kafka, Terraform, Prometheus...
American Express
Go, Apache Kafka, PostgreSQL, Slack, Jira, Git...

Experience

Availability

Part-time

Preferred Environment

Windows 10, Slack, Go, Java, Apache Kafka, PostgreSQL, MySQL, Jira, Git, Grafana

The most amazing...

...thing I've developed is authentication based on heartbeat detection and facial recognition in video data. It resulted in a patent.

Work Experience

Software Engineer

2021 - PRESENT
Uber
  • Led a project where freight loads were automatically scheduled at both pickup and dropoff based on different pieces of data, such as facility hours and scheduling methods. It's an event-based architecture written in Go.
  • Met with both product managers and customers to help understand issues in the scheduling space and pitch ideas on how to fix those issues.
  • Maintained another Go service that generates appointment times for freight loads. The service minimizes the amount of dead time for a trucker and also finds appointment times that will increase the margin of a given freight load.
  • Set standards around monitoring and logging on new and existing services. This included creating custom metrics to gauge the overall health of a service, custom alerts based on those metrics, and other alerts driven by Kibana queries (a brief sketch of such a custom metric follows this entry).
Technologies: Go, MySQL, Prometheus, Grafana, Apache Kafka, Docker, Apache Cassandra, Slack, Jira, Git, Object-oriented Programming (OOP), Data Structures, Databases, APIs, Back-end, Back-end Development, Software Architecture, SQL, Zendesk API, Architecture, REST APIs, Unix
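
The following is a minimal Go sketch of the kind of custom service-health metric mentioned above, using the Prometheus client library (github.com/prometheus/client_golang). The metric names, labels, and port are hypothetical illustrations, not the ones used at Uber.

    package main

    import (
        "log"
        "net/http"

        "github.com/prometheus/client_golang/prometheus"
        "github.com/prometheus/client_golang/prometheus/promauto"
        "github.com/prometheus/client_golang/prometheus/promhttp"
    )

    // Hypothetical custom metrics for a scheduling service: a counter for
    // processed loads (labeled by outcome) and a histogram of scheduling latency.
    var (
        loadsProcessed = promauto.NewCounterVec(prometheus.CounterOpts{
            Name: "scheduler_loads_processed_total",
            Help: "Freight loads processed, labeled by outcome.",
        }, []string{"outcome"})

        scheduleLatency = promauto.NewHistogram(prometheus.HistogramOpts{
            Name:    "scheduler_schedule_latency_seconds",
            Help:    "Time taken to schedule a single load.",
            Buckets: prometheus.DefBuckets,
        })
    )

    func main() {
        // Handlers elsewhere in the service would record observations, for example:
        //   timer := prometheus.NewTimer(scheduleLatency)
        //   defer timer.ObserveDuration()
        //   loadsProcessed.WithLabelValues("scheduled").Inc()

        // Expose the endpoint that Prometheus scrapes and Grafana charts.
        http.Handle("/metrics", promhttp.Handler())
        log.Fatal(http.ListenAndServe(":2112", nil))
    }

Grafana dashboards and alert rules would then sit on top of metrics like these.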

Software Engineer

2019 - 2021
USAA
  • Built cloud infrastructure on AWS: wrote various Terraform modules and reference applications that used those modules, and helped application teams troubleshoot issues when migrating their apps to AWS.
  • Built and maintained several codebases (Java 6 and 8) centered around core customer data. Each of the different subdomains (email, telephone, privacy) handles tens of millions of transactions a week.
  • Led design meetings and did major development on a new sync process between two different data sources. The new process immediately handled failed sync items and double-checked changes made on a given day to ensure the two sources were in sync.
  • Set standards so that my team used Prometheus and Grafana to track metrics for all new projects, giving us a way to gauge the health of a project at any point in its flow.
  • Acted as the formal mentor for a developer on my team. Scheduled paired programming sessions each week and frequent one-on-ones to ensure they were getting the support they needed.
  • Conducted interviews for interns, new college hires, and experienced candidates.
Technologies: Java, Go, Amazon Web Services (AWS), MySQL, Apache Kafka, Terraform, Prometheus, Grafana, Slack, Jira, Git, Object-oriented Programming (OOP), Data Structures, Databases, Spring, Docker, APIs, Back-end, Back-end Development, Software Architecture, SQL, Architecture, REST APIs, Unix

Software Engineer

2018 - 2019
American Express
  • Collaborated with various teams to create an engineering platform in Go that can be used by any other engineering team at American Express.
  • Followed an inner-source philosophy in which our team leveraged libraries and repositories other teams had written and shared our own libraries in the same way.
  • Created automatic builds with GitLab and Jenkins on each Git commit and automatic alerts when a build failed due to a test failure or compile error. Deployments were automated once a PR was merged.
  • Followed the Agile methodology with two-week sprints, story time estimates, and paired programming. Every story was demoed to a lead engineer or product owner (PO).
Technologies: Go, Apache Kafka, PostgreSQL, Slack, Jira, Git, Object-oriented Programming (OOP), Data Structures, Databases, Prometheus, Grafana, Docker, APIs, Back-end, Back-end Development, Software Architecture, SQL, Architecture, REST APIs, Unix

Software Engineer

2016 - 2018
General Motors
  • Worked on five projects, each of which added around ten features to an internal web app used by electrical engineers. The web app used an AngularJS front end, a Java Spring back end, and OracleDB to store data.
  • Followed the Agile methodology with three-week sprints, feature time estimates, and paired programming. There was a demo after every sprint.
  • Met with my business partners to gather requirements for any currently scheduled project and once every sprint to reprioritize features.
  • Mentored younger team members on web programming, coding standards, and best practices, helping with their continued growth as software engineers.
Technologies: Java, Spring, AngularJS, Windows 10, Jira, Git, Object-oriented Programming (OOP), Data Structures, Databases, JavaScript, APIs, Full-stack, Full-stack Development, Back-end, Back-end Development, Software Architecture, SQL, Architecture, REST APIs

Software Engineer

2013 - 2016
USAA
  • Acted as the subject matter expert for core customer data such as address, email, and telephone information. Maintained the UI (Wicket and JavaScript), back-end (Java), and batch job (Java) codebases for core customer data.
  • Met with my business partners twice a month to go over high-level defects involving core customer data and any projects involving core customer data.
  • Mentored junior developers through paired-programming sessions, code reviews, and one-on-one meetings.
  • Won a company hackathon by helping create a phone app that estimates a person’s heartbeat to help prevent fraud. This project resulted in a patent (patent number 9953231).
Technologies: Java, JavaScript, MySQL, RabbitMQ, Windows 10, Git, Object-oriented Programming (OOP), Data Structures, Databases, APIs, Full-stack, Full-stack Development, Back-end, Back-end Development, Software Architecture, SQL, Architecture, REST APIs, Unix

Projects

Appointment Routing Engine

A Go-based application that automatically schedules freight loads at both pickup and dropoff based on data such as facility hours and scheduling methods. The architecture is event-based because the engine pulls from multiple data sources and needs to retry individual steps without repeating the same service and API calls. Kafka provides the retry capability and hands off information at each step, while MySQL stores the history of freight loads passing through the routing engine.

The routing engine was a massive success, leading to over a 50% increase in automated freight loads; it currently schedules 4,000 loads a day. That percentage keeps growing, and soon a majority of freight loads at Uber Freight will run through the routing engine.
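
The description above boils down to a consume, attempt, republish-on-failure loop backed by a MySQL history table. Below is a minimal Go sketch of that pattern, assuming the segmentio/kafka-go client; the broker address, topic name, load_history table, and scheduleStep stub are hypothetical placeholders rather than the production implementation.

    package main

    import (
        "context"
        "database/sql"
        "encoding/json"
        "log"

        _ "github.com/go-sql-driver/mysql"
        "github.com/segmentio/kafka-go"
    )

    // loadEvent is a hypothetical payload for one step of the routing engine.
    type loadEvent struct {
        LoadID   string `json:"load_id"`
        Step     string `json:"step"` // e.g., "pickup" or "dropoff"
        Attempts int    `json:"attempts"`
    }

    // scheduleStep stands in for the real scheduling logic (facility hours,
    // scheduling-method lookups, appointment booking, and so on).
    func scheduleStep(ctx context.Context, ev loadEvent) error { return nil }

    func main() {
        ctx := context.Background()

        db, err := sql.Open("mysql", "user:pass@tcp(localhost:3306)/freight")
        if err != nil {
            log.Fatal(err)
        }

        reader := kafka.NewReader(kafka.ReaderConfig{
            Brokers: []string{"localhost:9092"},
            GroupID: "routing-engine",
            Topic:   "load-scheduling",
        })
        writer := &kafka.Writer{Addr: kafka.TCP("localhost:9092"), Topic: "load-scheduling"}

        for {
            msg, err := reader.ReadMessage(ctx)
            if err != nil {
                log.Fatal(err)
            }

            var ev loadEvent
            if err := json.Unmarshal(msg.Value, &ev); err != nil {
                log.Printf("bad event: %v", err)
                continue
            }

            // Attempt the step; on failure, republish the event so only this
            // step is retried, without redoing calls that already succeeded.
            if err := scheduleStep(ctx, ev); err != nil {
                if ev.Attempts < 3 {
                    ev.Attempts++
                    out, _ := json.Marshal(ev)
                    if werr := writer.WriteMessages(ctx, kafka.Message{Key: []byte(ev.LoadID), Value: out}); werr != nil {
                        log.Printf("republish failed: %v", werr)
                    }
                } else {
                    log.Printf("giving up on load %s: %v", ev.LoadID, err)
                }
                continue
            }

            // Record the load's progress through the engine in MySQL.
            if _, err := db.ExecContext(ctx,
                "INSERT INTO load_history (load_id, step) VALUES (?, ?)", ev.LoadID, ev.Step); err != nil {
                log.Printf("history insert failed: %v", err)
            }
        }
    }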

Judge Sync Application

A Java Spring-based project that ensures two data sources stay entirely in sync, in this case a MySQL database and an IBM mainframe. Some teams at USAA could only read data from the mainframe, but the mainframe was no longer the source of truth, so a new sync process was needed to ensure data quality.

The process attempts to update both MySQL and the mainframe, but the mainframe sometimes fails to update without throwing an error. After the updates are attempted, a Kafka event is emitted; a consumer picks up that message and compares the data between the two sources. If they are out of sync, another update is made to the mainframe, and the same Kafka cycle is kicked off again. If the same piece of data fails to sync three times, a Slack message is sent to our team so troubleshooting can begin on why the data could not be synced to the mainframe.
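
The real service was Java Spring; purely to illustrate the compare-and-retry flow described above, here is a small Go sketch. Every type and helper in it (Record, fetchFromMySQL, fetchFromMainframe, updateMainframe, emitSyncEvent, postToSlack) is a hypothetical stand-in for the actual data access, messaging, and alerting code.

    package main

    import (
        "context"
        "fmt"
        "log"
    )

    // Record is a hypothetical slice of core customer data kept in both stores.
    type Record struct {
        CustomerID string
        Email      string
        Telephone  string
    }

    const maxAttempts = 3

    // compareAndSync runs after an update attempt: it compares the MySQL copy
    // against the mainframe copy, re-applies the MySQL value and re-emits a sync
    // event when they differ, and alerts the team on Slack after three failures.
    func compareAndSync(ctx context.Context, customerID string, attempt int) error {
        mysqlRec, err := fetchFromMySQL(ctx, customerID)
        if err != nil {
            return err
        }
        mainframeRec, err := fetchFromMainframe(ctx, customerID)
        if err != nil {
            return err
        }

        // Already in sync: nothing to do.
        if mysqlRec == mainframeRec {
            return nil
        }

        // Out of sync too many times: alert the team instead of retrying again.
        if attempt >= maxAttempts {
            return postToSlack(ctx, fmt.Sprintf("customer %s failed to sync after %d attempts", customerID, attempt))
        }

        // Re-apply the MySQL value (the source of truth) to the mainframe, then
        // emit another Kafka event so this comparison runs again.
        if err := updateMainframe(ctx, mysqlRec); err != nil {
            return err
        }
        return emitSyncEvent(ctx, customerID, attempt+1)
    }

    // Stubs standing in for the real data access, messaging, and alerting layers.
    func fetchFromMySQL(ctx context.Context, id string) (Record, error)     { return Record{CustomerID: id}, nil }
    func fetchFromMainframe(ctx context.Context, id string) (Record, error) { return Record{CustomerID: id}, nil }
    func updateMainframe(ctx context.Context, r Record) error               { return nil }
    func emitSyncEvent(ctx context.Context, id string, attempt int) error   { return nil }
    func postToSlack(ctx context.Context, msg string) error                 { return nil }

    func main() {
        if err := compareAndSync(context.Background(), "customer-123", 1); err != nil {
            log.Fatal(err)
        }
    }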

Project R42

A Go-based infrastructure intended for use by the entire engineering department at American Express. I worked on two major parts of it: streaming purchases made by American Express cardholders and calling various APIs to make the correct decisions about adding points to a given customer's account.

Kafka moved data throughout the new infrastructure and streamed the events. An internal Kafka library was written in Go to support the rich features available at the time, such as retries and custom headers.

Applying points correctly to customers' accounts required calling several APIs to make the right decisions. Some of these APIs returned large amounts of data, so that data had to be cached in case of a failure. Because the custom Kafka library was used, retries were automatically built into the rest of the infrastructure.
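
As a rough sketch of what such an internal wrapper might look like, here is a small Go publisher that stamps custom headers onto every message and retries failed writes, assuming the segmentio/kafka-go client. The header keys, topic handling, and retry policy are hypothetical; the actual American Express library is internal and not public.

    package stream

    import (
        "context"
        "time"

        "github.com/segmentio/kafka-go"
    )

    // Publisher wraps a kafka-go writer, stamping custom headers on every
    // message and retrying transient publish failures with a simple backoff.
    type Publisher struct {
        writer     *kafka.Writer
        maxRetries int
    }

    func NewPublisher(brokers []string, topic string) *Publisher {
        return &Publisher{
            writer:     &kafka.Writer{Addr: kafka.TCP(brokers...), Topic: topic},
            maxRetries: 3,
        }
    }

    // Publish sends one event with hypothetical headers (an event type and a
    // source service) and retries failed writes before giving up.
    func (p *Publisher) Publish(ctx context.Context, key, value []byte, eventType string) error {
        msg := kafka.Message{
            Key:   key,
            Value: value,
            Headers: []kafka.Header{
                {Key: "event-type", Value: []byte(eventType)},
                {Key: "source", Value: []byte("rewards-service")},
            },
        }

        var err error
        for attempt := 0; attempt <= p.maxRetries; attempt++ {
            if err = p.writer.WriteMessages(ctx, msg); err == nil {
                return nil
            }
            // Back off before the next attempt; stop if the context is done.
            select {
            case <-ctx.Done():
                return ctx.Err()
            case <-time.After(time.Duration(attempt+1) * 100 * time.Millisecond):
            }
        }
        return err
    }

A matching consumer wrapper would read the same headers to route events and apply the same retry semantics on the consuming side.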

Web Scraper for Flight Data

Made a Python 3 application that scrapes various airline sites to find the cheapest flights to destinations I wanted to visit. The script was deployed to Heroku and ran every five minutes, using NLP libraries to gather key information such as price, dates, origin, destination, and layovers.
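
The original script was Python 3; purely to illustrate the periodic scrape-and-extract loop described above, here is a minimal Go sketch. The URL and the fare regex are placeholders rather than real airline endpoints, and the real application's NLP extraction is reduced here to a simple pattern match.

    package main

    import (
        "io"
        "log"
        "net/http"
        "regexp"
        "time"
    )

    // priceRe is a stand-in for the real extraction; the original used NLP
    // libraries to also recover dates, origin, destination, and layovers.
    var priceRe = regexp.MustCompile(`\$\d+(\.\d{2})?`)

    func scrapeOnce(url string) {
        resp, err := http.Get(url)
        if err != nil {
            log.Printf("fetch failed: %v", err)
            return
        }
        defer resp.Body.Close()

        body, err := io.ReadAll(resp.Body)
        if err != nil {
            log.Printf("read failed: %v", err)
            return
        }

        for _, price := range priceRe.FindAllString(string(body), -1) {
            log.Printf("found fare %s on %s", price, url)
        }
    }

    func main() {
        const target = "https://example.com/flights" // placeholder URL

        // Run every five minutes, as the Heroku-deployed script did.
        scrapeOnce(target)
        for range time.Tick(5 * time.Minute) {
            scrapeOnce(target)
        }
    }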

Education

2010 - 2014

Bachelor's Degree in Computer Science

University of Arizona - Tucson, USA

Libraries/APIs

REST APIs, Zendesk API

Tools

Slack, Git, Grafana, Jira, RabbitMQ, Terraform

Languages

Go, Java, JavaScript, Python, SQL, Python 3

Paradigms

Object-oriented Programming (OOP)

Platforms

Unix, Apache Kafka, Docker, Amazon Web Services (AWS)

Storage

Databases, PostgreSQL, MySQL, IBM Mainframe, Redis

Frameworks

Spring, AngularJS

Other

Algorithms, Data Structures, Prometheus, APIs, Back-end, Back-end Development, Software Architecture, Architecture, Windows 10, Full-stack, Full-stack Development, Operating Systems, Apache Cassandra
