
Brian Seel

Verified Expert in Engineering

Software Engineering Developer

New York City, NY, United States
Toptal Member Since
September 29, 2022

Brian has over a decade of experience as a software developer, building tools and techniques for computer network exploitation at the National Security Agency. His core strength is his ability to solve a wide range of problems, from reverse engineering malware to writing data extraction tools and developing GIS maps. As a data analyst, Brian excels at helping decision-makers understand their operations through data.






Preferred Environment


The most amazing thing I have developed was a covert data transmission protocol using DNS packets.
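The underlying idea of a DNS covert channel can be sketched as follows. This is an illustrative encoding only, not the actual protocol: data is base32-encoded (DNS names are case-insensitive and limited in character set), split into chunks that fit inside query labels, and reassembled on the receiving side. Function and domain names here are invented.

```python
import base64

CHUNK = 50  # keep each label comfortably under DNS's 63-byte limit


def encode_queries(payload: bytes, domain: str = "example.com") -> list[str]:
    """Split a payload into base32 chunks that ride inside DNS query names."""
    text = base64.b32encode(payload).decode().rstrip("=").lower()
    chunks = [text[i:i + CHUNK] for i in range(0, len(text), CHUNK)]
    # A sequence-number label lets the receiver reassemble out-of-order queries.
    return [f"{seq}-{chunk}.{domain}" for seq, chunk in enumerate(chunks)]


def decode_queries(queries: list[str]) -> bytes:
    """Reassemble the payload from observed query names."""
    labels = [q.split(".", 1)[0] for q in queries]
    labels.sort(key=lambda label: int(label.split("-", 1)[0]))
    text = "".join(label.split("-", 1)[1] for label in labels).upper()
    return base64.b32decode(text + "=" * (-len(text) % 8))
```

In practice, a client would issue the names as real DNS lookups and a cooperative resolver would log and reassemble them; the sketch keeps only the encoding step so it runs anywhere.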

Work Experience

Senior Developer

2023 - 2023
Sterling Data Company LLC
  • Wrote a sales system that pulled data from various election data sources, enriched it with additional sources, and used it to populate an internal HubSpot database.
  • Created an extensive test suite that included mocked unit tests and end-to-end integration tests to verify the quality of the data that was being generated.
  • Worked with various API endpoints to push and pull data, including APIs that were not fully documented, and wrote Python wrappers so the core project could interact with them in a Pythonic way.
Technologies: Web Scraping, APIs, Amazon Web Services (AWS), Data Cleaning, Data Organization, HubSpot, Python, Python 3, Pytest
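A wrapper in that spirit might look like the sketch below. The endpoint shape and parameter names are invented for illustration, and the `fetch` callable is injected so the pagination logic can be exercised without the vendor's API:

```python
from typing import Callable, Iterator


class ApiClient:
    """Thin wrapper that hides auth details and cursor pagination.

    `fetch` takes a path and query params and returns a decoded JSON page;
    in real use it would be a requests call against the vendor's base URL.
    """

    def __init__(self, token: str, fetch: Callable[[str, dict], dict]):
        self.token = token
        self.fetch = fetch

    def _get(self, path: str, params: dict) -> dict:
        # Attach the credential on every call so callers never handle it.
        return self.fetch(path, {**params, "token": self.token})

    def iter_records(self, path: str, page_size: int = 100) -> Iterator[dict]:
        """Yield records one at a time, following pagination cursors."""
        cursor = None
        while True:
            params = {"limit": page_size}
            if cursor:
                params["after"] = cursor
            page = self._get(path, params)
            yield from page["results"]
            cursor = page.get("next")
            if not cursor:
                return
```

Callers then consume `client.iter_records("/contacts")` as a plain Python iterator, with no knowledge of cursors or auth.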

Data Analyst and Developer

2020 - 2022
City of Baltimore
  • Reverse-engineered data providers' websites and wrote Python scrapers to pull data from behind authenticated accounts into agency data stores.
  • Developed dashboards and maps in PowerBI and ArcGIS to provide at-a-glance portals for decision-makers to track agency metrics focused on the problems the agency was trying to solve at that time.
  • Led data-centric stat meetings to review data, develop monthly action items, and identify new metrics to track.
  • Wrote code that kept data highly available and monitored for data integrity issues to identify failures in agency dataflows.
Technologies: Python 3, SQL, R, ArcGIS, GitHub, GitHub Actions, SQL Server 2017, Microsoft Power BI, PyCharm, SQLAlchemy, Mypy, Tox, CI/CD Pipelines, pylint, Git, Data Analytics, Data Analysis, Data Visualization, Analytics, Analysis, Python API, Selenium, GIS, JSON, Python, Automation Scripting, APIs, Docker, Pandas, REST APIs, Continuous Delivery (CD), Continuous Integration (CI), Data Modeling, ETL, React, Front-end, Data Engineering, API Integration, Data Science, API Documentation, Geospatial Data, Geographic Information Systems, Amazon DynamoDB
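The dataflow monitoring described above largely amounts to freshness checks; a minimal sketch, with feed names and thresholds purely hypothetical:

```python
from datetime import datetime, timedelta


def find_stale_feeds(last_seen: dict[str, datetime],
                     max_age: timedelta,
                     now: datetime) -> list[str]:
    """Return the names of dataflows whose newest record is older than max_age.

    A scheduled job would call this against the database's per-table
    max(timestamp) values and alert on any non-empty result.
    """
    return sorted(name for name, ts in last_seen.items() if now - ts > max_age)
```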

Software Developer

2016 - 2019
National Security Agency
  • Created a client (written in Python) and a server (written in C++) that allowed for communication with Windows clients in situations of high latency, high packet loss, and low network reliability.
  • Developed capabilities to enable computer network exploitation efforts, primarily by developing C++ codebases with testing scripts in Python.
  • Explored and developed against the internals of Windows.
  • Used collaborative development tools, such as SVN, Jenkins, Jira, Confluence, and Crucible, to enable our Agile team.
Technologies: C, C++, C++11, PyCharm, Agile, Jenkins, Data, Networks, Ruby, Jira, Shell, Win32 API, Reverse Engineering, x64 Assembly, Assembly, Penetration Testing, Windows Internals, Subversion (SVN), Python 3, Python 2, Python, Python API, CNE, JSON, Automation Scripting, APIs, Windows, Pandas, Continuous Delivery (CD), Continuous Integration (CI), Data Engineering, API Integration, API Documentation

Software Developer

2015 - 2016
National Security Agency
  • Developed Ruby-based plugins on top of IBM's Streaming Analytics platform to support ingesting massive amounts of agency data for use by analysts.
  • Led a team of 12 developers, testers, DevOps, and writers operating on a two-week sprint cycle that worked to reduce a backlog of technical debt while also continuing to add customer-required functionality.
  • Supported additional functionality and troubleshooting of IBM Streaming Analytics written in Ruby and C++.
Technologies: Ruby, IBM InfoSphere Streams, Jenkins, Agile, C++, CNE, Scrum, Automation Scripting, APIs, Windows, Data Engineering

Software Developer

2009 - 2015
National Security Agency
  • Developed a network protocol to transmit sensitive data through overt channels.
  • Built a Python-based fuzzer to find vulnerabilities in commercial off-the-shelf telecom hardware.
  • Used collaborative development tools, such as SVN, Jenkins, Jira, Confluence, and Crucible, to enable our Agile team.
  • Implemented a Windows driver that altered network packets before a software intrusion detection system could inspect them.
  • Reverse-engineered binary files to determine what they were designed to do and patch out capabilities that needed to be bypassed.
  • Became proficient in tailored access operations tools for network navigation, tactical forensic analysis, and collection of valuable intelligence information.
  • Performed analysis of various operating system security configurations, packet analysis, port scanning, and vulnerabilities.
Technologies: C++, C, Reverse Engineering, Windows Driver Kit (WDK), Assembly, Assembler x86, Hacking, Network Exploitation, CNE, CISSP, Metasploit, Fuzz Testing, Python, Python 2, Python 3, Subversion (SVN), Confluence, Jira, Crucible, Agile, Scrum, Intrusion Detection Systems (IDS), Packet Communication, Digital Forensics, Vulnerability Assessment, Linux, HTML, JSON, Automation Scripting, MySQL, APIs, Technical Writing, Data Engineering
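A mutation-based fuzzer like the one mentioned above can be sketched briefly. The details here are illustrative, not the original tool: it corrupts a known-good packet a few bytes at a time, and seeding the RNG makes every crashing case reproducible.

```python
import random


def mutate(seed: bytes, flips: int, rng: random.Random) -> bytes:
    """Corrupt a known-good packet by XOR-ing a few random bytes."""
    data = bytearray(seed)
    for _ in range(flips):
        # XOR with a nonzero value guarantees the chosen byte changes.
        data[rng.randrange(len(data))] ^= rng.randrange(1, 256)
    return bytes(data)


def fuzz_cases(seed: bytes, count: int, rng_seed: int = 0) -> list[bytes]:
    """Generate a reproducible batch of mutated packets to replay at a parser."""
    rng = random.Random(rng_seed)
    return [mutate(seed, rng.randint(1, 4), rng) for _ in range(count)]
```

Each generated case would be sent to the device under test while watching for hangs or crashes; rerunning with the same `rng_seed` replays the exact batch.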


2013 - 2013
Montgomery College
  • Taught introductory Python to classes of 15-20 students, many of whom had no experience writing code.
  • Developed lesson plans and homework based on the college's general class structures and goals.
  • Reinforced lecture topics through an ongoing class project that applied the skills learned in class.
Technologies: Python, Training, Education

Baltimore City has a challenging relationship with its transit provider, requiring multiple workarounds to complete proper reporting. The city needs the data to understand how the system operates and for federally mandated reports. This project scrapes data from the provider's website, parses various spreadsheets with dirty data, and cleans it up for internal use in an agency database.

I also built a Python wheel for our data provider's undocumented API, which required a bit of reverse engineering to understand how it worked. The code is available here:

The scraper:

Bike Share Tracker
This project was less impressive from a technical perspective, but it made a difference in my city. Baltimore had a bike share system with chronic issues, and the operator was unwilling to admit that the problems were as big as they were. I wrote this script to regularly pull the number of available bikes in the city, which showed that the numbers were quickly falling to levels well below the service level agreement.

I used that data to write a published article highlighting the problems, which resulted in the system being shut down three weeks later.
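The script itself amounted to polling the operator's station feed on a schedule and logging a citywide total. Sketched here against a GBFS-style payload (the field names follow the public GBFS convention, not necessarily the original feed):

```python
def total_bikes_available(station_status: dict) -> int:
    """Sum available bikes across every station in a GBFS-style payload."""
    stations = station_status["data"]["stations"]
    return sum(s.get("num_bikes_available", 0) for s in stations)
```

Running this from cron every few minutes and appending the timestamped total to a CSV is enough to chart the decline against the service level agreement.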


Public Bus Tracker
After our local bus system opened its real-time tracking data to the public, a few friends and I wrote a system to track, store, and visualize that data. The project was necessary because the API shared only real-time data, not historical data. Over two days, three of us used Elasticsearch, Logstash, Kibana, and AWS to capture the data and analyze historical trends.

An additional write-up is available here:

Some examples of the data were published here:
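The core of a pipeline like this is reshaping each polled vehicle record into a timestamped document for the index; roughly as below, with all field names illustrative rather than the real feed's schema:

```python
def to_es_doc(vehicle: dict, polled_at: str) -> dict:
    """Shape one real-time vehicle record into an Elasticsearch document.

    Storing 'location' as a lat/lon object lets the index map the field as
    a geo_point, so Kibana can plot the history on a map.
    """
    return {
        "@timestamp": polled_at,
        "route": vehicle["route"],
        "vehicle_id": vehicle["id"],
        "location": {"lat": float(vehicle["lat"]), "lon": float(vehicle["lon"])},
    }
```

A poller would call the real-time endpoint every minute, run each record through this, and bulk-index the results, turning a stream of snapshots into a queryable history.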

Remote File Collection
This project was part of a social engineering challenge where the goal was to exfiltrate a file from a target's desktop. The game had another challenge that involved drawing bubbles on the screen, so I delivered a Python script that ostensibly solved that problem while also collecting the file. The code connects to an IRC channel and exfiltrates a file called "flag.txt" from the user's desktop, and it can be extended to accept commands and maintain persistence.

Automated Traffic Violation Enforcement System
After Baltimore's Automated Traffic Violation Enforcement System was shut down in 2015, there was a higher level of scrutiny on the system when it was brought back in 2017. ATVES had data spread across many different providers, different vendors administered the red light and speed camera programs, and the financial data was stored separately.

This project brought that data together for internal dashboards. It required three scrapers, one for the red light camera vendor, one for the speed camera vendor, and one for the financial data, plus a data-cleaning step before the data was loaded into our internal database. The financial data sat behind a Windows-authenticated website, which required extensive reverse engineering and patching an external library to access.
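The combining step boils down to joining camera records from both vendors against the financial data by citation number. A simplified sketch, with every field name hypothetical:

```python
def merge_sources(redlight: list[dict],
                  speed: list[dict],
                  finance: dict[str, float]) -> list[dict]:
    """Combine camera records from two vendors with payment data.

    `finance` maps citation number to amount paid; citations with no
    payment record default to 0.0 so they still appear on the dashboard.
    """
    merged = []
    for camera_type, rows in (("redlight", redlight), ("speed", speed)):
        for row in rows:
            merged.append({
                "citation": row["citation"],
                "camera_type": camera_type,
                "amount_paid": finance.get(row["citation"], 0.0),
            })
    return merged
```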

Dashboard Mailer
One of our team's struggles was that we would develop dashboards, and some users wouldn't have permission to use them or would forget to check them. The customer asked us to send screenshots of certain dashboards, so I built this tool to take the screenshots and automatically email them to the relevant people.

This project used Selenium to take the screenshots and Python's SMTP library to email them out.
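The email half of a tool like this is plain standard library; a sketch under the assumption that the screenshot bytes come from Selenium's `get_screenshot_as_png()` and that `smtplib` handles delivery (the function and dashboard names here are invented):

```python
from email.message import EmailMessage


def build_report(png_bytes: bytes,
                 recipients: list[str],
                 dashboard_name: str) -> EmailMessage:
    """Wrap a dashboard screenshot in an email message ready to send.

    In the full tool, `png_bytes` would come from Selenium's
    driver.get_screenshot_as_png(), and smtplib.SMTP(...).send_message(msg)
    would deliver the result.
    """
    msg = EmailMessage()
    msg["Subject"] = f"Daily dashboard: {dashboard_name}"
    msg["To"] = ", ".join(recipients)
    msg.set_content(f"Attached is today's {dashboard_name} snapshot.")
    msg.add_attachment(png_bytes, maintype="image", subtype="png",
                       filename=f"{dashboard_name}.png")
    return msg
```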

Ticket Data Extraction
Our parking citation vendor made its data available to authenticated users but had no good way to deliver it to us on a rolling basis. This scraper pulls down the data and loads it into a database for later use.

2009 - 2012

Master's Degree in Offensive Computer Security

Eastern Michigan University - Ypsilanti, MI, USA

2004 - 2008

Bachelor's Degree in Computer Science

University of Idaho - Moscow, ID, USA

AUGUST 2013 - AUGUST 2017

GIAC Certified Penetration Tester (GPEN)

Global Information Assurance Certification (GIAC)

MAY 2012 - DECEMBER 2014

Certified Information Systems Security Professional (CISSP)



Libraries/APIs

ArcGIS, SQLAlchemy, Mypy, Python API, Pandas, Win32 API, REST APIs, React


Tools

PyCharm, Shell, Microsoft Power BI, Git, GitHub, pylint, Subversion (SVN), Metasploit, Confluence, Jenkins, Jira, ELK (Elastic Stack), GIS, Windows Driver Kit (WDK), Crucible, Pytest


Languages

C++, C++11, Python 3, SQL, Python, C, R, Python 2, HTML, Ruby, x64 Assembly, Assembly, Assembler x86


Paradigms

Penetration Testing, Scrum, ETL, Agile, Security Software Development, Fuzz Testing, Continuous Integration (CI), Data Science, Continuous Delivery (CD)


Platforms

Windows, Linux, Amazon Web Services (AWS), IBM InfoSphere Streams, Docker, HubSpot

Industry Expertise

Network Security


Storage

JSON, SQL Server 2017, MySQL, Amazon DynamoDB




Other

Software Engineering, Data, Networks, Hacking, Computer Security, Web Scraping, Data Cleaning, Tox, Data Visualization, CNE, Automation Scripting, APIs, Data Engineering, Operating Systems, Windows Internals, Operational Security (OPSEC), Cryptography, Access Control, GitHub Actions, CI/CD Pipelines, Data Analytics, Data Analysis, CISSP, Intrusion Detection Systems (IDS), Packet Communication, Digital Forensics, Vulnerability Assessment, DAX, Data Modeling, API Integration, API Documentation, Geospatial Data, Geographic Information Systems, Reverse Engineering, Analytics, Analysis, Network Exploitation, Technical Writing, Training, Education, Front-end, Data Organization
