Jesse Moore, Data Science Developer in Calgary, AB, Canada

Member since June 4, 2019
Jesse is an experienced data scientist who co-founded an NLP company that was acquired in 2018 and has consulted for leading technology companies in Europe, including Zalando and MariaDB. His core programming language is Python, and he has several years of experience with R. His expertise is in NLP, in building and optimizing machine learning models, and in visualizing big data.


Preferred Environment

GitHub, Jupyter Notebook, Sublime Text, Linux

The most amazing...

...thing I've built is a deep-learning algorithm that detects market-moving events in the stock market, helping hedge funds reduce portfolio volatility.


  • Deep Learning Engineer

    2019 - PRESENT
    VoiceOps
    • Architected and built AWS-based infrastructure for large-scale machine learning at VoiceOps, an AI-driven coaching and training platform for call centers.
    • Introduced and delivered models to support the transcription process, including scripts to pre-train, fine-tune, and fully integrate transformers (e.g., BERT and other Hugging Face models) into novel architectures combining text and statistical data.
    • Built a modified transformer to automatically score the quality of transcriptions and determine whether they should pass to the client (ROC-AUC = 0.90).
    • Created a modified transformer that automated the detection of speakers based on text (ROC-AUC = 0.97).
    • Automated the estimation of how long a recording would take to transcribe, replacing a fixed-price system and cutting total transcription costs by 20-30%.
    • Improved Automated Speech Recognition (ASR) via Seq2Seq architectures.
    Technologies: Document Processing, Custom BERT, Data Science, Deep Learning, Amazon Web Services (AWS), PyTorch, Natural Language Processing (NLP), Pandas, Machine Learning, Python, Keras, Fairseq, Artificial Intelligence (AI)
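Several of the bullets above report ROC-AUC figures. For reference, here is a minimal pure-Python sketch of how ROC-AUC can be computed from model scores; the labels and scores are made-up examples, not VoiceOps data. ROC-AUC equals the probability that a randomly chosen positive example is scored above a randomly chosen negative one.

```python
def roc_auc(labels, scores):
    """ROC-AUC via pairwise comparison.

    labels: 0/1 ground truth; scores: model confidence per example.
    Ties between a positive and a negative score count as half a win.
    """
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    if not pos or not neg:
        raise ValueError("need at least one positive and one negative example")
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))
```

For example, `roc_auc([1, 0, 1, 0, 1], [0.9, 0.2, 0.8, 0.4, 0.3])` evaluates to 5/6 ≈ 0.83. Production systems would use an optimized implementation (e.g., scikit-learn's `roc_auc_score`), but the quantity measured is the same.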
  • Chief Technology Officer

    2019 - PRESENT
    Mobilads
    • Constructed and optimized a geospatial system that maps physical ad impressions from vehicle and mobile GPS data. The Mobilads system operates worldwide and scales to thousands of vehicles and billions of GPS points.
    • Developed automated reporting systems for the clients of Mobilads to demonstrate the technology.
    • Built up the company's IP portfolio by integrating census, geotracking, and social data to enrich what Mobilads knows about the people who see its vehicles, ensuring consistent, industry-leading return on ad spend.
    • Architected and led the development of the Mobilads app for autonomously managing tens of thousands of drivers.
    Technologies: Data Science, Amazon Web Services (AWS), Pandas, Shapely, GeoPandas, Python
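The core spatial test in a system like the one above is deciding whether a GPS ping falls near a point of interest. The production pipeline used Shapely and GeoPandas; this deliberately simplified, pure-Python haversine sketch (all names and data here are hypothetical) just illustrates the distance check at the heart of impression matching.

```python
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in meters."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def impressions_near(pings, poi, radius_m):
    """Count GPS pings within radius_m meters of a point of interest.

    pings: iterable of (lat, lon); poi: (lat, lon).
    """
    return sum(1 for lat, lon in pings
               if haversine_m(lat, lon, poi[0], poi[1]) <= radius_m)
```

At billions of points, a real implementation would replace the linear scan with a spatial index (e.g., an R-tree, as GeoPandas provides); the distance logic is unchanged.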
  • Founder, CEO, Principal Consultant

    2016 - PRESENT
    Relu Analytics
    • Consulted as the senior data scientist at Step Energy Services. Built algorithms for optimizing the use of fixed equipment (extended maintenance, failure prediction, forecasting, and budgeting), as well as cash flow prediction.
    • Collaborated with Cinelytic's senior leadership team to build scalable NLP pipelines; provided code samples and walked the software engineering team through building and deploying deep learning models.
    • Consulted as the data scientist at Meditalente GmbH, designing an end-to-end machine learning application on Google Cloud that serves predictions to the front-end team via an API for the company's UI.
    • Consulted as the senior data scientist for a fixed project at MariaDB (non-disclosable).
    Technologies: Document Processing, Custom BERT, Web Scraping, Data Science, Deep Learning, Amazon Web Services (AWS), PyTorch, Natural Language Processing (NLP), Pandas, Machine Learning, TensorFlow, Keras, Scikit-learn, Python, Artificial Intelligence (AI)
  • CEO (Previously Chief Data Scientist)

    2017 - 2019
    Sigmai
    • Led a team of 15 data scientists, linguists, software engineers, product managers, and sales professionals.
    • Focused primarily on deep learning for text classification with Keras and TensorFlow and its integration within a rule-based NLP system.
    • Developed an out-of-memory document clustering system to allow the clustering of billions of news articles.
    • Built a natural language processing (NLP) system that rivaled the best NLP companies in finance, and led to data trials with some of the largest fund managers.
    • Led and oversaw the Newsful application, which was shortlisted for the 2018 SIIA CODiE Award. The business operations were acquired by Commetric.
    Technologies: Document Processing, Custom BERT, Data Science, Deep Learning, Amazon Web Services (AWS), Natural Language Processing (NLP), Pandas, Machine Learning, R, TensorFlow, Keras, Python, Artificial Intelligence (AI)
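The out-of-memory clustering bullet above relies on a standard idea: stream documents through a sequential (mini-batch-style) k-means update so that only the k centroids, not the corpus, are held in memory. This is a toy sketch of that idea, not the proprietary system; a real pipeline would stream TF-IDF or embedding vectors from disk.

```python
def stream_kmeans(vectors, k):
    """Single-pass sequential k-means: O(k) memory regardless of input size.

    vectors: iterable of equal-length numeric lists ("document" vectors).
    Returns the k centroids after one streaming pass.
    """
    centroids, counts = [], []
    for v in vectors:
        if len(centroids) < k:          # seed centroids from the first k points
            centroids.append(list(v))
            counts.append(1)
            continue
        # assign the point to its nearest centroid (squared Euclidean distance)
        j = min(range(k),
                key=lambda i: sum((c - x) ** 2 for c, x in zip(centroids[i], v)))
        counts[j] += 1
        eta = 1.0 / counts[j]           # running-mean step size
        centroids[j] = [c + eta * (x - c) for c, x in zip(centroids[j], v)]
    return centroids
```

The running-mean update keeps each centroid equal to the mean of the points assigned to it so far, which is what lets billions of articles be clustered without ever materializing them in memory at once.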
  • Contract Data Scientist

    2017 - 2018
    • Designed an end-to-end machine learning application using Google Cloud to serve as an API for the front end team.
    • Matched candidates with businesses by utilizing staff demographic data, historical job data, and interview transcripts.
    Technologies: Data Science, Pandas, Machine Learning, Google Cloud Platform (GCP), Python, Artificial Intelligence (AI)
  • Data Scientist

    2016 - 2018
    • Built analytical tools and ETL pipelines in Spark on AWS.
    • Built predictive tools for targeting audiences for specific ad campaigns.
    • Developed interactive data applications for product owners using Python and R (Shiny) to automate time-consuming analysis tasks (customer journeys, return on ad spend).
    • Developed a system to optimize how ads are placed within the search and recommendation engine to reduce lost revenue due to poor ad placement by up to $0.5 million USD per month.
    • Designed a system for determining the causal impact of multiple concurrent ad campaigns (off-site, on-site, banner ads, and full-page ads) using regression and Bayesian time-series models.
    Technologies: Amazon Web Services (AWS), Data Science, Pandas, Machine Learning, Scikit-learn, R, Spark, Python
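The campaign-impact work above follows a common counterfactual pattern: model the pre-campaign period, project it into the campaign period, and read the lift as actual minus projected. The production system used regression and Bayesian time-series models; this ordinary-least-squares sketch (with made-up numbers, not client data) only illustrates the idea.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

def campaign_lift(series, start):
    """Estimate total lift of a campaign beginning at index `start`.

    Fits a linear trend on series[:start] (pre-period), extrapolates it
    as the counterfactual, and sums actual minus counterfactual.
    """
    a, b = fit_line(list(range(start)), series[:start])
    counterfactual = [a * x + b for x in range(start, len(series))]
    return sum(y - c for y, c in zip(series[start:], counterfactual))
```

With concurrent campaigns, the real models add regressors per campaign and a Bayesian treatment of uncertainty, but the actual-vs-counterfactual comparison is the same.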


  • NHL Systematic Betting

The goal of this project was to predict the outcomes of hockey games and to find systematic errors in the betting odds of traditional providers such as Bet365 and Pinnacle.

The system approached parity with the best provider in January 2019 and surpassed it in February.

Put into production in March 2019 and traded successfully over the following 12 months, the system returned over 900% to investors.

  • Newsful

Sigmai’s proprietary NLP technology enabled our customers to be the first to act on breaking news and allowed us to build an extensive archive of corporate history.

    Newsful was a demonstration of that technology, and was shortlisted for a CODiE Award in the Best Business Intelligence Reporting & Analytics category.

  • Sigmai: Skynet Natural Language Processing System

    Lead author of a proprietary end-to-end deep-learning-based NLP system for classifying text.

    Achieved state-of-the-art results for entity-based classification (classifying text as it relates to a specific entity in the text).

  • BERT in Natural Language Processing (Talk)

In this talk, "Using BERT to Accelerate Natural Language Processing," I explain why data collection is burdensome, time-consuming, expensive, and the number-one limiting factor for successful NLP projects. Preparing data, building resilient pipelines, choosing among hundreds of preparation options, and getting "model ready" can easily take months of effort, even with talented machine learning engineers. Training and optimizing deep learning models then requires a combination of intuitive understanding, technical expertise, and persistence. Google's BERT (Bidirectional Encoder Representations from Transformers) enables a form of transfer learning that requires far less data and training time, making world-class models achievable for any data scientist rather than solely the domain of large companies with large budgets.

  • For a successful natural language processing project, collecting and preparing data, building resilient pipelines, and getting "model ready" can easily take months of effort even with the most talented engineers. But what if we could reduce the data required to a fraction? In this article, we’ll cover how transfer learning is making world-class models open source and introduce BERT (bidirectional encoder representations from transformers). BERT is the most powerful NLP “tool” to date. We’ll explore how it works and why it will change the way companies execute NLP projects.
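The transfer-learning idea behind the talk and article above can be shown with a deliberately tiny sketch. This is not BERT and not code from any project here: a stand-in "pretrained" feature extractor is frozen and reused, and only a small task head is trained on a handful of labeled examples. All names, features, and word lists below are hypothetical.

```python
def pretrained_features(text):
    """Stand-in for a frozen pretrained encoder: cheap surface features."""
    words = text.lower().split()
    return [len(words),
            sum(w in {"good", "great", "love"} for w in words),
            sum(w in {"bad", "awful", "hate"} for w in words)]

def fit_head(examples, labels, epochs=200, lr=0.1):
    """Train a perceptron-style classification head on frozen features."""
    feats = [pretrained_features(t) for t in examples]
    w, b = [0.0] * len(feats[0]), 0.0
    for _ in range(epochs):
        for x, y in zip(feats, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(head, text):
    w, b = head
    x = pretrained_features(text)
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
```

With BERT, the frozen extractor is a large transformer pretrained on billions of words, which is why a few hundred labels can suffice where classical pipelines needed tens of thousands; the fine-tuning step plays the role of `fit_head` here.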


  • Languages

    Python, SQL, Bash
  • Libraries/APIs

    Scikit-learn, PyTorch, Keras
  • Paradigms

    Data Science
  • Industry Expertise

    Project Management
  • Other

    Document Processing, Custom BERT, Artificial Intelligence (AI), Regression, Classification, Machine Learning, Deep Learning, Natural Language Processing (NLP), Feature Engineering, Time Series Analysis, Data Cleaning, AWS, Fairseq, Web Scraping, Ensemble Methods, Attribution Modeling, Product Management, Data Visualization
  • Platforms

    Amazon Web Services (AWS), Linux, Google Cloud Platform (GCP)
  • Tools



  • Bachelor's Degree in Mechanical Engineering
    2007 - 2012
    University of Alberta - Edmonton, Alberta, Canada
