Abhijeet A Mulgund, Developer in Houston, TX, United States

Abhijeet A Mulgund

Verified Expert in Engineering

Machine Learning Developer

Location
Houston, TX, United States
Toptal Member Since
February 1, 2019

Abhijeet is a data scientist and engineer with three years of experience working for companies of all sizes, from Google to startups. He specializes in machine learning and deep learning for natural language processing (NLP) and computer vision (CV), and he is well-versed in data processing libraries such as NumPy, Pandas, SciPy, and scikit-learn. Abhijeet takes pride in his clean and maintainable Python code but has also rapidly picked up languages including C++, Java, and JavaScript.

Availability

Part-time

Preferred Environment

Python, Git, Linux, Visual Studio Code (VS Code), Integrated Circuits, Circuit Design, Verilog HDL

The most amazing...

...project I've built was a toxicity classifier for Wikipedia forum comments. I designed an ensemble of over 100 models, scoring 0.9875 ROC AUC and roughly 99% accuracy.

Work Experience

Software Engineering Intern

2018 - 2018
Facebook
  • Designed and implemented a complex algorithm using fastText and deep learning to detect more than 1,000 platforms and technologies used by over 5 million small- and medium-sized business websites.
  • Processed terabytes of website data using scalable and parallelized unsupervised machine learning algorithms.
  • Enabled Facebook to target previously unknown, but popular, platforms and technologies for lucrative ads partnership integrations.
  • Utilized the HDBSCAN clustering algorithm to accurately and efficiently cluster website source keywords that signified newly detected platforms and technologies (see the sketch after this list).
  • Built an internal web app in PHP/Hack for Facebook employees to visualize newly detected platforms and technologies along with the websites that use them.
Technologies: Hack, PHP, SQL, Spark, fastText, PyTorch, Python
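
A rough sketch of the clustering step might look like the following, where website source keywords are embedded with fastText and grouped with HDBSCAN. The corpus file, keyword list, and parameters are illustrative placeholders, not Facebook's internal pipeline.

```python
# Hypothetical sketch: embed extracted website keywords with fastText, then
# cluster the embeddings with HDBSCAN so new platforms surface as clusters.
import fasttext  # pip install fasttext
import hdbscan   # pip install hdbscan
import numpy as np

# "keywords.txt" is a hypothetical file of extracted website source keywords,
# one per line, used to train unsupervised skip-gram embeddings.
model = fasttext.train_unsupervised("keywords.txt", model="skipgram", dim=100)

keywords = ["shopify", "woocommerce", "squarespace", "magento", "wixstatic"]
vectors = np.array([model.get_word_vector(w) for w in keywords])

# HDBSCAN finds variable-density clusters and marks outliers as -1, so
# previously unknown platforms show up as new, dense clusters of keywords.
clusterer = hdbscan.HDBSCAN(min_cluster_size=2)
labels = clusterer.fit_predict(vectors)
for word, label in zip(keywords, labels):
    print(word, "-> cluster", label)
```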

Data Science Intern

2017 - 2018
CS Disco
  • Prototyped state-of-the-art attentional deep learning architectures for legal NLP to help more than 400 law firms.
  • Studied and tested deep convolutional and recurrent models on the IMDb Sentiment Classification problem.
  • Deployed a Hierarchical Attention Network to classify documents in legal discovery by category and attributes.
  • Researched novel data augmentation techniques for natural language data to achieve 92% accuracy (state-of-the-art) on the IMDb Sentiment Classification problem.
  • Authored a paper detailing my novel embedding-driven data augmentation technique for natural language data (a sketch of the general idea follows this list).
Technologies: NVIDIA CUDA, TensorFlow, Keras, PyTorch, Python
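
As a rough illustration of the embedding-driven augmentation idea (not the exact method from the paper), the sketch below swaps tokens for their nearest neighbors in embedding space; the toy vocabulary and random vectors stand in for real pretrained embeddings.

```python
# Hypothetical sketch of embedding-driven text augmentation: replace some
# tokens with their nearest neighbors in embedding space.
import numpy as np

rng = np.random.default_rng(0)
vocab = ["court", "judge", "ruling", "contract", "clause", "evidence"]
emb = rng.normal(size=(len(vocab), 50))            # stand-in for GloVe/word2vec vectors
emb /= np.linalg.norm(emb, axis=1, keepdims=True)  # unit-normalize for cosine similarity
word2idx = {w: i for i, w in enumerate(vocab)}

def nearest_neighbor(word: str) -> str:
    """Return the closest other vocabulary word by cosine similarity."""
    sims = emb @ emb[word2idx[word]]
    sims[word2idx[word]] = -np.inf                 # exclude the word itself
    return vocab[int(np.argmax(sims))]

def augment(tokens, p=0.3):
    """Randomly swap in-vocabulary tokens for their nearest embedding neighbor."""
    return [nearest_neighbor(t) if t in word2idx and rng.random() < p else t
            for t in tokens]

print(augment("the judge reviewed the contract clause".split()))
```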

Software Engineering Intern

2017 - 2017
Google
  • Developed a MapReduce pipeline in C++ to run simulations of Google’s Dynamic Search Ads product across thousands of nodes (a simplified Python illustration of the pattern follows this list).
  • Enabled Google’s Dynamic Search Ads to grow and improve through simulations, with projected revenue growth in the millions.
  • Studied TensorFlow under the developers of the library in special classes offered at Google.
  • Worked in a complex codebase of nearly 2 billion lines of code without breaking any existing functionality of Google Dynamic Search Ads.
Technologies: MapReduce, C++
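
The MapReduce pattern itself can be illustrated with a toy Python sketch: each map call runs one hypothetical per-query simulation, and the reduce step aggregates projected revenue. This only shows the paradigm and bears no relation to Google's internal C++ infrastructure.

```python
# Toy MapReduce-style sketch: parallel "map" over simulated queries, then a
# "reduce" that aggregates the results. All inputs and numbers are made up.
from functools import reduce
from multiprocessing import Pool

def simulate(query: str) -> dict:
    """Map step: a stand-in simulation that scores one query."""
    return {"query": query, "projected_revenue": 0.01 * len(query)}

def combine(total: float, result: dict) -> float:
    """Reduce step: sum projected revenue across simulations."""
    return total + result["projected_revenue"]

if __name__ == "__main__":
    queries = ["running shoes", "cheap flights", "crm software"]  # hypothetical inputs
    with Pool() as pool:
        results = pool.map(simulate, queries)   # map phase, in parallel
    print("Total projected revenue:", reduce(combine, results, 0.0))
```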

Kaggle Toxic Comment Classification

https://github.com/abhmul/toxic-comments
I built a superlearner ensemble of over 100 models to classify six kinds of toxicity in Wikipedia forum comments using PyTorch. The models included deep pyramid convolutional neural networks (DPCNNs), hierarchical attention networks (HANs), recurrent neural networks (RNNs), and naive Bayes support vector machines (NB-SVMs), all trained with various combinations of Word2Vec, GloVe, and fastText word vectors. With this ensemble, I held 1st place out of more than 4,000 competing teams for over a month, scoring 0.9875 ROC AUC and roughly 99% accuracy. I also investigated novel data augmentation techniques based on text translation and embeddings: following an approach shared by fellow Kaggle competitors, I augmented my text data by using the Google Translate API to translate samples into an intermediary language and then back to English.
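
The back-translation step might look roughly like the sketch below, which uses the official Google Cloud Translation client (credentials required); the actual competition code may have used a different client, pivot languages, and batching. Round-tripping through a pivot language produces paraphrases that preserve the label while varying the wording, which is what makes it useful for augmentation.

```python
# Hypothetical sketch of back-translation augmentation via the Google Cloud
# Translation API (v2 client). Requires Google Cloud credentials to run.
from google.cloud import translate_v2 as translate

client = translate.Client()

def back_translate(text: str, pivot: str = "fr") -> str:
    """Translate English text to a pivot language and back to English."""
    pivoted = client.translate(text, source_language="en",
                               target_language=pivot)["translatedText"]
    return client.translate(pivoted, source_language=pivot,
                            target_language="en")["translatedText"]

print(back_translate("This comment is not toxic at all."))
```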

PyJet

https://github.com/abhmul/PyJet
To support my various PyTorch projects, I built a custom front end for PyTorch that behaves much like Keras on the user side, while the underlying implementation remains efficient and keeps PyTorch's API available for building dynamic neural networks. The library covers the full deep learning pipeline, including loading and processing data, data augmentation, streamlined model design and construction, and training and inference workflows. It is built on principles of functional and object-oriented programming. I am currently increasing unit and integration test coverage and completing the documentation so I can add it to the PyTorch ecosystem.
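
The general pattern, a Keras-style fit/predict wrapper around an arbitrary PyTorch module, is sketched below. This is only an illustration of the idea and is not PyJet's actual API; see the repository for the real interface.

```python
# Generic illustration of a Keras-like wrapper over PyTorch; NOT PyJet's API.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

class Trainer:
    """Hypothetical fit/predict wrapper around any nn.Module."""
    def __init__(self, model: nn.Module, loss_fn, optimizer):
        self.model, self.loss_fn, self.optimizer = model, loss_fn, optimizer

    def fit(self, x, y, epochs=5, batch_size=32):
        loader = DataLoader(TensorDataset(x, y), batch_size=batch_size, shuffle=True)
        for _ in range(epochs):
            for xb, yb in loader:
                self.optimizer.zero_grad()
                self.loss_fn(self.model(xb), yb).backward()
                self.optimizer.step()
        return self

    @torch.no_grad()
    def predict(self, x):
        return self.model(x)

# Any dynamically built PyTorch model plugs straight into the wrapper.
net = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
trainer = Trainer(net, nn.CrossEntropyLoss(), torch.optim.Adam(net.parameters()))
trainer.fit(torch.randn(128, 20), torch.randint(0, 2, (128,)), epochs=2)
```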

Deep Learning Chess AI

https://github.com/abhmul/DeepJetChess
I developed an AI to play chess using my novel Dynamic Policy Net model. The model's weights are initialized from grandmaster games on the FICS chess server, and the model then improves through self-play using the REINFORCE algorithm. The novelty lies in a design that generates a probability distribution over a variably sized action space. The overall training algorithm and design are inspired by Google DeepMind's work on the Go-playing AI AlphaGo. My deep chess bot plays at roughly 1650 Elo (top 4%) while only looking one move ahead.
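
The variable-action-space idea can be sketched as scoring only the legal moves in a position, taking a softmax over that variable-length set, and applying a REINFORCE-style update. The tiny network and feature shapes below are illustrative, not the DeepJetChess architecture.

```python
import torch
from torch import nn

# A tiny stand-in policy network; the real model is initialized from FICS
# grandmaster games before self-play.
policy = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 1))
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)

def move_distribution(move_features: torch.Tensor) -> torch.Tensor:
    """Softmax over however many legal moves exist (one feature row per move)."""
    scores = policy(move_features).squeeze(-1)   # (num_legal_moves,)
    return torch.softmax(scores, dim=0)

# One REINFORCE-style update from a finished self-play game position.
legal_moves = torch.randn(27, 64)                # e.g. 27 legal moves, 64 features each
probs = move_distribution(legal_moves)
move = torch.multinomial(probs, 1).item()        # sample a move to play
reward = 1.0                                     # +1 for a win, -1 for a loss
loss = -torch.log(probs[move]) * reward          # REINFORCE: reinforce winning moves
optimizer.zero_grad()
loss.backward()
optimizer.step()
```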

Kaggle Leaf Classification Competition

https://www.kaggle.com/abhmul/keras-convnet-lb-0-0052-w-visualization
I adapted deep learning techniques to classify leaf images into 99 species with only about ten images per class. The images were binary silhouettes capturing the shape of each leaf. Because the dataset was so small relative to the number of output classes, standard convolutional networks did not train properly, so I combined a convolutional network with shape features extracted from each image. I also used affine transforms for image augmentation to further improve the robustness of my model. My final model placed 37th out of 1,600 competitors, scoring 99.9% accuracy, and my solution was featured on Kaggle’s No Free Hunch blog for its creativity and insightfulness.
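
The two-branch idea can be sketched as a small Keras model that concatenates ConvNet features from the binary leaf image with the pre-extracted shape features before the classifier. Layer sizes here are illustrative; the linked kernel contains the actual architecture.

```python
# Illustrative two-input Keras model: image branch + numeric shape features.
from tensorflow import keras
from tensorflow.keras import layers

image_in = keras.Input(shape=(96, 96, 1))        # binary leaf silhouette
x = layers.Conv2D(8, 5, activation="relu")(image_in)
x = layers.MaxPooling2D(2)(x)
x = layers.Conv2D(32, 5, activation="relu")(x)
x = layers.MaxPooling2D(2)(x)
x = layers.Flatten()(x)

features_in = keras.Input(shape=(192,))          # pre-extracted shape descriptors
merged = layers.Concatenate()([x, features_in])
merged = layers.Dense(100, activation="relu")(merged)
merged = layers.Dropout(0.5)(merged)
output = layers.Dense(99, activation="softmax")(merged)  # 99 leaf species

model = keras.Model([image_in, features_in], output)
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
```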

Char-Word2Vec

https://github.com/abhmul/576FinalProject
My teammates and I investigated the use of recurrent and convolutional neural networks to generate word embeddings from character-level information. The characters of a word are passed as a sequence into a neural encoder to produce that word's vector. Once the word vector is obtained, the rest of the workflow mirrors Google's skip-gram Word2Vec model: we predict which words occur in the same context as the given word using a negative-sampling (noise-contrastive) loss. The recurrent neural encoders outperformed skip-gram Word2Vec on tasks that required understanding the morphological structure of words.
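
A minimal sketch of the approach, with illustrative dimensions and a toy batch in place of the project's actual setup: encode each word from its character sequence with a GRU, then train skip-gram style with negative sampling.

```python
# Sketch of a character-level word encoder trained with negative sampling.
import torch
from torch import nn
import torch.nn.functional as F

NUM_CHARS, CHAR_DIM, WORD_DIM, VOCAB = 128, 16, 50, 1000

char_emb = nn.Embedding(NUM_CHARS, CHAR_DIM)
encoder = nn.GRU(CHAR_DIM, WORD_DIM, batch_first=True)
context_emb = nn.Embedding(VOCAB, WORD_DIM)       # output embeddings for context words

def encode_word(char_ids: torch.Tensor) -> torch.Tensor:
    """Build a word vector from its character ids, shape (batch, chars)."""
    _, hidden = encoder(char_emb(char_ids))
    return hidden.squeeze(0)                      # (batch, WORD_DIM)

def negative_sampling_loss(word_vec, pos_ctx, neg_ctx):
    """Skip-gram with negative sampling: pull true contexts in, push noise out."""
    pos_score = (word_vec * context_emb(pos_ctx)).sum(-1)                  # (batch,)
    neg_score = torch.bmm(context_emb(neg_ctx), word_vec.unsqueeze(-1)).squeeze(-1)
    return -(F.logsigmoid(pos_score) + F.logsigmoid(-neg_score).sum(-1)).mean()

# Toy batch: character ids of 4 words, one true context and 5 noise contexts each.
chars = torch.randint(0, NUM_CHARS, (4, 10))
loss = negative_sampling_loss(encode_word(chars),
                              torch.randint(0, VOCAB, (4,)),
                              torch.randint(0, VOCAB, (4, 5)))
loss.backward()
```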

Switchboard Binary Neural Network Research

I researched a novel algorithm for training a non-differentiable neural network with binary activations. Due to a non-disclosure agreement, I cannot reveal specific details about the algorithm, but the work showed promise because it paralleled much of what is observed in biological brains.

Languages

Python 3, Verilog HDL, Python 2, Python, Hack, CSS, Java, C, C++, JavaScript, SQL, PHP, HTML

Libraries/APIs

PyTorch, Keras, NumPy, TensorFlow, Pandas, SciPy, Scikit-learn, React

Paradigms

Data Science, Functional Programming, Object-oriented Programming (OOP), MapReduce, Agile Software Development

Platforms

Linux, Visual Studio Code (VS Code), NVIDIA CUDA

Other

Machine Learning, Deep Learning, Natural Language Processing (NLP), Mathematics, Hackathons, Software Development, Integrated Circuits, Circuit Design, Generative Pre-trained Transformers (GPT), Computer Vision, Deep Reinforcement Learning, Reinforcement Learning, Number Theory, Discrete Mathematics, fastText, HHVM

Frameworks

Spark, Presto

Tools

Git

Storage

Apache Hive

2015 - 2019

Bachelor of Arts Degree in Mathematics

Rice University - Texas

2015 - 2019

Bachelor of Arts Degree in Computer Science

Rice University - Texas
