Abhijeet A Mulgund

Python 3 Developer in Houston, TX, United States

Member since December 14, 2018
Abhijeet is a data scientist and engineer with three years of experience working for companies of all sizes, from Google to startups. He specializes in machine learning and deep learning for natural language processing (NLP) and computer vision (CV), and he is familiar with many data processing libraries. Abhijeet takes pride in his clean and maintainable Python code and has also rapidly picked up languages including C++, Java, and JavaScript.


Work Experience

  • Facebook
    Python, PyTorch, FastText, Spark, SQL, PHP, Hack
  • CS Disco
    Python, PyTorch, Keras, TensorFlow, CUDA
  • Google
    C++, MapReduce


Skills (Years of Experience)

  • Python 3, 5 years
  • Keras, 3 years
  • Deep Learning, 3 years
  • Machine Learning, 3 years
  • Computer Vision, 2 years
  • PyTorch, 2 years
  • Natural Language Processing (NLP), 2 years
  • TensorFlow, 2 years
Location

Houston, TX, United States



Preferred Environment

Visual Studio Code, Python 3.6, Linux, Git

The most amazing...

...project I've built was a toxicity classifier for Wikipedia forum comments. I designed an ensemble of over 100 models, scoring 0.99 ROC AUC and 99% accuracy.


Experience

  • Software Engineering Intern

    2018 - 2018
    Facebook
    • Designed and implemented a complex algorithm using FastText and deep learning to detect over 1,000 platforms and technologies used by more than 5 million small and medium-sized business websites.
    • Processed terabytes of website data using scalable and parallelized unsupervised machine learning algorithms.
    • Enabled Facebook to target previously unknown, but popular, platforms and technologies for lucrative ads partnership integrations.
    • Used the HDBSCAN clustering algorithm to accurately and efficiently cluster website source keywords that signaled newly detected platforms and technologies.
    • Built an internal web app in PHP/Hack for Facebook employees to visualize newly detected platforms and technologies along with the websites using these platforms and technologies.
    Technologies: Python, PyTorch, FastText, Spark, SQL, PHP, Hack
  • Data Science Intern

    2017 - 2018
    CS Disco
    • Prototyped state-of-the-art attentional deep learning architectures for legal NLP to help more than 400 law firms.
    • Studied and tested deep convolutional and recurrent models on the IMDb sentiment classification problem.
    • Deployed a Hierarchical Attention Network to classify documents in legal discovery by category and attributes.
    • Researched novel data augmentation techniques for natural language data to achieve 92% accuracy (state of the art) on the IMDb sentiment classification problem.
    • Authored a paper detailing my novel embedding-driven data augmentation technique for natural language data.
    Technologies: Python, PyTorch, Keras, TensorFlow, CUDA
  • Software Engineering Intern

    2017 - 2017
    Google
    • Developed a MapReduce pipeline in C++ to run simulations of Google’s Dynamic Search Ads product across thousands of nodes.
    • Enabled Google’s Dynamic Search Ads to grow and improve through simulations, with projected revenue growth in the millions.
    • Studied TensorFlow under the library's developers in special classes offered at Google.
    • Worked in a complex codebase of nearly 2 billion lines of code without breaking any other functionality of Google’s Dynamic Search Ads.
    Technologies: C++, MapReduce
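The simulation pipeline in the Google role above follows the classic map-shuffle-reduce pattern. A minimal single-process sketch of that pattern in Python (the actual work was a C++ pipeline at Google scale, so this is only illustrative, using the canonical word-count example rather than ads simulation data):

```python
from collections import defaultdict

def map_phase(records, mapper):
    """Apply the mapper to each record, yielding (key, value) pairs."""
    for record in records:
        yield from mapper(record)

def shuffle(pairs):
    """Group all values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups, reducer):
    """Apply the reducer to each key's grouped values."""
    return {key: reducer(key, values) for key, values in groups.items()}

# Toy example: word counting, the canonical MapReduce demo.
def word_mapper(line):
    for word in line.split():
        yield word, 1

def count_reducer(word, counts):
    return sum(counts)

lines = ["ads search ads", "dynamic search"]
result = reduce_phase(shuffle(map_phase(lines, word_mapper)), count_reducer)
# result == {"ads": 2, "search": 2, "dynamic": 1}
```

In a real MapReduce framework the shuffle step also partitions keys across worker nodes, which is what lets the pipeline scale to thousands of machines.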


Portfolio

  • Kaggle Toxic Comment Classification (Development)

    I built a super-learner ensemble of over 100 models to classify six different kinds of toxicity in Wikipedia forum comments using PyTorch. These models included a variety of Deep Pyramid Convolutional Neural Networks (DPCNN), Hierarchical Attention Networks (HAN), Recurrent Neural Networks (RNN), and Naive Bayes Support Vector Machines (NBSVM), all trained with various combinations of Word2Vec, GloVe, and FastText word vectors. With this ensemble, I held 1st place out of more than 4,000 competing teams for over a month, scoring 0.9875 ROC AUC and roughly 99% accuracy. In addition, I investigated novel data augmentation techniques using text translation and embeddings. Learning from my fellow Kaggle competitors, I augmented my text data by using the Google Translate API to translate samples into an intermediary language and then back to English.
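The back-translation augmentation described above can be sketched as follows. Here `translate` is a stub standing in for a real translation API (the original work used the Google Translate API), so the round-trip paraphrase is faked; the structure of the augmentation loop is the point:

```python
def translate(text, src, dst):
    """Stub standing in for a real translation API call.
    A real round trip (en -> fr -> en) would return a paraphrase;
    here we just tag the text with the language pair."""
    return f"[{src}->{dst}] {text}"

def back_translate(sample, pivot="fr"):
    """Augment a text sample by translating to a pivot language and back."""
    pivoted = translate(sample, "en", pivot)
    return translate(pivoted, pivot, "en")

def augment_dataset(samples, pivots=("fr", "de", "es")):
    """Each pivot language yields one extra (paraphrased) copy per sample."""
    augmented = list(samples)
    for pivot in pivots:
        augmented.extend(back_translate(s, pivot) for s in samples)
    return augmented

data = ["this comment is fine", "this comment is toxic"]
bigger = augment_dataset(data)
# With 3 pivot languages, the dataset grows 4x: 2 originals -> 8 samples.
```

Because the paraphrases keep each sample's label, the technique multiplies the training set without any manual annotation.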

  • PyJet (Development)

    To support my various PyTorch projects, I built my own custom front end for PyTorch that behaves much like Keras on the user side. However, the underlying implementation is significantly more efficient and allows full use of PyTorch's API for building and designing dynamic neural networks. The library covers all aspects of the deep learning pipeline, including loading and processing data, data augmentation, streamlined model design and construction, and training and inference workflows. It is well planned and built on principles of functional and object-oriented programming. Currently, I am increasing my unit and integration test coverage and completing the documentation so I can add the library to the PyTorch ecosystem.
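As an illustration of the Keras-style `fit`/`predict` interface such a front end exposes (hypothetical names, not PyJet's actual API), a minimal dependency-free wrapper might look like this, with a single scalar weight standing in for a real network's parameters:

```python
class Trainer:
    """Keras-style fit/predict wrapper around a trivially simple model.

    The "model" is y = weight * x; fit() runs plain gradient descent on
    the squared error, standing in for an optimizer step on a real
    network's weights.
    """

    def __init__(self, weight=0.0, lr=0.1):
        self.weight = weight
        self.lr = lr

    def predict(self, xs):
        return [self.weight * x for x in xs]

    def fit(self, xs, ys, epochs=100):
        for _ in range(epochs):
            for x, y in zip(xs, ys):
                error = self.weight * x - y
                # Gradient of 0.5 * error**2 with respect to the weight.
                self.weight -= self.lr * error * x
        return self

# Learn y = 2x from three examples; model.weight converges to ~2.0.
model = Trainer().fit([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
```

The design point is that the user only ever touches `fit` and `predict`, while the training loop, loss, and update rule live inside the wrapper.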

  • Deep Learning Chess AI (Development)

    I developed an AI that plays chess using my novel Dynamic Policy Net model. The model's weights are initialized from grandmaster games on the FICS chess server, and the model uses the REINFORCE algorithm to learn from self-play. The novelty of the model comes from its design, which allows it to generate a probability distribution over a variably sized action space. The overall training algorithm and design are inspired by Google DeepMind's work on the Go-playing AI AlphaGo. My chess bot plays at roughly 1650 Elo (top 4%) while only looking one move ahead.
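The key mechanism described above, a probability distribution over a variably sized action space, amounts to applying a softmax to however many legal-move scores the current position produces. A minimal sketch (the scores here are arbitrary stand-ins for the network's outputs):

```python
import math

def softmax(scores):
    """Turn an arbitrary-length list of scores into a probability
    distribution. The length can differ from position to position,
    which is what lets one policy head cover a variably sized
    action space (chess positions have varying numbers of legal moves)."""
    peak = max(scores)                       # subtract the max for stability
    exps = [math.exp(s - peak) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# A position with 3 legal moves and one with 4 use the same code path.
policy_3 = softmax([1.0, 2.0, 0.5])
policy_4 = softmax([0.0, 0.0, 0.0, 0.0])   # uniform over 4 moves
```

In REINFORCE training, a move is then sampled from this distribution during self-play, and the log-probability of the sampled move is scaled by the game outcome to form the policy gradient.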

  • Kaggle Leaf Classification Competition (Development)

    I adapted deep learning techniques to classify images of leaves across 99 species with only about ten images per class. The images were binary silhouettes containing the shape of each leaf. Because the dataset was so small relative to the number of output classes, standard convolutional networks did not train properly; instead, I combined a convolutional network with shape features extracted from each image. In addition, I used affine transforms for image augmentation to further improve the robustness of my model. My final model placed 37th out of 1,600 competitors, scoring 99.9% accuracy, and my solution was featured on Kaggle’s No Free Hunch blog for its creativity and insightfulness.
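The affine augmentation mentioned above rotates, scales, and shifts each training image. A dependency-free sketch of the same transform applied to 2-D outline points of a shape (a real pipeline would apply it to pixel coordinates via an image library):

```python
import math

def affine_transform(points, angle_deg=0.0, scale=1.0, shift=(0.0, 0.0)):
    """Rotate, scale, and translate 2-D points.

    Applying several such randomly parameterized transforms to each
    training image yields extra label-preserving samples, which is the
    essence of affine data augmentation.
    """
    theta = math.radians(angle_deg)
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    dx, dy = shift
    return [
        (scale * (x * cos_t - y * sin_t) + dx,
         scale * (x * sin_t + y * cos_t) + dy)
        for x, y in points
    ]

square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
# A 90-degree rotation maps (1, 0) onto (0, 1).
rotated = affine_transform(square, angle_deg=90)
```

Because a rotated or slightly scaled leaf is still the same species, these transforms enlarge a tiny dataset without changing any labels.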

  • Switchboard Binary Neural Network Research (Other amazing things)

    I researched a novel algorithm for training a non-differentiable neural network with binary activations. Due to a non-disclosure agreement, I cannot reveal specific details about the algorithm, but the work showed promise because it paralleled much of what is observed in biological brains.

  • Char-Word2Vec (Development)

    My teammates and I investigated the use of recurrent and convolutional neural networks to generate word embeddings from character-based information. The characters of a word were passed as a sequence into a neural encoder to produce that word's vector. Once the word vector was obtained, the rest of the workflow was similar to Google's skip-gram Word2Vec model: we predicted which words occur in the same context as the given word using the word vector and a negative-sampling loss. The recurrent neural encoders outperformed skip-gram Word2Vec on tasks that required understanding the morphological structure of words.
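The training objective above, predicting context words against sampled negatives, can be sketched as the standard skip-gram negative-sampling loss. The character-level encoder that produces the word vector is elided; the vectors below are toy stand-ins:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def negative_sampling_loss(word_vec, context_vec, negative_vecs):
    """Skip-gram negative-sampling objective: push the true context's
    score up and each randomly sampled negative word's score down."""
    loss = -math.log(sigmoid(dot(word_vec, context_vec)))
    for neg in negative_vecs:
        loss -= math.log(sigmoid(-dot(word_vec, neg)))
    return loss

w = [1.0, 0.0]           # embedding from the (elided) character encoder
true_ctx = [1.0, 0.0]    # aligned with w -> small loss contribution
negs = [[-1.0, 0.0]]     # points away from w -> small loss contribution
loss = negative_sampling_loss(w, true_ctx, negs)
```

Minimizing this loss pulls a word's vector toward its true contexts and away from random words, which is what gives the character-driven embeddings their Word2Vec-like geometry.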


Skills

  • Languages

    Python 3, Python 2, Java, C, C++, JavaScript, SQL, PHP, HTML/CSS
  • Frameworks

    Machine Learning, Presto DB
  • Libraries/APIs

    PyTorch, Keras, NumPy, TensorFlow, Pandas, SciPy, Scikit-learn, React
  • Paradigms

    Data Science, Functional Programming, Object-oriented Programming (OOP), Agile Software Development
  • Platforms

    Linux, Visual Studio Code
  • Other

    Deep Learning, Natural Language Processing (NLP), Mathematics, Hackathons, Computer Vision, Deep Reinforcement Learning, Reinforcement Learning, Number Theory, Discrete Mathematics, HHVM
  • Storage

    Apache Hive


Education

  • Bachelor of Arts degree in Mathematics
    2015 - 2019
    Rice University - Texas
  • Bachelor of Arts degree in Computer Science
    2015 - 2019
    Rice University - Texas