
Suresh Kasipandy

Verified Expert in Engineering

Data Scientist and Developer

Location
Toronto, ON, Canada
Toptal Member Since
August 30, 2021

Part data scientist and part cloud solutions architect, Suresh excels at taking business problems and setting up end-to-end cloud data systems to solve them. From streaming data pipelines and data lakes to deep learning systems, Suresh uses the latest technology and cutting-edge approaches to build robust, fault-tolerant systems that help you extract business value from your data.

Portfolio

Pfizer - PGS Operations Insights
SQL, ETL, Data Pipelines, Python, Cypher, Neo4j, GraphDB, Data Engineering...
Foodhub
Python, Spark, Docker, Kubernetes, Redshift, Amazon S3 (AWS S3), AWS Lambda...
Chowmill, Inc.
Git, GitHub, Jira, Writing & Editing, APIs, Algorithms

Experience

Availability

Full-time

Preferred Environment

Jupyter Notebook, MacOS, Linux, Python, TensorFlow

The most amazing...

...thing I've built is a recommendation system for a food ordering app, providing a unique and personalized UX to every user.

Work Experience

Data Engineer

2021 - 2023
Pfizer - PGS Operations Insights
  • Designed and implemented an end-to-end data pipeline system.
  • Contributed to data engineering and API development for several high-profile projects.
  • Modeled data in relational and graph form for both warehousing and application use (a graph-loading sketch follows this entry).
Technologies: SQL, ETL, Data Pipelines, Python, Cypher, Neo4j, GraphDB, Data Engineering, Data Analytics, Data Modeling, Writing & Editing, Pandas, Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNNs), Data Reporting, Artificial Intelligence (AI), APIs, Unstructured Data Analysis, Data Migration, Databases, ETL Tools, Analytics, Reporting, Healthcare, Algorithms, Generative AI, NLU, Data Manipulation, PostgreSQL, Graph Databases, Python 3, Snowflake, K-nearest Neighbors (KNN), Data Cleaning, NumPy, Docker, Microsoft Excel
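To illustrate the graph side of that data modeling work, here is a minimal Python sketch of upserting rows into Neo4j with a Cypher MERGE statement. The connection details, node labels, and relationship type are hypothetical placeholders, not the actual Pfizer schema, and the snippet assumes the neo4j Python driver 5.x API.

# Illustrative sketch only: loading operational records into a Neo4j graph model.
# Labels, relationship types, and connection details are made-up placeholders.
from neo4j import GraphDatabase

URI = "bolt://localhost:7687"          # placeholder connection string
AUTH = ("neo4j", "password")           # placeholder credentials

MERGE_BATCH = """
UNWIND $rows AS row
MERGE (s:Site {id: row.site_id})
MERGE (b:Batch {id: row.batch_id})
MERGE (s)-[:PRODUCED {recorded_at: row.recorded_at}]->(b)
"""

def load_rows(rows):
    """Upsert a list of dicts (site_id, batch_id, recorded_at) into the graph."""
    with GraphDatabase.driver(URI, auth=AUTH) as driver:
        with driver.session() as session:
            session.execute_write(lambda tx: tx.run(MERGE_BATCH, rows=rows))

if __name__ == "__main__":
    load_rows([{"site_id": "S1", "batch_id": "B42", "recorded_at": "2022-06-01"}])

Because MERGE is idempotent on the chosen keys, the same batch can be safely replayed by the pipeline without creating duplicate nodes or relationships.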

Data Scientist

2018 - 2021
Foodhub
  • Built a Redshift data warehouse on AWS for analytics and reporting.
  • Developed ETL workflows using Python, Apache Airflow, Apache Spark, and AWS Glue.
  • Constructed a complete BI reporting suite using AWS QuickSight.
  • Created a chatbot solution using BotXO to automate customer service interactions.
  • Engineered streaming data pipelines from MySQL using AWS Kinesis and AWS Lambda.
  • Deployed a streaming data lake solution using AWS Kinesis, Apache Spark, AWS S3, and Apache Hudi (a Hudi write sketch follows this entry).
  • Built a fraud detection system to detect and flag fraudulent orders.
  • Validated the POC for Segment's customer data platform (CDP) by proving business value across several verticals, including marketing, development, and operations.
  • Deployed an in-cart recommendation engine using association analysis built on AWS S3, AWS Lambda, Apache Spark, and AWS API Gateway.
  • Built a purchase-history-based recommendation engine using NLP and graph technology on AWS Neptune (a high-performance graph database), AWS S3, AWS Lambda, Apache Spark, and AWS API Gateway.
Technologies: Python, Spark, Docker, Kubernetes, Redshift, Amazon S3 (AWS S3), AWS Lambda, MySQL, Apache Airflow, Apache Hudi, Amazon Kinesis, Amazon QuickSight, Segment, Amazon API Gateway, Amazon EC2, Amazon Elastic Container Service (Amazon ECS), Amazon EKS, Data Science, Amazon Web Services (AWS), Natural Language Processing (NLP), GPT, Generative Pre-trained Transformers (GPT), SQL, Data Engineering, Data Modeling, Data Analysis, Data Strategy, Big Data, Data Visualization, Data Analytics, Dashboards, Business Intelligence (BI), GraphDB, Writing & Editing, Deep Learning, Pandas, Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNNs), Data Reporting, Artificial Intelligence (AI), Causal Inference, ETL, APIs, Unstructured Data Analysis, Customer Journeys, User Journeys, Data Migration, Databases, ETL Tools, DevOps, Analytics, Reporting, Algorithms, NLU, Data Manipulation, Dashboard Development, PostgreSQL, Selenium, Apache Spark, PySpark, Graph Databases, Python 3, Amazon Athena, Snowflake, K-nearest Neighbors (KNN), OpenCV, You Only Look Once (YOLO), Amazon OpenSearch, Data Cleaning, NumPy, Microsoft Excel
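As a taste of the data lake work above, here is a minimal PySpark sketch of upserting change records into an Apache Hudi table on S3. The table name, record keys, columns, and S3 path are hypothetical, and the real pipeline streamed records from Kinesis before this write step; treat it as a sketch under those assumptions, not the production job.

# Minimal sketch: upsert order records into a Hudi table on S3 with PySpark.
# Assumes the Hudi Spark bundle and its recommended Spark configs are supplied
# at submit time (e.g. --packages org.apache.hudi:hudi-spark-bundle).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("hudi-upsert-sketch").getOrCreate()

# Pretend these rows arrived from the streaming source.
orders = spark.createDataFrame(
    [
        ("o-1001", "placed", "2021-05-01", "2021-05-01 10:00:00"),
        ("o-1002", "cancelled", "2021-05-01", "2021-05-01 10:05:00"),
    ],
    ["order_id", "status", "order_date", "updated_at"],
)

hudi_options = {
    "hoodie.table.name": "orders",                              # placeholder table name
    "hoodie.datasource.write.recordkey.field": "order_id",      # primary key for upserts
    "hoodie.datasource.write.partitionpath.field": "order_date",
    "hoodie.datasource.write.precombine.field": "updated_at",   # latest record wins
    "hoodie.datasource.write.operation": "upsert",
}

(
    orders.write.format("hudi")
    .options(**hudi_options)
    .mode("append")
    .save("s3a://example-data-lake/orders")                     # placeholder bucket/path
)

The precombine field is what lets late or duplicate change events resolve to the newest version of each order in the lake.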

Web Developer

2017 - 2018
Chowmill, Inc.
  • Designed and implemented a mobile app's front-end features in React Native, including the UI, scene navigations, and push notifications.
  • Spearheaded the implementation of several features, including promo codes, address entries, and a payment flow.
  • Implemented user activity tracking and event logging using Firebase.
  • Made several UX improvements that contributed to a better user journey as evidenced by user activity tracking.
  • Utilized Git, Bitbucket, and Jira to coordinate with the team on the implementation and release of new features.
  • Identified, documented, and resolved bugs and defects.
Technologies: Git, GitHub, Jira, Writing & Editing, APIs, Algorithms

Data Analyst Intern

2016 - 2016
Triva Tek Systems
  • Created Python scripts for preprocessing and cleaning data for ad hoc analysis requests (a cleaning sketch follows this entry).
  • Built data models for new features based on requirements and business rules.
  • Analyzed historical data and then created reports on insights and trends.
  • Developed dashboards and KPI reports in Tableau to help business users monitor business efficiency.
Technologies: Python, Tableau, Data Science, SQL, Data Modeling, Data Analysis, Writing & Editing, Data Cleaning
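The preprocessing scripts mentioned above typically looked something like this pandas sketch; the file name and column names are hypothetical placeholders chosen for illustration.

# Illustrative ad hoc cleaning script; paths and columns are made up.
import pandas as pd

def clean_orders(path: str) -> pd.DataFrame:
    """Load a raw CSV extract, normalize types, and drop obviously bad rows."""
    df = pd.read_csv(path)
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    df = df.dropna(subset=["order_id", "order_date"]).drop_duplicates("order_id")
    return df

if __name__ == "__main__":
    cleaned = clean_orders("raw_orders.csv")   # placeholder file
    print(cleaned.describe(include="all"))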

Web Developer

2014 - 2014
Techguru
  • Developed the UX and front ends for websites based on client requirements.
  • Wrote database scripts as well as SQL stored procedures, functions, and triggers.
  • Conducted social media sentiment analysis and digital marketing analysis for clients, recommending KPI-driven improvements to increase website traffic and accessibility.
  • Wrote application-level code to interact with RESTful web APIs and web services using Ajax, JSON, XML, and jQuery.
Technologies: XML, JSON, SQL, Writing & Editing

Projects

End-to-end BI System

An end-to-end BI reporting system allows business users to leverage data to power their decision-making.

Using ETL pipelines (built in Python and PySpark), we moved transactional and operational data in daily and hourly batches to the data warehouse, making it available for technical users to run analytical queries on demand. We then used Amazon QuickSight to build an ecosystem of dashboards and reports around business performance metrics and KPIs, allowing business users to monitor and optimize performance.
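A minimal sketch of one such daily batch step, under assumed connection strings, table names, and S3 paths: pull the previous day's orders from MySQL over JDBC, aggregate them with PySpark, and land Parquet on S3 for loading into the warehouse.

# Illustrative daily batch ETL step; host, credentials, tables, and paths are placeholders.
# Assumes the MySQL JDBC driver is on the Spark classpath.
from datetime import date, timedelta
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-orders-etl-sketch").getOrCreate()

run_date = (date.today() - timedelta(days=1)).isoformat()

orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:mysql://db.example.internal:3306/shop")   # placeholder host/db
    .option("dbtable", f"(SELECT * FROM orders WHERE order_date = '{run_date}') AS o")
    .option("user", "etl_user")                                    # placeholder credentials
    .option("password", "***")
    .load()
)

daily_revenue = (
    orders.groupBy("restaurant_id")
    .agg(F.count("*").alias("order_count"), F.sum("total").alias("revenue"))
    .withColumn("order_date", F.lit(run_date))
)

# Staged Parquet is then COPY'd into the warehouse by a downstream Airflow task.
daily_revenue.write.mode("overwrite").parquet(
    f"s3a://example-warehouse-staging/daily_revenue/{run_date}"
)

Staging to S3 and loading with COPY keeps the warehouse ingest fast and makes each daily partition easy to re-run in isolation.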

Customer Service Chatbot

An automated chat solution was developed using Certainly (previously BotXO).

Ted, the chatbot, was built to automate 90% of customer service interactions without compromising the quality of the customer experience. It successfully handled the increased traffic of a 400% client base expansion and eliminated the need to scale the customer service team while operating at an 80% CSAT score.

Personalized Recommendations

I have increased engagement on many projects by providing personalized user experiences using a combination of recommendation systems. These systems delivered messaging based on user behavior, sorted search results, made suggestions based on previous buying preferences, and recommended items at checkout that were frequently bought together with cart items.
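The frequently-bought-together idea rests on association analysis over historical baskets. The toy sketch below scores item pairs by lift in plain Python; the baskets are made up, and the production system ran the equivalent logic with Spark over full order history.

# Toy association analysis: score item pairs by lift from a handful of fake baskets.
from collections import Counter
from itertools import combinations

baskets = [
    {"burger", "fries", "cola"},
    {"burger", "fries"},
    {"salad", "water"},
    {"burger", "cola"},
]

n = len(baskets)
item_counts = Counter(item for basket in baskets for item in basket)
pair_counts = Counter(pair for basket in baskets for pair in combinations(sorted(basket), 2))

def lift(a: str, b: str) -> float:
    """P(a and b) / (P(a) * P(b)); values above 1 mean the pair co-occurs more than chance."""
    pair = tuple(sorted((a, b)))
    return (pair_counts[pair] / n) / ((item_counts[a] / n) * (item_counts[b] / n))

def recommend(cart: set, top_k: int = 3):
    """Rank items not in the cart by their best lift against any cart item."""
    candidates = set(item_counts) - cart
    scored = [(max(lift(c, x) for x in cart), c) for c in candidates]
    return [item for _, item in sorted(scored, reverse=True)[:top_k]]

print(recommend({"burger"}))   # -> ['fries', 'cola', 'water'] on this toy data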

Fraud Detection System

A machine learning system that monitors all incoming orders for the likelihood of fraud. It flags suspicious orders and sends the details to the relevant operations team for follow-up. We also set up monitoring to track system performance and watch for data drift, indicating when any tuning or retraining was required.
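One common way to watch for data drift, sketched below under illustrative feature names and thresholds, is a population stability index (PSI) comparison between a feature's training-time distribution and its live distribution; this is an assumption-laden illustration, not the deployed monitoring code.

# PSI drift check on a single feature (order amount); data and threshold are illustrative.
import numpy as np

def psi(baseline: np.ndarray, current: np.ndarray, bins: int = 10) -> float:
    """Population Stability Index between two samples of one feature."""
    edges = np.quantile(baseline, np.linspace(0, 1, bins + 1))
    b_pct = np.histogram(baseline, bins=edges)[0] / len(baseline)
    c_pct = np.histogram(np.clip(current, edges[0], edges[-1]), bins=edges)[0] / len(current)
    b_pct, c_pct = np.clip(b_pct, 1e-6, None), np.clip(c_pct, 1e-6, None)
    return float(np.sum((c_pct - b_pct) * np.log(c_pct / b_pct)))

rng = np.random.default_rng(0)
baseline_amounts = rng.gamma(2.0, 20.0, 10_000)   # order values seen at training time (simulated)
live_amounts = rng.gamma(2.0, 26.0, 5_000)        # today's order values, shifted upward (simulated)

score = psi(baseline_amounts, live_amounts)
if score > 0.2:                                   # common rule-of-thumb alert threshold
    print(f"PSI {score:.2f}: order-amount distribution has drifted; review the model")

A scheduled check like this gives an early signal that the fraud model is scoring traffic it was not trained on, before precision or recall visibly degrades.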

Languages

Python, SQL, Python 3, Cypher, Snowflake, XML

Libraries/APIs

TensorFlow, Scikit-learn, PySpark, Pandas, OpenCV, NumPy

Tools

Amazon QuickSight, Git, GitHub, Jira, Tableau, Apache Airflow, Amazon Elastic Container Service (Amazon ECS), Amazon Athena, Microsoft Excel, Amazon EKS, You Only Look Once (YOLO), Amazon OpenSearch

Paradigms

Data Science, ETL, Business Intelligence (BI), DevOps

Platforms

Jupyter Notebook, Amazon Web Services (AWS), MacOS, Linux, Docker, AWS Lambda, Amazon EC2, Databricks, Kubernetes

Storage

Data Pipelines, Graph Databases, Data Lakes, Redshift, Amazon S3 (AWS S3), MySQL, Neo4j, Databases, PostgreSQL, JSON

Other

Machine Learning, Data Mining, Natural Language Processing (NLP), Data Engineering, Data Analysis, Data Analytics, GPT, Generative Pre-trained Transformers (GPT), APIs, Unstructured Data Analysis, ETL Tools, Data Manipulation, Stochastic Modeling, Organization, Time Series Analysis, Apache Hudi, Amazon Kinesis, Segment, Chatbots, Association Rule Learning, Recommendation Systems, Data Modeling, Data Strategy, GraphDB, Dashboards, Data Visualization, Big Data, Writing & Editing, Deep Learning, Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNNs), Data Reporting, Artificial Intelligence (AI), Customer Journeys, User Journeys, Data Migration, Analytics, Reporting, Algorithms, NLU, Dashboard Development, K-nearest Neighbors (KNN), Data Cleaning, User Experience (UX), Amazon API Gateway, Causal Inference, Generative AI, Delta Lake, Delta Live Tables, Production

Frameworks

Spark, Apache Spark, Selenium, Data Lakehouse

Industry Expertise

Healthcare

Education

2015 - 2016

Master's Degree in Data Science

University of Southern California - Los Angeles, CA, United States

2010 - 2013

Bachelor's Degree in Computer Engineering

Caledonian College of Engineering - Muscat, Oman

Certifications

NOVEMBER 2023 - NOVEMBER 2025

Databricks Certified Data Engineer Associate

Databricks

MARCH 2021 - MARCH 2026

Senior Data Scientist (SDS)

Data Science Council of America (DASCA)
