Nizar Malkiya, Developer in Paris, France

Nizar Malkiya

Verified Expert in Engineering

Data Scientist and Developer

Location
Paris, France
Toptal Member Since
May 3, 2021

Nizar is an engineer with 10+ years of experience in the research, design, and implementation of data solutions. He is passionate about crafting quality software that turns large amounts of data into clear, useful insights. Nizar excels in every phase of this work: data collection, visualization, and modeling; system architecture; algorithm design; and software deployment. His clients have spanned the healthcare, research, insurance, telecom, aerospace, advertising, consulting, and education industries.

Portfolio

Le Wagon
Python, SQL, Scraping, Statistics, Machine Learning, Deep Learning, GPT...
University of London
Python, Spark, Big Data, Hadoop, Data Science, Data Engineering
Pfizer - Manufacturing Operations Solutions
Python, SQL, Machine Learning, Data Science, Predictive Modeling, TensorFlow...

Experience

Availability

Full-time

Preferred Environment

Jupyter Notebook, Visual Studio Code (VS Code), Python, macOS

The most amazing...

...project I've done was helping an advertising company exploit big data and machine learning techniques to optimize segmentation of prospecting campaigns.

Work Experience

Data Science Teacher

2020 - PRESENT
Le Wagon
  • Presented lectures on statistics, Python, machine learning, and deep learning.
  • Supported students during their daily exercises throughout boot camp sessions.
  • Guided students during their two-week final projects, from inception to deployment.
Technologies: Python, SQL, Scraping, Statistics, Machine Learning, Deep Learning, Natural Language Processing (NLP), Generative Pre-trained Transformers (GPT), Pandas, Scikit-learn, TensorFlow, Docker, Web Scraping, Data Science, Data Analysis, Data Scraping, Data Visualization, Google Cloud Platform (GCP), Time Series, Statistical Analysis, Data Engineering, Streamlit, REST APIs

Data Science Tutor

2020 - PRESENT
University of London
  • Provided online support to students enrolled in the big data analysis module.
  • Participated in webinars to answer questions during the big data analysis module.
  • Graded coursework and exams for the big data analysis module.
Technologies: Python, Spark, Big Data, Hadoop, Data Science, Data Engineering

Data Scientist

2021 - 2022
Pfizer - Manufacturing Operations Solutions
  • Developed a time-series machine learning model in Python and TensorFlow for predictive maintenance of industrial equipment, as part of a COVID-19-related project for a global pharmaceutical company.
  • Built several web apps to analyze and visualize the available sensor data with Python and Streamlit.
  • Created preprocessing routines to collect, clean, and prepare the raw data using Python.
Technologies: Python, SQL, Machine Learning, Data Science, Predictive Modeling, TensorFlow, Streamlit, Snowflake, Amazon Web Services (AWS), Amazon SageMaker, Deep Learning, Data Visualization, Data Engineering, Time Series
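The entry above names the stack but not the approach. As a purely illustrative sketch (sensor values, window size, and threshold are all invented here, and the actual project used a learned TensorFlow model), the rolling-window anomaly baseline that typically precedes such a model looks like this in pandas/NumPy:

```python
import numpy as np
import pandas as pd

def anomaly_flags(sensor: pd.Series, window: int = 24, k: float = 3.0) -> pd.Series:
    """Flag readings deviating more than k rolling standard deviations
    from the rolling mean -- a common baseline before a learned model."""
    mean = sensor.rolling(window, min_periods=window).mean()
    std = sensor.rolling(window, min_periods=window).std()
    return (sensor - mean).abs() > k * std

# Hypothetical hourly vibration readings with one injected spike.
rng = np.random.default_rng(0)
readings = pd.Series(rng.normal(1.0, 0.05, 200))
readings.iloc[150] = 2.5  # simulated fault precursor
flags = anomaly_flags(readings)
print(flags.iloc[150])  # the injected spike is flagged
```

A baseline like this also gives the learned model something to beat and a sanity check on labeling.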

Data Scientist

2021 - 2021
Datafolio
  • Collected and integrated traffic and weather data related to road accidents, using SQL, MongoDB, Python, and Dataiku.
  • Transformed geographical time-series data to project them onto a common reference frame, using Python, Spark, and Dataiku.
  • Developed a road-risk model based on environmental conditions.
Technologies: Statistics, Machine Learning, Python, Pandas, Scikit-learn, SQL, MongoDB, Dataiku, Data Science, Data Analysis, Data Visualization, Statistical Analysis, Data Engineering, NoSQL
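A road-risk model of this kind can be sketched as a classifier over environmental features. This is a hypothetical miniature, not the Datafolio model: the features, effect sizes, and data below are all invented:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical feature matrix: [rainfall_mm, visibility_km, temp_c].
rng = np.random.default_rng(42)
n = 500
X = np.column_stack([
    rng.exponential(2.0, n),      # rainfall
    rng.uniform(0.5, 10.0, n),    # visibility
    rng.normal(12.0, 8.0, n),     # temperature
])
# Synthetic labels: accidents more likely with rain and poor visibility.
logit = 0.8 * X[:, 0] - 0.5 * X[:, 1] - 1.0
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

model = LogisticRegression().fit(X, y)
risk = model.predict_proba([[8.0, 1.0, 10.0]])[0, 1]  # heavy rain, poor visibility
print(f"estimated road risk: {risk:.2f}")
```

Logistic regression is a natural first choice here because the coefficients read directly as per-condition risk contributions.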

Senior Data Specialist

2017 - 2019
Roland Berger
  • Developed data analysis routines using Jupyter, Python, scikit-learn, and Keras. These included machine learning for explanation, prediction, and clustering; NLP for sentiment and topic extraction; and geographical data analysis.
  • Developed data visualization applications using JavaScript, Leaflet, and Vue.
  • Wrote Python scripts for scraping data from the web.
  • Led data training sessions for consultants, covering Dataiku, scraping, and SQL.
Technologies: Jupyter, Python, Scikit-learn, Keras, Machine Learning, Natural Language Processing (NLP), Generative Pre-trained Transformers (GPT), DataViz, JavaScript, Leaflet, Vue, Scraping, SQL, Web Scraping, Data Science, Data Analysis, Data Scraping, Deep Learning, Data Visualization, Time Series, Data Engineering, REST APIs, NoSQL

Software Engineer and Data Scientist

2015 - 2017
Ve Global UK
  • Built pipelines to store real-time data using Java, Spark, and HBase.
  • Created tools to perform user segmentation using Jupyter, Python, and Spark.
  • Developed APIs in C# to collect data generated by user interactions on the website.
Technologies: Java, Spark, HBase, Big Data, Machine Learning, APIs, Jupyter, Python, Data Science, Statistics, Data Visualization, Statistical Analysis, Data Engineering, NoSQL
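User segmentation of this kind boils down to aggregating per-user event counts and clustering them. The real jobs ran on Spark over HBase data; this pandas/scikit-learn version is a hypothetical miniature with invented events:

```python
import pandas as pd
from sklearn.cluster import KMeans

# Hypothetical clickstream events (the real data came from HBase via Spark).
events = pd.DataFrame({
    "user_id": [1, 1, 1, 2, 2, 3, 3, 3, 3, 4],
    "page_views": [5, 3, 8, 1, 2, 20, 15, 30, 25, 2],
    "purchase": [0, 0, 1, 0, 0, 1, 1, 0, 1, 0],
})
# Aggregate per user, then cluster into coarse segments.
profile = events.groupby("user_id").agg(
    total_views=("page_views", "sum"),
    purchases=("purchase", "sum"),
)
profile["segment"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(profile)
print(profile)
```

In Spark the same shape of job is a `groupBy` plus `agg` followed by MLlib's KMeans; the logic transfers directly.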

Software Engineer

2014 - 2015
Be-Mobile
  • Developed metrics in C# to evaluate the quality of the traffic data produced.
  • Designed data visualization tools for the quality metrics, using JavaScript and D3.js.
  • Evaluated the performance of an alternative data storage solution in Cassandra.
Technologies: C#, Statistics, JavaScript, D3.js, Cassandra, DataViz, Data Science, Statistical Analysis, Data Engineering

Software Engineer

2012 - 2014
Institut d’Astrophysique de Paris
  • Developed tools in Python to check the data model for the science ground segment of the ESA Euclid mission.
  • Built tools in JavaScript and D3.js to visualize the data model.
  • Developed a testbed in C to assess and compare the performance of Berkeley DB with Oracle and HBase storage solutions.
Technologies: Python, JavaScript, D3.js, C, Oracle, HBase, Berkeley DB, XSD, Data Visualization

Software Engineer

2011 - 2012
Sisteer
  • Co-developed Bus, a Java middleware that acted as a service bus for Sisteer's platform.
  • Developed new components for Bus based on different protocols and technologies: HTTP, FTP, XML, web services, and SQL.
  • Created graphical user interfaces for end users, using Java and Swing.
Technologies: Java, SQL, FTP, HTTP, REST APIs, SOAP

R&D Engineer

2011 - 2011
Sorbonne
  • Designed a search engine in Java, using latent semantic analysis (LSA) techniques.
  • Performed statistical analysis on a collection of one year of bank frauds, using MATLAB.
  • Designed algorithms to perform clustering on a large dataset of bank frauds, using MATLAB and Java.
Technologies: Java, MATLAB, Machine Learning, Natural Language Processing (NLP), Generative Pre-trained Transformers (GPT), Data Science
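The LSA technique behind such a search engine can be sketched in a few lines of Python with scikit-learn (the original engine was written in Java, and the toy corpus here is invented): documents are mapped to TF-IDF vectors, truncated SVD projects them into a low-rank latent space, and queries are ranked by cosine similarity in that space.

```python
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy corpus standing in for a much larger document collection.
docs = [
    "bank card fraud detection alert",
    "credit card transaction flagged as fraud",
    "weather forecast rain expected tomorrow",
]
vec = TfidfVectorizer()
X = vec.fit_transform(docs)
lsa = TruncatedSVD(n_components=2, random_state=0)
X_lsa = lsa.fit_transform(X)

# Project a query into the latent space and rank documents by similarity.
query = lsa.transform(vec.transform(["card fraud report"]))
scores = cosine_similarity(query, X_lsa)[0]
best = int(scores.argmax())
print(docs[best])
```

The payoff of the latent space is that documents can match a query through shared concepts even when they share few exact terms.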

Integration Engineer

2010 - 2010
Wyde
  • Integrated and deployed numerous new software releases.
  • Identified and corrected bugs on current releases.
  • Maintained the mapping between the current software releases and the Oracle databases.
Technologies: Oracle, Batch, Software Integration, SQL

Research Engineer

2009 - 2010
ENSTA Paris
  • Measured the transmission coefficients of the channel with a vector network analyzer (VNA).
  • Implemented signal processing and transformation, using MATLAB.
  • Performed statistical modeling (using MATLAB) of the channel’s time response as a function of different parameters, such as antenna types and positions.
Technologies: MATLAB, Statistics, Electronics, Signal Processing, Antenna Design

R&D Intern

2008 - 2008
GE Healthcare
  • Designed algorithms in C for the extraction of blood vessels from high-contrast fluoroscopic images.
  • Designed algorithms in C to register fluoroscopic images that have different contrast levels.
  • Created a POC to showcase the enhancement of fluoroscopic images in angioplasty procedures.
Technologies: C, Image Processing, Medical Imaging, Signal Processing

Vélib Hourly Visualization

https://nidata.io/vizy/velib
Vélib is a bike-sharing service in Paris.

I developed a simple web page with an interactive map that shows the location of available bikes (Vélibs) throughout the day. The data was collected from the official Vélib web page.
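The hourly view behind such a map reduces to pivoting timestamped station snapshots into a station-by-hour table. A hypothetical miniature in pandas (station names and counts invented):

```python
import pandas as pd

# Invented snapshots of available bikes per station at different hours.
snapshots = pd.DataFrame({
    "station": ["Bastille", "Bastille", "Bastille", "Louvre", "Louvre", "Louvre"],
    "hour": [8, 12, 18, 8, 12, 18],
    "bikes": [2, 7, 15, 20, 11, 4],
})
# One row per station, one column per hour -- the table the map reads from.
hourly = snapshots.pivot(index="station", columns="hour", values="bikes")
print(hourly)
```

Each cell then drives one marker state on the map for the selected hour.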

Languages

Python, SQL, Java, JavaScript, C#, C, Batch, XSD, Snowflake

Frameworks

Spark, Hadoop, Streamlit

Libraries/APIs

Scikit-learn, Pandas, D3.js, Keras, Leaflet, Vue, TensorFlow, REST APIs

Paradigms

Data Science, Real-time Systems

Platforms

Jupyter Notebook, Dataiku, Visual Studio Code (VS Code), Oracle, Docker, macOS, Amazon Web Services (AWS), Google Cloud Platform (GCP)

Storage

Databases, HBase, Berkeley DB, Cassandra, MongoDB, NoSQL

Other

Computer Science, Machine Learning, Natural Language Processing (NLP), Scraping, Web Scraping, Generative Pre-trained Transformers (GPT), Mathematics, Signal Processing, Image Processing, Deep Learning, Physics, Electronics, Automatics, Robotics, Networks, Digital Communication, Coding, RF Electronics, Security, Satellite Images, Medical Applications, Software Integration, Medical Imaging, Statistics, Antenna Design, FTP, HTTP, Big Data, APIs, Image Recognition, Data Analysis, Data Scraping, SOAP, Predictive Modeling, Data Visualization, Time Series, Statistical Analysis, Data Engineering

Tools

MATLAB, DataViz, Jupyter, Amazon SageMaker

Industry Expertise

Telecommunications

2007 - 2009

Master's Degree in Telecommunications

Télécom ParisTech - Paris, France

2006 - 2009

Master's Degree in Information and Communication Technologies

Politecnico di Torino - Turin, Italy

2002 - 2006

Bachelor's Degree in Information and Communication Technologies

Università degli Studi di Perugia - Perugia, Italy

MARCH 2018 - PRESENT

Deep Learning Specialization

Coursera
