
Fernando Gargiulo

Verified Expert in Engineering

Software Developer

Location
Rome, Metropolitan City of Rome, Italy
Toptal Member Since
November 2, 2021

Fernando is a data scientist with excellent knowledge of Python and the ability to combine the predictive power of machine learning with the decision-making support of optimization and simulation techniques. Since beginning his work as a software engineer and data scientist in 2015, he has rapidly gained experience in DevOps and Agile development. Holding a Ph.D. in computational physics, he loves modeling reality by mixing scientific rigor and creativity.

Portfolio

Enel
Python 3, Docker, Bash, Git, Testing, Scikit-learn, CI/CD Pipelines...
Enel X
Python 3, Atlassian Suite, Docker, Kubernetes, CVXPY, CPLEX, Unit Testing
Enel X
Python 3, Pandas, Scikit-learn, Docker, Hadoop, Kudu, REST, Kerberos, PyPI...

Experience

Availability

Part-time

Preferred Environment

Ubuntu Linux, PyCharm, Docker, Python 3, Flask, PostgreSQL, Scikit-learn, Atlassian Suite, Pandas, Git

The most amazing...

...project I have conceived, directed, and developed is an ecosystem of services to control smart grids and balance the electric system.

Work Experience

Head of Optimization and Operation Research

2020 - 2021
Enel
  • Expanded the group from three to eleven members, selecting talented computational and data scientists (50% Ph.D., 50% MSc) from top European universities.
  • Started five macro projects together with the team, covering electric mobility, energy markets, investment budget allocation, comfort management in C&I facilities, and blockchain technology in energy sales.
  • Introduced continuous integration and deployment (CI/CD) processes powered by the Atlassian suite (Bitbucket and Bamboo), Agile methodology (Jira and Confluence), and test-driven development (Pytest, pre-commit, and merge checks); a minimal test example is sketched after this entry.
  • Curated and partially authored a series of data-scientist-oriented training courses covering computer science subjects including Linux, Git, Good Coding, Docker, and Kubernetes.
Technologies: Python 3, Docker, Bash, Git, Testing, Scikit-learn, CI/CD Pipelines, Atlassian Suite, Kubernetes, Amazon SageMaker
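
As an illustration of the test-driven style mentioned above, here is a minimal Pytest sketch; `battery_roundtrip_cost` is a hypothetical helper invented for this example, not part of any production codebase.

```python
# test_costs.py -- minimal Pytest sketch of a test-first workflow.
# `battery_roundtrip_cost` is a hypothetical helper, invented for illustration.
import pytest


def battery_roundtrip_cost(energy_kwh: float, price_eur_per_kwh: float,
                           efficiency: float = 0.9) -> float:
    """Cost of cycling `energy_kwh` through a battery with a given round-trip efficiency."""
    if not 0.0 < efficiency <= 1.0:
        raise ValueError("efficiency must be in (0, 1]")
    return energy_kwh * price_eur_per_kwh / efficiency


def test_cost_scales_linearly_with_energy():
    assert battery_roundtrip_cost(10.0, 0.2) == pytest.approx(2 * battery_roundtrip_cost(5.0, 0.2))


def test_invalid_efficiency_is_rejected():
    with pytest.raises(ValueError):
        battery_roundtrip_cost(10.0, 0.2, efficiency=0.0)
```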

Senior Computational Scientist

2019 - 2020
Enel X
  • Conceived, implemented, and released an algorithmic service that cuts costs by 10% and increases revenues by up to 2.5 times in demand-response services operated through Li-ion batteries (a simplified formulation is sketched after this entry).
  • Conceived and implemented an algorithm to achieve the economically optimal operation of a domestic smart grid comprising photovoltaic generators, static storage, and electric vehicle charging outlets.
  • Developed and deployed CI/CD pipelines for the automatic and seamless release of new versions of the aforementioned algorithmic services using Bitbucket, Docker, Bamboo, and Pytest.
Technologies: Python 3, Atlassian Suite, Docker, Kubernetes, CVXPY, CPLEX, Unit Testing
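
The optimization services above relied on CVXPY with solvers such as CPLEX; the following is a simplified, assumed formulation (not the production algorithm) of a battery cost-minimization problem: a Li-ion battery charges and discharges against hourly prices, subject to power and state-of-charge limits.

```python
# Minimal CVXPY sketch of a battery cost-minimization problem.
# All parameter values are illustrative assumptions, not Enel X data.
import cvxpy as cp
import numpy as np

T = 24                                        # hourly horizon
price = np.random.default_rng(0).uniform(0.05, 0.30, T)  # EUR/kWh, dummy prices
load = np.full(T, 5.0)                        # kW, flat site load (assumed)
p_max, capacity, soc0 = 10.0, 20.0, 10.0      # kW, kWh, kWh

charge = cp.Variable(T, nonneg=True)          # kW drawn from the grid into the battery
discharge = cp.Variable(T, nonneg=True)       # kW delivered by the battery
soc = cp.Variable(T + 1)                      # kWh state of charge

constraints = [soc[0] == soc0, soc >= 0, soc <= capacity,
               charge <= p_max, discharge <= p_max]
for t in range(T):
    constraints.append(soc[t + 1] == soc[t] + charge[t] - discharge[t])

grid_power = load + charge - discharge        # net power bought from the grid
constraints.append(grid_power >= 0)           # no export, for simplicity

cost = price @ grid_power                     # energy bill over the horizon
problem = cp.Problem(cp.Minimize(cost), constraints)
problem.solve()                               # CPLEX can be selected with solver=cp.CPLEX
print(f"Optimal daily cost: {problem.value:.2f} EUR")
```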

Data Scientist

2017 - 2019
Enel X
  • Developed an app to serve daily forecasts of several fundamental KPIs of the Italian energy markets to front-office traders, increasing yearly revenues by around €400 thousand.
  • Co-developed an app based on a neural network to forecast several fundamental KPIs of the Italian energy markets, helping define the monthly trading strategy (a minimal forecaster sketch follows this entry). This entailed deploying a substantial data-ingestion pipeline (4 GB/day) into a Kudu filesystem.
  • Helped develop a recommendation system to request optimally sized AWS EC2 instances (virtual machines) and an online alerting system for non-optimal usage of AWS services.
  • Conceived and co-implemented a library to enable seamless and secure interaction of data science applications with Kerberos-protected storage systems (e.g., Hadoop).
Technologies: Python 3, Pandas, Scikit-learn, Docker, Hadoop, Kudu, REST, Kerberos, PyPI, MongoDB, Keras
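
The KPI forecaster above was built with Keras; the sketch below shows a minimal day-ahead regression model with invented feature names, shapes, and synthetic data, purely to illustrate the approach rather than reproduce the production model.

```python
# Minimal Keras sketch of a day-ahead KPI forecaster.
# Feature layout and sizes are illustrative assumptions, not the production model.
import numpy as np
from tensorflow import keras

n_samples, n_features = 1000, 48          # e.g., 24 lagged prices + 24 lagged volumes
rng = np.random.default_rng(0)
X = rng.normal(size=(n_samples, n_features)).astype("float32")
y = rng.normal(size=(n_samples, 24)).astype("float32")  # 24 hourly KPI values to predict

model = keras.Sequential([
    keras.layers.Input(shape=(n_features,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(24),               # one output per hour of the next day
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2, verbose=0)

next_day = model.predict(X[:1])           # forecast for one new feature vector
print(next_day.shape)                     # (1, 24)
```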

Application Specialist

2016 - 2017
NCCR MARVEL
  • Co-designed the early versions of the portal aimed at assisting scientists with the full lifecycle of computational research.
  • Developed the first version of the REST API for AiiDA, an infrastructure to assist computational research with data production, management, and sharing (a minimal sketch in this spirit follows this entry).
  • Contributed to building AiiDA's testing infrastructure with automated pipelines (Travis CI).
Technologies: Python, Flask, Docker, Apache, OpenStack, PostgreSQL, SQLAlchemy, Supercomputers, Travis CI, GitHub
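
For context, here is a minimal Flask sketch in the spirit of a provenance-oriented REST API; the routes and payloads are invented for illustration and are not AiiDA's actual API.

```python
# Minimal Flask sketch of a read-only REST API over a toy provenance store.
# Endpoint names and payloads are invented; they do not mirror AiiDA's real routes.
from flask import Flask, jsonify, abort

app = Flask(__name__)

# Toy in-memory store standing in for the provenance database.
NODES = {
    1: {"id": 1, "type": "calculation", "state": "finished"},
    2: {"id": 2, "type": "structure", "formula": "Si2"},
}

@app.route("/api/v1/nodes", methods=["GET"])
def list_nodes():
    return jsonify(list(NODES.values()))

@app.route("/api/v1/nodes/<int:node_id>", methods=["GET"])
def get_node(node_id):
    node = NODES.get(node_id)
    if node is None:
        abort(404, description="node not found")
    return jsonify(node)

if __name__ == "__main__":
    app.run(debug=True)
```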

Post-doctoral Researcher

2015 - 2016
EPFL
  • Mentored one Ph.D. student and two master's students in computational physics.
  • Deployed a highly parallel application for atomistic simulations on 1024 processing cores.
  • Published three research papers in international physics journals.
Technologies: MATLAB, Supercomputers, C, Fortran, Optimization, Physics Simulations, Bash Script

Eplun

https://www.enelx.com/n-a/en/for-businesses/products/demand-response
Eplun is an ecosystem of services to manage an aggregate of distributed energy resources, such as Li-ion static batteries, photovoltaic modules, dispatchable loads, charging outlets for EVs, and more.

The core algorithm leverages the flexibility of these resources, which can be geographically separated from one another, to build a virtual power plant that provides demand-response services, thus relieving the electric system under stressful circumstances.

Its added value is its high degree of dynamism and timeliness in calculating the optimal control strategy.

I co-invented the algorithm and devised the architecture of the services and their interaction with a pre-existing set of IoT cloud services in charge of managing the edge devices. I was also the main developer in the initial stages and subsequently managed a team of two other developers.
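
As a simplified, assumed illustration of the kind of problem Eplun's core algorithm solves (not its actual formulation), the sketch below splits a requested demand-response power reduction across several geographically separated batteries at minimum activation cost.

```python
# Minimal CVXPY sketch: allocate a demand-response power reduction across
# several distributed batteries at minimum cost. Numbers are illustrative only.
import cvxpy as cp
import numpy as np

n_assets = 3
target_kw = 120.0                           # power reduction requested by the grid operator
p_max = np.array([50.0, 80.0, 60.0])        # per-asset power limits (kW)
cost_per_kw = np.array([0.10, 0.07, 0.12])  # per-asset activation cost (EUR/kW), assumed

p = cp.Variable(n_assets, nonneg=True)      # power contribution of each asset

problem = cp.Problem(
    cp.Minimize(cost_per_kw @ p),
    [cp.sum(p) == target_kw, p <= p_max],
)
problem.solve()
print("Dispatch per asset (kW):", np.round(p.value, 1))
```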

AiiDA

http://www.aiida.net
AiiDA is an open-source Python infrastructure to help researchers with automating, managing, persisting, sharing, and reproducing the complex workflows associated with modern computational science and all associated data.

I developed its REST API, helped build the testing platform, and contributed to its design.

Materials Cloud

https://www.materialscloud.org/
Materials Cloud is a web portal built to enable the seamless sharing and dissemination of resources in computational materials science, offering educational, research, and archiving tools, simulation software and services, and curated and raw data to empower data-based discovery.

I helped elaborate the original vision of the platform, develop its back end, and build its containerized deployment on OpenStack facilities.
Education

2011 - 2015

Ph.D. Degree in Computational Physics

École Polytechnique Fédérale de Lausanne (EPFL) | Swiss Federal Institute of Technology - Lausanne, Switzerland

2008 - 2010

Master's Degree in Theoretical Physics

University of Naples, Federico II - Naples, Italy

2004 - 2008

Bachelor's Degree in Physics

University of Naples, Federico II - Naples, Italy

Certifications

JULY 2021 - PRESENT

C2 Proficiency

Cambridge University Press and Assessment

DECEMBER 2015 - PRESENT

DALF C1 French

The French Ministry of Education

Libraries/APIs

Pandas, Scikit-learn, SQLAlchemy, GNU Scientific Library (GSL), LAPACK, BLAS, Keras, Flask-RESTful

Tools

Atlassian Suite, Git, PyCharm, PyPI, Docker Compose, Inkscape, DataViz, MATLAB, Amazon SageMaker, CPLEX, Kudu, Apache, Travis CI, GitHub, RabbitMQ, Apache HTTP Server

Languages

Python 3, Python, Bash, C, Bash Script, Fortran, C++

Paradigms

Unit Testing, REST, Parallel Programming, High-performance Computing, Testing

Platforms

Ubuntu Linux, Docker, Kubernetes, OpenStack

Storage

PostgreSQL, MongoDB

Frameworks

Flask, Hadoop, Angular

Other

Scientific Computing, English, French, CVXPY, Optimization, Physics Simulations, CI/CD Pipelines, Kerberos, Supercomputers, Pyomo
