Mohamed Ahmed

Verified Expert in Engineering

Big Data Architect and Back-end Developer

Berlin, Germany

Toptal member since November 4, 2022

Bio

Mohamed is a big data platform architect with 15 years of experience in the IT industry. He excels in distributed systems, data engineering, machine learning, and DevOps. Mohamed builds robust batch and real-time data platforms for stakeholders and migrates companies to state-of-the-art data infrastructure. He has designed GDPR-compliant data lakes and optimized an ETL pipeline that saved the client hundreds of thousands of dollars annually. Mohamed is pragmatic and has an agile mindset.

Portfolio

Mobile.de
Big Data Architecture, Scala, Apache Kafka, BigQuery, Cloud Engineering, Python...
Careem Networks FZ
Big Data Architecture, Apache Airflow, Scala, Python, Apache Livy...
Searchmetrics GmbH
AWS, RabbitMQ, Big Data Architecture, Apache Zeppelin, Java, Apache Kafka...

Experience

Availability

Part-time

Preferred Environment

Linux, Slack, Jira, GitHub, Google Cloud Platform (GCP), Apache Kafka, Data Engineering, Amazon Web Services (AWS), Big Data, Big Data Architecture

The most amazing...

...project I've delivered required tuning a microservice to serve over 100,000 requests per second at 20 milliseconds per request.

Work Experience

Big Data Architect

2020 - PRESENT
Mobile.de
  • Designed, led, and guided the company's data platform migration from on-premises infrastructure to the public cloud (GCP), saving stakeholders 30% of their daily business hours.
  • Designed and built a data lake that complies with GDPR requirements with minimal impact on downstream users.
  • Planned and monitored the GCP budget across dozens of projects, resulting in accurate spend tracking.
Technologies: Big Data Architecture, Scala, Apache Kafka, BigQuery, Cloud Engineering, Python, Delta Lake, Data Processing, Apache, GDPR, Apache Airflow, Google Cloud SQL, Google Cloud Functions, Kubernetes, Catalog Data Entry Services, Apache Cassandra, PostgreSQL, Cloud Run, Apache Flume, Linux, Spark Streaming, MongoDB, Data Engineering, Database, ETL, ELT, SQL, Jira, GitHub, Redshift, Pub/Sub, Data Modeling, Data Architecture, Spark, User-defined Functions (UDF), Data Science, Data Strategy, Databricks, Snowflake, Architecture, Data-driven Design, NoSQL, Technical Program Management, AWS Glue, Apache Maven, Solution Architecture, Data Build Tool (dbt), Google BigQuery, Cloud Migration, Data Migration, Google Cloud Development, PySpark, DevOps, Infrastructure as Code (IaC), Data Analysis, Data Lake Design, Google Cloud Storage

Staff Data Engineer

2018 - 2020
Careem Networks FZ
  • Developed an ETL framework that automates data processing pipelines and runs hundreds of ETL jobs daily.
  • Optimized an ETL pipeline, which saved hundreds of thousands of dollars yearly.
  • Established a data academy to help full-stack engineers grow their data engineering knowledge.
Technologies: Big Data Architecture, Apache Airflow, Scala, Python, Apache Livy, Jupyter Notebook, Apache Zeppelin, Amazon Elastic MapReduce (EMR), Hadoop, Presto, Apache, Data Processing, Linux, Data Engineering, Database, ETL, ELT, Amazon S3, Data Structures, SQL, Deep Learning, Jira, GitHub, Data Modeling, Data Architecture, Spark, User-defined Functions (UDF), Pandas, AWS, AWS Lambda, Architecture, Data Integration, Database Modeling, NoSQL, Apache Maven, Solution Architecture, EMR, PySpark, AWS Certified Solution Architect, Amazon Virtual Private Cloud (VPC), Machine Learning Operations (MLOps), DevOps, Amazon Kinesis, Data Lake Design

Senior Software Engineer, Big Data

2017 - 2018
Searchmetrics GmbH
  • Designed and developed a configurable data pipeline for billions of messages and records.
  • Devised the ETL framework to work with many sources and sinks.
  • Presented new technologies and discussed them with my team.
Technologies: AWS, RabbitMQ, Big Data Architecture, Apache Zeppelin, Java, Apache Kafka, Spark Streaming, Apache, MySQL, Hadoop, Scala, Spring Boot, Data Processing, Linux, Data Engineering, Database, ETL, ELT, Amazon S3, Data Structures, Amazon Elastic MapReduce (EMR), SQL, Jira, GitHub, Data Modeling, Spark, User-defined Functions (UDF), Data Integration, Microservices Development, NoSQL, Apache Maven, EMR, RESTful Microservices, AWS Certified Solution Architect, Amazon EC2, Amazon Virtual Private Cloud (VPC), DevOps, Data Lake Design

Senior Software Engineer, Big Data

2015 - 2017
Agoda
  • Developed and tuned recommendation microservices to accept millions of requests per second with a 99.99% success rate and a 20-millisecond round trip per request.
  • Designed and developed a reactive DAG framework to build any logical flow over Akka actors and futures.
  • Assisted the data scientist team with ETL pipelines to apply ML offline training.
Technologies: Big Data Architecture, Hadoop, Scala, Akka, Data Processing, Apache Cassandra, PostgreSQL, Linux, Data Engineering, Database, ETL, ELT, Data Structures, SQL, Jira, Django, HDFS, Data Modeling, Spark, User-defined Functions (UDF), APIs, Data Integration, Microservices Development, Apache Maven, RESTful Microservices, Machine Learning Operations (MLOps)

Back-end Specialist

2014 - 2015
CIT Global
  • Developed a logging service that tracks all app actions on MongoDB with AspectJ.
  • Developed an e-payment workflow using Mule ESB that controls payment steps.
  • Created a wallet payment microservice that transfers payments across bank accounts.
Technologies: Java, MongoDB, JAX-WS, EJB3, Hibernate, Oracle Database, SQL, Apache Cassandra, Stored Procedure, APIs, Microservices Development, Database, Apache Maven, RESTful Microservices, Back-end Developers

Senior Java Developer

2010 - 2014
E-Finance
  • Developed back-end and front-end payment services using multiple frameworks: ADF, Struts, and ICEfaces.
  • Created business reports using the Jasper Reporting tool.
  • Built and automated administration pages created from a DB ER diagram.
Technologies: Oracle ADF, Apache Struts, EJB3, JPA 2, JAX-WS, Oracle Development, Quartz, Oracle Database, Ajax, SQL, Web Services, Data Modeling, Stored Procedure, APIs, Apache Maven, Back-end Developers

Service Information Developer

2009 - 2010
HP Inc
  • Built the electronic point of sale (EPOS) client app validator using Servlet and JSP, which validates large XML files and returns invalid tags.
  • Wrote a user tutorial that guided users through new features and increased the customer acceptance rate.
  • Contributed to the internal development community that helped new users get familiar with internal tools.
Technologies: Servlet, JAX-RPC, Java API for XML Processing (JAXP), Hibernate, Ajax, Web Services, JAX-WS, APIs, Back-end Developers

Java Developer

2007 - 2009
Networks Valley
  • Created a custom payroll desktop app that handled complicated payroll logic and generated company payroll reports.
  • Devised an innovative home service that monitored smart homes and sent mobile notifications to homeowners.
  • Built a PCL interface app that controlled devices in an electricity plant.
Technologies: Servlet, Hibernate, Ajax, SQL Server, Java, SQL, JPA 2, Back-end Developers

Projects

On-premises to Public Cloud (GCP) Migration

We experienced various limitations in the private cloud, so we decided to move to the public cloud (GCP). I took the initiative to design, lead, and build the migration across the company and to make it a reference example for the new data platform infrastructure.

I collected and discussed pain points with stakeholders, created a general architecture decision record (ADR), reviewed the new design with my team and stakeholders, and collected feedback. Next, I estimated the budget and discussed it with the head of technology, removed obstacles to implementation, and modified open-source frameworks to fit our needs; for example, I added a new feature to the Apache Atlas data catalog framework to support Delta Lake. I reviewed the roadmap with my team and broke it down into epics and parallel stories, jumped in to help when blockers arose, and discussed best practices from a data point of view with different teams across the company.

Real-time Analytics Service

This real-time analytics tool extracts user-tracking metrics from event streams, driven by configurable input, using the Kafka and Spark Streaming frameworks.

I designed a real-time solution that fulfilled stakeholders' requirements, tuned the reader service to achieve latency under 100 milliseconds at the 99.99th percentile, and introduced network solutions since the project ran in a hybrid cloud environment.
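
A minimal sketch of this kind of pipeline, assuming the Spark Structured Streaming API over Kafka; the broker address, topic, and field names are hypothetical stand-ins, and the console sink is a placeholder for the real serving store:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    // Requires the spark-sql-kafka connector on the classpath.
    object TrackingMetricsJob {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("realtime-tracking-metrics")
          .getOrCreate()
        import spark.implicits._

        // Hypothetical broker and topic names; in the real service, these
        // would come from the configurable input that drives the tool.
        val events = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "user-tracking-events")
          .load()

        // Parse the raw Kafka value and count events per type in
        // one-minute windows, tolerating five minutes of lateness.
        val metrics = events
          .selectExpr("CAST(value AS STRING) AS json", "timestamp")
          .select(get_json_object($"json", "$.eventType").as("eventType"), $"timestamp")
          .withWatermark("timestamp", "5 minutes")
          .groupBy(window($"timestamp", "1 minute"), $"eventType")
          .count()

        metrics.writeStream
          .outputMode("update")
          .format("console") // placeholder sink
          .start()
          .awaitTermination()
      }
    }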

Building Marketplace Data Platform

I worked across teams in three countries to define common pain points and introduce new solutions. This included creating POCs and RFCs for new solutions, discussing them with the teams involved, collaborating with teams and project managers to set the execution plan, mentoring data academy members, and building standard tools that accelerated development time.

Keyword Ranking

I designed and developed innovative stream and batch data processing projects to improve our clients' rankings for millions of keywords across multiple search engines, countries, and languages.

I created a configurable data pipeline for billions of messages and records and designed the ETL framework to work with many sources and sinks, as sketched below. I presented new technologies and discussed them with the team, tuned the jobs to fit our cluster, reviewed the code, and took ownership of the project.
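
A hedged sketch of the source/sink abstraction such a framework might be built on; all names here (Source, Sink, Pipeline, and the example implementations) are hypothetical illustrations rather than the actual framework:

    import org.apache.spark.sql.{DataFrame, SparkSession}

    // Each source and sink is a small pluggable component, so a pipeline
    // is just a source, a transformation, and a sink wired together.
    trait Source { def read(spark: SparkSession): DataFrame }
    trait Sink   { def write(df: DataFrame): Unit }

    final class KafkaSource(brokers: String, topic: String) extends Source {
      def read(spark: SparkSession): DataFrame =
        spark.read
          .format("kafka")
          .option("kafka.bootstrap.servers", brokers)
          .option("subscribe", topic)
          .load()
    }

    final class ParquetSink(path: String) extends Sink {
      def write(df: DataFrame): Unit =
        df.write.mode("append").parquet(path)
    }

    final class Pipeline(source: Source, transform: DataFrame => DataFrame, sink: Sink) {
      def run(spark: SparkSession): Unit =
        sink.write(transform(source.read(spark)))
    }

Adding a new source or sink then means implementing one small trait, which is what makes this style of framework easy to extend.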

Reactive Framework (Jarvis)

Jarvis is a reactive DAG framework for building logical flows over Akka actors and futures. I reviewed the requirements with all teams involved to simplify and remove duplicated functionality, designed and built a DAG solution to streamline and serve our business logic units in reactive mode, and then presented the solution and helped teams adopt it.
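
The core composition idea can be illustrated with plain Scala Futures; the names and business logic below are hypothetical, and the real framework also used Akka actors:

    import scala.concurrent.{ExecutionContext, Future}
    import ExecutionContext.Implicits.global

    object DagSketch {
      // Each node is an asynchronous unit of business logic.
      def fetchUser(id: Long): Future[String]      = Future(s"user-$id")
      def fetchHistory(id: Long): Future[Seq[Int]] = Future(Seq(3, 1, 2))
      def rank(user: String, history: Seq[Int]): Future[Seq[Int]] =
        Future(history.sorted.reverse)

      // The DAG: fetchUser and fetchHistory have no dependency on each
      // other, so they are started first and run in parallel; rank waits
      // on both of their results.
      def score(id: Long): Future[Seq[Int]] = {
        val userF    = fetchUser(id)
        val historyF = fetchHistory(id)
        for {
          user    <- userF
          history <- historyF
          ranked  <- rank(user, history)
        } yield ranked
      }
    }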

Hotel Recommendation

I designed and developed an innovative project to rank hotels based on user preferences. I assisted data scientists in collecting the data for offline training and designed and built a solution that replicates the ALS model across five data centers. I created a distributed and local cache solution to hold the ML model in memory, which achieved a four-millisecond response time in the worst-case scenarios.

I developed a load balancer between servers in the same data center, applied our DAG framework (Jarvis) to build our ranking service, and tuned the microservice to accept millions of requests with a 99.99% success rate. I then audited customer interactions with the microservice for model evaluation and configured the deployment scripts for the production and staging servers.
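
A rough sketch of the local-cache idea; the model type and field names are hypothetical, and the real system replicated ALS model data across five data centers:

    import java.util.concurrent.atomic.AtomicReference

    // Hypothetical model representation holding ALS-style factors.
    final case class RankingModel(version: Long, factors: Map[Long, Array[Float]])

    // The current model sits behind an atomic reference so request threads
    // read it lock-free while a background refresher swaps in newer versions.
    final class LocalModelCache(initial: RankingModel) {
      private val ref = new AtomicReference[RankingModel](initial)

      def current: RankingModel = ref.get()

      // Called by the refresher when a newer model version arrives.
      def refresh(candidate: RankingModel): Unit = {
        ref.updateAndGet(old => if (candidate.version > old.version) candidate else old)
        ()
      }
    }

Keeping the model in local memory is what makes single-digit-millisecond response times possible, since no network hop is needed on the request path.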

Bidding Channel ROI Manager

This back-end project manages the bidding channel jobs for companies such as Google and TripAdvisor. I reviewed the design with the software architect and built the dynamic implementation for channels, accounts, and data sync, along with the Oozie and HDFS clients. I also built the migration scripts and configured the deployment environment for production.
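
The dynamic channel handling can be pictured as a registry of per-channel implementations; all names below are hypothetical illustrations, not the actual codebase:

    // Each bidding channel (e.g., Google, TripAdvisor) gets its own
    // implementation, resolved dynamically by name at job time.
    trait BiddingChannel {
      def name: String
      def syncBids(accountId: String): Unit
    }

    final class GoogleChannel extends BiddingChannel {
      val name = "google"
      def syncBids(accountId: String): Unit =
        println(s"Syncing Google bids for account $accountId")
    }

    final class TripAdvisorChannel extends BiddingChannel {
      val name = "tripadvisor"
      def syncBids(accountId: String): Unit =
        println(s"Syncing TripAdvisor bids for account $accountId")
    }

    object ChannelRegistry {
      private val channels: Map[String, BiddingChannel] =
        Seq(new GoogleChannel, new TripAdvisorChannel).map(c => c.name -> c).toMap

      def sync(channelName: String, accountId: String): Unit =
        channels.get(channelName) match {
          case Some(channel) => channel.syncBids(accountId)
          case None => throw new IllegalArgumentException(s"Unknown channel: $channelName")
        }
    }
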
Education

2002 - 2007

Bachelor's Degree in Electrical Engineering

Fayoum University - Egypt

Certifications

DECEMBER 2018 - PRESENT

Algorithms on Graphs

Coursera

JUNE 2018 - PRESENT

Deep Learning Specialization

Coursera

FEBRUARY 2017 - PRESENT

Data Structures

Coursera

JANUARY 2017 - PRESENT

Algorithmic Toolbox

Coursera

JULY 2014 - PRESENT

OCE Java EE 6 EJB 3.x (1Z0-895)

Oracle

MAY 2014 - PRESENT

OCE Java EE 6 Web Service (1Z0-897)

Oracle

NOVEMBER 2013 - PRESENT

OCE Java Persistence API 2.0 - EE 6 (1Z0-898)

Oracle

MAY 2008 - PRESENT

Sun Certified Web Component Developer SCWCD 5 (310-083)

Sun Microsystems

AUGUST 2006 - PRESENT

Sun Java5 Certified SCJP 5 (310-055)

Sun Microsystems

Libraries/APIs

PySpark, JPA 2, Spark Streaming, JAX-WS, JAX-RPC, Pandas, Quartz, Java API for XML Processing (JAXP)

Tools

Apache Airflow, Apache Maven, Apache, BigQuery, Slack Development, Jira, GitHub, Apache Zeppelin, Servlet, Amazon Elastic MapReduce (EMR), Hadoop, RabbitMQ, Apache Beam, Amazon EKS, AWS Glue, Amazon Virtual Private Cloud (VPC)

Languages

Java, Scala, SQL, HTML, Python, Snowflake, C, C#, VHDL, Pascal, Stored Procedure

Frameworks

Big Data Architecture, Spark, Hadoop, Akka, Presto, Spring Boot, Hibernate, Oracle ADF, Apache Struts, Django

Paradigms

ETL, Data-driven Design, DevOps, Microservices Development

Platforms

Cloud Engineering, Apache Kafka, AWS, Linux, Jupyter Notebook, Apache, Oracle Database, AWS Lambda, Amazon EC2, Kubernetes, Cloud Run, Databricks

Storage

Database, NoSQL, Google Cloud Development, Data Lake Design, Google Cloud Storage, Google Cloud SQL, MySQL, PostgreSQL, Hadoop, MongoDB, Oracle Development, SQL Server, HDFS, Amazon S3, Redshift, Data Integration, AWS, Database Modeling

Other

Data Processing, GDPR, Data Engineering, ELT, Data Architecture, Big Data Architecture, Architecture, Solution Architecture, Google BigQuery, Cloud Migration, Data Migration, Data Analysis, Ajax, Web Services, Algorithms, Data Structures, Graph Algorithms, Delta Lake, Google Cloud Functions, Catalog Data Entry Services, Apache Cassandra, Apache Livy, EJB3, Atlas, Pub/Sub, Data Modeling, User-defined Functions (UDF), Data Science, Data Strategy, APIs, EMR, RESTful Microservices, AWS Certified Solution Architect, Back-end Developers, Machine Learning Operations (MLOps), Microprocessors, Microcontroller Programming, Java Card OpenPlatform (JCOP), Deep Learning, Apache Flume, Technical Program Management, Data Build Tool (dbt), NVivo, Infrastructure as Code (IaC), Amazon Kinesis
