Hanee' Medhat Shousha, Big Data Developer in Cairo, Cairo Governorate, Egypt

Member since May 13, 2016
Hanee' is a data expert who enjoys working on data analytics and segmentation to better target customers with campaigns. He is also an experienced Java developer who has built enterprise applications that interact with millions of customers daily. Hanee' also has hands-on experience with big data, Spark, and Python.

Portfolio

  • Vodafone
    Spark, Python, GeoPandas, Airflow, BigQuery, Tableau
  • Rio Tinto (via Toptal)
    Python, Django REST Framework, Kafka, Redis, MongoDB, Kubernetes
  • Orange
    Hadoop, Spark, Elasticsearch, Cassandra, MongoDB, Tableau

Experience

  • SQL, 5 years
  • Python, 4 years
  • Apache Spark, 4 years
  • Big Data, 4 years
  • Apache Kafka, 2 years
  • Tableau, 1 year
  • MongoDB, 1 year
  • Machine Learning, 1 year

Location

Cairo, Cairo Governorate, Egypt

Availability

Full-time

Preferred Environment

OS X, Linux, Git

The most amazing...

...project that I've implemented is a platform that optimized campaign scripts by analyzing user responses to identify the best way to interact with users.

Employment

  • Big Data Architect

    2019 - PRESENT
    Vodafone
    • Design data pipelines for multiple data sources.
    • Develop and implement ETL jobs.
    • Develop Transformation jobs using Spark.
    • Develop analytical jobs using PySpark.
    • Develop dashboards for business using Tableau.
    Technologies: Spark, Python, GeoPandas, Airflow, BigQuery, Tableau
  • Python Developer

    2018 - 2019
    Rio Tinto (via Toptal)
    • Built a data processing platform to process seismic events.
    • Created a RESTful API to store and retrieve seismic data and files and to trigger different processing pipelines.
    • Used Kafka as a message bus between all modules.
    • Implemented Redis as a cache for data accessed frequently by the pipeline.
    • Built an admin UI with Django to administer configurations and saved objects.
    Technologies: Python, Django REST Framework, Kafka, Redis, MongoDB, Kubernetes
  • Senior Big Data Engineer

    2017 - 2019
    Orange
    • Developed new use cases with big data technologies.
    • Built new POCs for customers deploying big data platforms in cloud environments.
    • Implemented a new centralized Elasticsearch cluster to collect metrics from all customers' servers.
    • Designed and built multiple dashboards for systems-monitoring use cases.
    • Deployed new changes and resolved incidents on big data platforms.
    • Tuned and optimized the performance of the big data platforms.
    • Managed the Hadoop cluster with all included services.
    • Developed transformation jobs using Spark.
    Technologies: Hadoop, Spark, Elasticsearch, Cassandra, MongoDB, Tableau
  • DWH and Campaigns Senior Analyst

    2014 - 2017
    Etisalat
    • Analyzed and segmented customer profiles.
    • Added new customer attributes on a campaign management application.
    • Developed new platforms and applications that interact with customers through different channels.
    • Built new ad hoc integrations between campaign management tools and new channels.
    • Created jobs for real-time campaigns to target users based on specific events.
    Technologies: Teradata, SQL, Java, Spring, IBM Streams, Campaign Management Tools
  • MIS Specialist

    2013 - 2014
    ADIB
    • Designed and implemented new database models for reporting purposes.
    • Developed extraction jobs and stored procedures.
    • Implemented Business Objects universes and developed Business Objects reports.
    • Developed custom Crystal Reports.
    • Performed data transformation.
    Technologies: Business Objects, Crystal Reports, SQL, Sybase, MS SQL Server
  • DWH Support Analyst

    2012 - 2013
    Etisalat
    • Deployed production ETL jobs, data mining, and analytic models, and fixed related issues.
    • Developed new shell scripts for automatic monitoring and alarms for production issues.
    Technologies: Teradata, Oracle, SQL, Datastage, Teradata Warehouse Miner, Unix Shell Scripting
  • Software Developer

    2011 - 2012
    ITS
    • Developed new modules in core banking applications.
    • Performed a full migration of the trade finance applications from Sybase to Microsoft SQL Server.
    • Implemented a full-service interface for a trade finance application.
    • Developed custom reports using Crystal Reports.
    Technologies: MS SQL Server, Oracle, Sybase, Java

Skills

  • Languages

    SQL, Java, Python, C++
  • Frameworks

    Apache Spark, Spring
  • Tools

    Tableau, Cloudera, Google Cloud Dataproc, Apache Impala, Apache Sqoop, Apache Avro, Qlik Sense, Grafana, IBM Infosphere (Datastage), Crystal Reports, Kibana, GIS
  • Paradigms

    ETL, Business Intelligence (BI)
  • Platforms

    Apache Kafka, Hortonworks Data Platform (HDP), Unix, Google Cloud Platform (GCP), Oracle
  • Storage

    Teradata, Apache Hive, PostgreSQL, MongoDB, MySQL, HBase, Microsoft SQL Server, Sybase, Elasticsearch, Cassandra
  • Other

    Big Data, Data Warehousing, Aprimo, Parquet, Machine Learning, Google BigQuery, SAP BusinessObjects (BO), Prometheus, Apache Flume, GeoPandas
  • Libraries/APIs

    Pandas, NumPy, D3.js, Chart.js

Education

  • Diploma degree in Business Intelligence and Software Development
    2010 - 2011
    Information Technology Institute - Cairo, Egypt
  • Bachelor of Engineering degree in Computer Engineering
    2005 - 2010
    Benha University - Banha, Egypt
Certifications
  • CCA Spark and Hadoop Developer CCA175
    DECEMBER 2017 - DECEMBER 2019
    Cloudera
