Big Data Architect | 2019 - Present | Vodafone
Technologies: Spark, Python, GeoPandas, Airflow, BigQuery, Tableau, Scala, NiFi, Beam
- Designed data pipelines for different data source types using GCP cloud technologies.
- Developed and implemented ETL jobs using Spark.
- Transformed data sources using Spark.
- Developed and implemented analytical jobs using Spark.
- Built geospatial analysis models with Spark to perform parallel geoprocessing.
- Developed data pipelines to ingest data from on-premises clusters into a cloud data lake.
- Developed dashboards for businesses using Tableau.
- Developed and delivered use cases on on-premises clusters.
- Migrated data and jobs from on-premises to cloud clusters.
- Designed and applied data models for stores used in reporting.
Senior Python Developer | 2018 - 2019 | Rio Tinto (via Toptal)
Technologies: Python, Django REST Framework, Flask, Kafka, Redis, MongoDB, Docker, Kubernetes, Azure
- Built a data processing platform to process seismic events.
- Created a RESTful API to store and retrieve seismic data and files.
- Used Kafka as a message bus between all modules.
- Implemented Redis as a cache for data accessed frequently by the pipeline.
- Built an admin UI with Django to manage configurations and saved objects.
- Integrated the API with different processing pipeline stages to trigger synchronous and asynchronous processing of data.
- Migrated a Flask API to a Django REST Framework API.
- Worked with Docker-containerized environments for the different pipeline modules.
- Worked with automated deployment pipelines on Kubernetes.
- Developed and ran components on the Microsoft Azure cloud platform.
Senior Big Data Engineer | 2017 - 2019 | Orange Business Services
Technologies: Hadoop, Spark, NiFi, Kafka, Hive, Elasticsearch, Cassandra, MongoDB, Tableau, Power BI, Azure, AWS, Data Lake, S3
- Developed new business use cases with big data technologies.
- Created analytical and ETL jobs using Spark.
- Built data pipelines to ingest data into different data lakes like Azure DataLake.
- Developed new PoCs for customers to build big data platforms over cloud environments.
- Constructed a real-time monitoring platform covering all customers' servers hosted in the cloud.
- Implemented a centralized Elasticsearch cluster to collect metrics from all customers' servers.
- Designed and built multiple dashboards for systems monitoring use cases using Tableau and Power BI.
- Developed automated scripts and modules for most day-to-day tasks.
- Handled and optimized the performance of the big data platforms.
- Managed the Hadoop clusters with all included services.
- Led a squad for automation and self-monitoring activities.
- Upgraded the on-premises Hadoop cluster version.
- Added and managed new nodes and disks in the on-premises Hadoop cluster.
- Secured Hadoop clusters using Kerberos, Knox, and Ranger.
- Worked on different cloud platforms like Azure and AWS.
DWH and Campaigns Senior Developer | 2014 - 2017 | Etisalat
Technologies: Python, Java, JSF, Spring, Spark, Teradata, Oracle, SQL Server, SSIS, PrimeFaces
- Developed analysis and segmentation models to build customer profiles.
- Built offering and campaign applications that deliver targeted and non-targeted campaigns reaching millions of customers daily.
- Built real-time engines that serve and fulfill millions of customer requests per hour.
- Designed and developed large, complex platforms that interact with many different systems.
- Developed a real-time, location-based advertising platform that sends users offers based on their current location.
- Developed multiple data monetization solutions to be used by third-party advertisers.
- Integrated the campaign applications with multiple channels, enabling the business to reach users via their preferred channels.
- Built several web applications that let business users easily interact with the campaigns platform.
- Designed the architecture of DWH models for reporting and segmentation.
- Developed ETL and integration jobs from different sources into the DWH.
MIS Specialist | 2013 - 2014 | ADIB
Technologies: Business Objects, Crystal Reports, SQL, Sybase, MS SQL Server
- Designed and implemented new database models for reporting purposes.
- Developed extraction jobs and stored procedures.
- Implemented Business Objects universes and developed Business Objects reports.
- Developed custom Crystal Reports.
- Performed data transformation.
DWH Support Analyst | 2012 - 2013 | Etisalat
Technologies: Teradata, Oracle, SQL, Datastage, Teradata Warehouse Miner, Unix Shell Scripting
- Deployed production ETL jobs, data mining, and analytic models, and fixed production issues.
- Developed new shell scripts for automatic monitoring and alarms for production issues.
Software Developer | 2011 - 2012 | ITS
Technologies: MS SQL Server, Oracle, Sybase, Java
- Developed new modules in core banking applications.
- Handled a full migration of the trade finance applications from Sybase to SQL Server.
- Implemented a full-service interface for a trade finance application.
- Developed custom reports using Crystal Reports.