Yunus Yünel
Verified Expert in Engineering
AWS Solutions Architect, Data Engineer, and DWH Developer
Yunus is a data engineer who sees data as Lego: meaningless on its own, but valuable once the pieces are put together. He excels at automation, data warehousing, BI, and big data projects, cares about modularity in his scripts, and delivers bespoke solutions for clients. For a bank, Yunus developed a loyalty score project, moved semi-structured mainframe log files into a big data environment with PySpark, and maintained an ETL system of around 7,000 jobs (mostly Ab Initio).
Preferred Environment
Google Cloud Platform (GCP), Amazon Web Services (AWS), Python, SQL, Apache Airflow, Apache Hive, ETL, Apache Beam, Spark, Apache Kafka
The most amazing...
...system I implemented was a centralized data delivery infrastructure for sharing knowledge between teams and reducing similar tasks.
Work Experience
Senior Data Engineer
GfK - Growth from Knowledge
- Oversaw the development of a new product for running machine learning algorithms.
- Built a science platform for data scientist teams.
- Designed and implemented new data ingestions for the data scientist team.
- Created an impact analysis report for the operations team and prepared dashboards in Google Data Studio for their analysis.
- Gathered data from different on-premises databases into GCP and prepared reports tracking the production timeline.
Consultant
Data Reply
- Prepared the data platform for regulatory requirements using Ab Initio (GDE, BRE, EME) and Oracle DB.
- Maintained three existing and two new Cloudera Hadoop clusters with Ansible scripts. Kept the clusters up to date with the latest versions and secured them with Kerberos, TLS/SSL, and Sentry.
- Added new services to the Cloudera clusters (Cloudera Data Science Workbench, Sentry) and installed new modules (Spark).
Data Engineer
Hopi
- Designed and implemented new projects in the data warehouse with Talend and Informatica.
- Maintained data flows from Kafka to Hive, Oracle, and PostgreSQL with Storm and HiveQL.
- Maintained hourly, daily, and monthly jobs in Informatica, Python, and HiveQL.
DWH and Big Data Developer
Kredi Kayıt Bürosu (Credit Bureau)
- Designed and implemented new projects in a data warehouse system with Informatica.
- Developed a loyalty score project. Calculated a score based on how loyal customers were to their banks.
- Designed and implemented a factoring data mart. Gathered reporting requirements and built the project end to end.
- Moved a semi-structured mainframe log file to a big data environment with PySpark.
BI Consultant – Certified Ab Initio Technician
i2i Systems
- Designed and developed an automated data migration and data masking tool in Ab Initio. Gathered metadata from Ab Initio EME and implemented a data masking algorithm with PDL.
- Redesigned and re-engineered a revenue Datamart with PL/SQL. Re-implemented Ab Initio graphs in PL/SQL.
- Maintained an ETL system of around 7,000 jobs (mostly Ab Initio).
- Improved the performance of tasks in Ab Initio and PL/SQL.
- Led a team of five. Managed the team's communication with customers and upper management and participated in hiring for this and other teams in the company.
Experience
Automated Data Migration and Data Masking Tool
Science Platform
Education
Master's Degree in Engineering Management
Bahcesehir University - Istanbul, Turkey
Bachelor's Degree in Mathematics Engineering
Istanbul Technical University - Istanbul, Turkey
Certifications
Certified SAFe 4 Practitioner
Scaled Agile
CCA Administrator
Cloudera
AWS Certified Solutions Architect - Associate
AWS
Ab Initio Technician
Ab Initio
Skills
Libraries/APIs
PySpark
Tools
Apache Airflow, Ab Initio, Git, Cloudera, RabbitMQ, Jenkins, Ansible, Impala, Apache ZooKeeper, Amazon Elastic MapReduce (EMR), Amazon Elastic Container Registry (ECR), Amazon Simple Notification Service (Amazon SNS), Amazon Simple Queue Service (SQS), BigQuery, Apache Beam, Composer
Paradigms
ETL, Agile
Storage
Databases, Oracle PL/SQL, Apache Hive, Microsoft SQL Server, PostgreSQL, Data Pipelines, Google Cloud, HBase, Amazon S3 (AWS S3), Redshift, Google Cloud Storage
Languages
SQL, Python, Java
Platforms
Oracle, Google Cloud Platform (GCP), Amazon Web Services (AWS), Talend, Apache Kafka, Hortonworks Data Platform (HDP), Kubernetes, Docker, Amazon EC2
Industry Expertise
Project Management
Frameworks
Spark, YARN, Hadoop, Scaled Agile Framework (SAFe)
Other
Analytical Thinking, Data Modeling, Algorithms, Data Transformation, Data Engineering, Data, ETL Development, Data Warehousing, Data Warehouse Design, Informatica, Software Development, Google BigQuery, Catalog Data Entry Services, Data Architecture, Google Data Studio, Human Resources (HR), Entrepreneurship, Data Mining, Finance, Data Encryption, Metadata, Kerberos, SSL Certificates, Pub/Sub, Google Pub/Sub, Scrum Master