Olcay Ozyilmaz, Developer in Istanbul, Turkey

Olcay Ozyilmaz

Verified Expert in Engineering

Bio

Olcay is a data engineer with a PhD in computer science and 15 years of professional experience across extensive projects. He is a data warehouse (DWH) expert and a developer who focuses on productivity through robotics and automation. He also has strong communication, decision-making, and organizational skills, backed by a master's degree in business management. Olcay is an excellent team player and leader who takes on challenging jobs with outstanding analytical and problem-solving skills.

Portfolio

Ashish Bansal
Data Engineering, Python, Data Pipelines, ETL, Data Warehousing, Snowflake...
Zoetis - Main
Alteryx, Azure Data Factory, Azure Databricks, SQL Server 2016, Tableau, Azure...
Vodafone Turkey
Shell, Linux, Python, REST, Jenkins, Java, Oracle, Toad, Bitbucket, Jira...

Experience

Availability

Full-time

Preferred Environment

Linux, Abinitio, Python, SQL, Docker, Git, Machine Learning, Amazon Web Services (AWS), Jenkins, Atlassian, Data Warehouse Design

The most amazing...

...thing I've done as a DWH expert is build ETL and process workflows and develop DevOps CI/CD pipelines for Vodafone and SoftTech DWH reengineering projects.

Work Experience

Senior Data Engineering Consultant

2023 - 2023
Ashish Bansal
  • Reviewed the customer's ETL pipelines and provided consultancy on the products they use and on industry standards.
  • Provided consultancy on DWH standards, such as naming and coding standards.
  • Designed the most efficient solution alternatives for some of the problems they had difficulty solving.
Technologies: Data Engineering, Python, Data Pipelines, ETL, Data Warehousing, Snowflake, Airbyte

Technical ETL Documentation Writer

2022 - 2023
Zoetis - Main
  • Extracted information programmatically from a messy architecture with no coding standards or documentation, and generated well-designed diagrams and documents.
  • Inspected 9,000+ ETL pipelines across over 10 ETL platforms, including ADF, Databricks, Alteryx, SQL Server, MySQL, cloud storage, and blobs. Extracted the information from 4,000+ pipelines, 22,000+ data sources, and 30+ ETL and storage platforms.
  • Assisted them in reengineering their Alteryx and Data Factory pipelines into Databricks notebooks.
  • Extracted all the business logic inside legacy ETL pipelines. The existing architecture lacked standardization and documentation and spanned many different ETL platforms, codes, and scripts, including Alteryx, ADF, Azure Databricks, SQL, batch, shell, and Python.
Technologies: Alteryx, Azure Data Factory, Azure Databricks, SQL Server 2016, Tableau, Azure, Azure Data Lake, Python, Docker, Development, Big Data, Metadata, Productivity, Architecture, Data Warehousing, PL/SQL, Jupyter Notebook, Windows PowerShell, REST, Microsoft SQL Server, Process Design, Automation, Testing, Engineering, Regular Expressions, Regex, Scripting, CSV File Processing, CSV, Web Scraping, Microsoft Excel, APIs, Quality Control (QC), Reports, Data Pipelines, ELT, Quality Improvement, PostgreSQL, Docker Compose, Data Analysis, Data Analytics, Pipelines, Technical Writing, Databases, Data Integration, API Integration, REST APIs, Apache Spark, Consulting, Azure Data Explorer, Data Management, Data Lakes, Databricks, Data Lakehouse, Spark, Relational Databases, Big Data Architecture, Data Transformation, Technical Architecture, Monitoring, Agile, T-SQL (Transact-SQL), Database Design, OLAP, OLTP, SQL Stored Procedures, PySpark, Writing & Editing, Actuarial, Data Warehouse Design, Entity Relationships

DevOps Engineer & Architect

2022 - 2023
Vodafone Turkey
  • Created ETL workflow and processes and designed and developed a DevOps CI/CD pipeline flow for DWH.
  • Monitored Agile boards and improved delivery quality and speed by feeding Jira with automations, which also helped to increase Jira usage.
  • Improved workflow quality by developing a formal definition of "done and ready" and preparing a responsibility assignment matrix for each task in the flow.
  • Handled Bitbucket, Jira, UC4, and Confluence administration.
Technologies: Shell, Linux, Python, REST, Jenkins, Java, Oracle, Toad, Bitbucket, Jira, Jira REST API, Jira Administration, Bitbucket API, Confluence, Automic (UC4), Oracle ODI, Oracle Data Integrator (ODI), Data Migration, Migration, DevOps, Jenkins Pipeline, Process Design, Process Flow Design, Architecture, Automation, Regex, Regular Expressions, Docker, Git, SQL, Atlassian, ETL Tools, Development, Engineering, Algorithms, Big Data, ETL, Data Engineering, Data Governance, Metadata, Productivity, Scripting, CSV, Microsoft Excel, APIs, Data-level Security, Management, Business Intelligence (BI), BI Reports, Reports, Data Visualization, Data Warehousing, Oracle PL/SQL, Data Pipelines, ELT, Pipelines, GitHub, Data, NumPy, Databases, Technical Writing, PostgreSQL, Data Integration, API Integration, Postman, REST APIs, Consulting, Data Architecture, Data Management, Data Lakehouse, Relational Databases, Data Transformation, Technical Architecture, Monitoring, Agile, Data Analysis, Data Modeling, Database Design, Performance Tuning, Dynamic SQL, OLAP, SQL Stored Procedures, Writing & Editing, Actuarial, Data Warehouse Design, Entity Relationships

Data Engineer | QA Manager | DevOps Engineer

2021 - 2022
SoftTech
  • Contributed to the Boubyan Bank–SoftTech DWH reengineering project.
  • Created a process flow to eliminate responsibility gaps in the project.
  • Developed a DevOps pipeline to reduce rework costs and improve delivery quality.
  • Documented the test and deployment strategies and designed the end-to-end project flow.
  • Built Jira-integrated test and deployment automation tools and helped standardize and increase Jira usage through simplified reports and management.
  • Increased delivery speed by up to 30 times by clearly defining what is "done and ready" in the project and implementing a CI/CD pipeline.
  • Developed deployment automation pipelines for Informatica, BO, and SQL.
  • Created test execution automation and developed defect management on Jira; defects were created automatically from the test automation results.
  • Made all the automation triggerable via Jira transitions, which also improved Jira usage quality and gave the team a user-friendly interface.
  • Developed Power BI deployment automation. Defined dashboard and report test strategy and scope.
Technologies: CI/CD Pipelines, Jenkins, Jenkins Pipeline, Jira, Jira REST API, Confluence, Git, Bitbucket, Linux, Shell, Windows PowerShell, Python, REST, Microsoft SQL Server, Architecture, Process Flows, DevOps, DevOps Engineer, Process Design, Process Flow Design, Atlassian, Regex, Regular Expressions, Docker, SQL, ETL Tools, Development, Engineering, Algorithms, ETL, Data Engineering, Data Governance, Metadata, Productivity, Scripting, CSV File Processing, CSV, Informatica, Informatica ETL, Microsoft Power BI, Microsoft Excel, APIs, Data-level Security, Management, Business Intelligence (BI), Dashboards, BI Reports, Reports, Data Visualization, Data Warehousing, Oracle, Data Pipelines, ELT, Pipelines, GitHub, Data, NumPy, SQL Server Integration Services (SSIS), Databases, Technical Writing, PostgreSQL, Data Integration, API Integration, Postman, Consulting, Data Architecture, Data Management, Data Lakehouse, Relational Databases, Data Transformation, Technical Architecture, Monitoring, Agile, Data Analysis, T-SQL (Transact-SQL), Data Modeling, Database Design, Performance Tuning, Data Migration, Dynamic SQL, OLAP, SQL Stored Procedures, Writing & Editing, Actuarial, Data Warehouse Design, Entity Relationships

Data Engineer | QA Lead | Migration Manager | DevOps Engineer

2019 - 2022
Deloitte Legal International
  • Contributed to the Vodafone and Deloitte DWH reengineering project.
  • Built ETL workflow and processes, designed and developed the DevOps CI/CD pipeline and migration flows, and performed DWH modeling.
  • Managed Agile teams, reviewed designs and code, and attended architectural design meetings.
  • Developed the definition of "done and ready," improving productivity and delivery quality.
  • Created formal documents for handovers and managed Jira, Confluence, and Bitbucket repositories and boards.
  • Designed solutions to functional and non-functional requirements.
  • Participated in different areas of the project: developed migration flows to re-engineer the legacy DWH, analyzed the usage domain, and performed data modeling.
Technologies: Jira, Jira REST API, Jira Administration, Confluence, Bitbucket, Bitbucket API, Automic (UC4), Oracle, Teradata, Data Modeling, Architecture, Oracle ODI, Oracle Data Integrator (ODI), Abinitio, Data Governance, Jenkins, Jenkins Pipeline, Linux, Shell, Python, Toad, Java, Oracle SQL Developer, SQL, PL/SQL, PL/SQL Tuning, PL/SQL Developer, Oracle PL/SQL, Migration, ETL, ETL Tools, Data Engineering, Testing, DevOps, CI/CD Pipelines, Agile, Team Management, Atlassian, Automation, Data Migration, Development, Deployment, Regex, Regular Expressions, Docker, Git, Engineering, Algorithms, Big Data, Metadata, Productivity, Scripting, CSV, Microsoft Excel, APIs, Data-level Security, Management, Business Intelligence (BI), BI Reports, Reports, Data Visualization, Data Warehousing, Data Pipelines, ELT, Data Analysis, Pipelines, GitHub, Data, Databases, Technical Writing, PostgreSQL, Data Integration, API Integration, Postman, Consulting, Data Architecture, Data Management, Data Lakehouse, Relational Databases, Data Transformation, Technical Architecture, Monitoring, Database Design, Performance Tuning, Dynamic SQL, OLAP, SQL Stored Procedures, Writing & Editing, Actuarial, Data Warehouse Design, Entity Relationships

Senior ETL Developer

2016 - 2019
Turkcell
  • Reviewed designs and code and developed many productivity tools and automation.
  • Implemented the ETL process using Oracle, Ab Initio, Unix, and shell scripts.
  • Contributed to projects such as individual profitability, customer segmentation, BTK systemic review quality regularity reports, customer loyalty analysis, Siebel integration, DecisionHub, and subscription domain reengineering.
  • Developed a big data ETL using Ab Initio GDE and handled ETL maintenance, performance tuning, and resource monitoring.
  • Built continuous data workflows for streaming and developed near-real-time ETL (mini-batches) as well as batch ETL flows.
  • Optimized an ETL scheduler and tuned the performance of critical path jobs.
  • Developed several automation tools and frameworks for specific recurring requirements. With these frameworks, junior developers can create reliable ETL flows with just a few configurations.
  • Built frameworks for almost all parts of classical DWH architecture: extraction, CDC, load, recycle management, dimension management (type 1 and 2 tables), and key management.
  • Re-engineered a subscription domain while working on government-related data and error-critical projects.
  • Developed testing automation to simplify test processes and increase productivity, and coded automation to generate documents such as release notes.
Technologies: Abinitio, Big Data, SQL, Toad, Oracle, PL/SQL, PL/SQL Tuning, Oracle PL/SQL, PL/SQL Developer, Linux, Shell, Productivity, Regex, Regular Expressions, Machine Learning, Atlassian, ETL Tools, Development, Apache Kafka, Kafka Streams, Engineering, Algorithms, ETL, Data Engineering, Data Governance, Metadata, Scripting, CSV File Processing, CSV, Microsoft Excel, APIs, Data-level Security, Business Intelligence (BI), Reports, Data Visualization, Data Warehousing, Data Pipelines, ELT, Data Structures, Pipelines, Pandas, GitHub, Data, NumPy, Databases, Technical Writing, Data Integration, API Integration, REST APIs, Data Architecture, Data Management, Data Lakes, Databricks, Data Lakehouse, Relational Databases, Big Data Architecture, Data Transformation, Technical Architecture, Monitoring, Agile, Industrial Internet of Things (IIoT), Data Modeling, Database Design, Performance Tuning, Data Migration, Dynamic SQL, OLAP, SQL Stored Procedures, PySpark, Writing & Editing, Actuarial, Data Warehouse Design, Entity Relationships

Senior Analytics Consultant

2011 - 2016
i2i Systems
  • Engaged as a data engineer and Turkcell DWH roadmap team member.
  • Contributed to roadmap projects, including developing the Turkcell roadmap's ETL.
  • Developed productivity tools, including automations for the development itself.
Technologies: Abinitio, Oracle, PL/SQL, PL/SQL Tuning, Linux, Shell, Regex, Regular Expressions, SQL, Atlassian, ETL Tools, Development, Engineering, Algorithms, Big Data, ETL, Data Engineering, Productivity, Scripting, CSV File Processing, CSV, Microsoft Excel, APIs, Data-level Security, Business Intelligence (BI), Tableau, Reports, Data Visualization, Data Warehousing, Oracle PL/SQL, Data Pipelines, ELT, Data Structures, Pipelines, Pandas, GitHub, Data, NumPy, Databases, Technical Writing, Data Integration, API Integration, REST APIs, Apache Spark, Consulting, Data Architecture, Data Management, Data Lakes, Databricks, Data Lakehouse, Relational Databases, Big Data Architecture, Data Transformation, Monitoring, Data Modeling, Database Design, Performance Tuning, Data Migration, Dynamic SQL, OLAP, OLTP, SQL Stored Procedures, Writing & Editing, Actuarial, Data Warehouse Design, Entity Relationships

Developer

2010 - 2011
Pause Interactive
  • Engaged with this digital agency startup as a full-stack developer at the invitation of my university professors.
  • Performed web development and database modeling for a marketing platform.
  • Built mobile applications and a developer API, handled social network integrations, and implemented the OAuth protocol.
Technologies: X (formerly Twitter) API, Facebook API, BlackBerry, Mobile App Development, HTML, CSS, JavaScript, JavaScript Libraries, C#.NET, C#, Microsoft SQL Server, Google API, Google Maps, Maps, Bing API, Bing Maps, HTML5 Geolocation, Google Maps API, SQL, Development, Engineering, Algorithms, Productivity, Scripting, CSV File Processing, CSV, Microsoft Excel, APIs, GitHub, Data, Databases, Technical Writing, Data Integration, API Integration, REST APIs, Relational Databases, Data Transformation, Data Modeling, Database Design, OLAP, OLTP, SQL Stored Procedures, Entity Relationships

Robotics Developer

2009 - 2009
Ford Otosan
  • Programmed the robots that produce cars in the Ford Gölcük factory.
  • Tested fixture robots and developed new robot requirements.
  • Participated in robotic development training sessions and improved the production robots' performance.
Technologies: Development, Engineering, Algorithms, Productivity, Scripting, Technical Writing, Consulting, Data Transformation, OLAP, Entity Relationships

QA Management Roles

I contributed as a QA manager and lead on a few DWH reengineering projects, reviewing and improving development, deployment, test, migration, and reconciliation strategy documents.

I prepared QA strategy documents and defined the DoD (definition of done) and DoR (definition of ready) for all the tasks involved in these strategies, mapping them to eliminate quality gaps. I also prepared and reviewed RACI matrices for all tasks, eliminating responsibility gaps.

Besides attending ExCo (executive committee) meetings, I also developed end-to-end CI/CD pipelines for DWH that made it possible to:
* Validate the outputs and inputs of each task
* Increase productivity
* Reduce rework costs
* Measure every detail in the processing pipeline (if you cannot measure, you cannot improve)
* Integrate all the automation into Jira, helping to increase Jira usage and quality and allowing integration with different task management tools
* Develop frameworks for ETL pipeline development so that less experienced ETL developers can perform configuration-driven development
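The Jira integration described above included creating defects automatically from test-automation results. A minimal sketch of that idea follows; the project key, issue-type name, and field mapping are hypothetical stand-ins, not the actual project configuration, and only the payload construction is shown (the real pipelines posted it through the Jira REST API):

```python
# Sketch: turn a failed automated-test result into a Jira issue-create payload.
# Project key, issue type, and field values below are hypothetical.

def build_defect_payload(test_result, project_key="DWH"):
    """Map a test-automation result onto a Jira issue-create payload."""
    return {
        "fields": {
            "project": {"key": project_key},
            "issuetype": {"name": "Bug"},
            "summary": f"[auto] {test_result['test_name']} failed",
            "description": (
                f"Expected: {test_result['expected']}\n"
                f"Actual:   {test_result['actual']}"
            ),
        }
    }

failed = {"test_name": "fact_orders row count", "expected": 1000, "actual": 997}
payload = build_defect_payload(failed)

# Submitting it would then be a single authenticated call, e.g.:
#   requests.post(f"{JIRA_URL}/rest/api/2/issue", json=payload, auth=auth)
```

Because the payload is built by a pure function, the mapping can be unit-tested without touching a live Jira instance.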

Developer Frameworks and Automation

I developed multiple frameworks and automation for the different projects to help:

• Increase productivity
• Improve output quality and standardization
• Simplify the work of junior developers
• Decrease re-work costs

Some of the DWH-related developer frameworks that simplified the development to a few configurations through metadata-driven development include:

• An extraction framework
• A CDC framework
• Dimension management frameworks (type 1 and 2 dimension management)
• A load framework and partition maintenance
• Data clean frameworks
• Recycle management frameworks
• Automatic scheduler definition
• Test case creation automation (according to model standards)
• Test automation
• Defect management
• Operational alert management (SMTP integration)
• A reconciliation checker
• A document generator for release notes
• Code generators that produce bulk code for various purposes
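The common thread in these frameworks is metadata-driven development: a developer supplies only a configuration entry, and the framework renders the runnable artifact. A rough sketch of the extraction case, with all schema, table, column, and delta-column names invented for illustration:

```python
# Sketch: metadata-driven extraction. The developer writes configuration only;
# the framework generates the delta-extraction SQL. All identifiers here are
# invented for the example.

def generate_extract_sql(cfg):
    """Render an incremental (delta) extraction query from a config entry."""
    cols = ", ".join(cfg["columns"])
    return (
        f"SELECT {cols}\n"
        f"FROM {cfg['source_schema']}.{cfg['source_table']}\n"
        f"WHERE {cfg['delta_column']} > :last_load_ts"
    )

config = {
    "source_schema": "crm",
    "source_table": "customers",
    "columns": ["customer_id", "name", "updated_at"],
    "delta_column": "updated_at",
}

print(generate_extract_sql(config))
```

Keeping the generation logic in one place is what lets less experienced developers produce standardized ETL flows from a few configuration lines.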

Vodafone DWH Reengineering Project

As the QA lead, I participated in all design and architectural meetings, including with the project management office and executive committee. I designed the DevOps setup and defined test and deployment strategies. I also reviewed all the strategy documents and work streams. In addition, I established the project's definition of "done and ready" and planned each task's responsibility assignment matrix in the detailed strategy documents.

ANKA | Help Coordination Information System

This project won fourth place in the software design and development division of the Microsoft Imagine Cup 2009. I built the geolocation-based application and performed the database modeling. I also developed the web application with Bing Maps and Google Maps integration.

Boubyan Bank DWH Reengineering Project

I managed the end-to-end reengineering project delivery. I participated in all architectural and management meetings, including with the executive committee. I also designed the DevOps setup, the process flows, and the testing and deployment strategies.

Social Network Integrated Semantic Applications

I developed numerous Facebook, Twitter, TradingView, Binance, and map applications. I employed full-stack development and data modeling and built web and mobile applications with social network and map integrations.

Productivity Tool | Graph Builder for Ab Initio

I designed and built productivity tools that generate fully developed flows for common architectures, including solutions for recycle, type 1 and 2 dimension, extraction, delta, load, schedule, and test data management.

Productivity Tool | Test Case Creator

I developed a test case generator based on modeling name standards. With this tool, I created numerous test scenarios, such as constraint checks, profiling, and integrity tests. Using the tool together with test automation, I was able to detect over 100 defects in four hours.
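The generator worked from column naming conventions, so the idea can be sketched as follows. The suffix rules here (`_ID` implying a mandatory key, `_AMT` implying a non-negative amount) are hypothetical stand-ins for the real modeling name standards:

```python
# Sketch: derive integrity-test SQL from column naming conventions.
# The suffix-to-check rules below are invented for the example.

SUFFIX_RULES = {
    "_ID": "SELECT COUNT(*) FROM {table} WHERE {col} IS NULL",  # keys must be populated
    "_AMT": "SELECT COUNT(*) FROM {table} WHERE {col} < 0",     # amounts must be non-negative
}

def generate_test_cases(table, columns):
    """Emit one check query per column whose name matches a suffix rule."""
    cases = []
    for col in columns:
        for suffix, template in SUFFIX_RULES.items():
            if col.upper().endswith(suffix):
                cases.append(template.format(table=table, col=col))
    return cases

cases = generate_test_cases("F_ORDERS", ["ORDER_ID", "CUSTOMER_ID", "TOTAL_AMT"])
```

Each generated query returns a violation count, so a result of zero passes the test and anything else can be raised as a defect automatically.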

Productivity Tool | Deployment and Documentation Manager

I developed a tool that automates the deployment and creation of release documents on Confluence according to a formal template. The project included automated UC4, database, Oracle Data Integrator, and Informatica deployments.
Education

2021 - 2024

PhD in Computer Science

Istanbul Ticaret University (Istanbul Commerce University) - Istanbul, Turkey

2010 - 2013

Master's Degree in Business Management

Beykent University - Istanbul, Turkey

2006 - 2010

Bachelor's Degree in Computer Science

Istanbul Ticaret University (Istanbul Commerce University) - Istanbul, Turkey

Certifications

APRIL 2023 - APRIL 2026

AWS Certified Solutions Architect – Associate

Amazon Web Services Training and Certification

SEPTEMBER 2018 - PRESENT

AWS Machine Learning Engineer Nanodegree

Udacity

SEPTEMBER 2014 - PRESENT

Oracle SQL Expert Certification

Oracle University

SEPTEMBER 2012 - PRESENT

Ab Initio Technician

Ab Initio

SEPTEMBER 2010 - PRESENT

Spanish Upper Intermediate

Istanbul Ticaret University (Istanbul Commerce University)

SEPTEMBER 2009 - PRESENT

Robotics

Middle East Technical University

Libraries/APIs

NumPy, REST APIs, Pandas, PySpark, Keras, Jenkins Pipeline, Jira REST API, Bitbucket API, X (formerly Twitter) API, Facebook API, Google API, Google Maps, Bing API, Bing Maps, Google Maps API, Binance API, Bing Maps API

Tools

Abinitio, Git, Atlassian, Jira, Microsoft Excel, Postman, Jenkins, Shell, Docker Compose, GitHub, Confluence, Bitbucket, Toad, Automic (UC4), Kafka Streams, Informatica ETL, Microsoft Power BI, Tableau, Apache NiFi, Guacamole, Apache Airflow, Amazon Elastic MapReduce (EMR), AWS IAM, AWS CLI, AWS Fargate, AWS Glue, Amazon Athena

Languages

Python, SQL, T-SQL (Transact-SQL), Regex, Java, HTML, CSS, JavaScript, C#.NET, C#, Snowflake

Paradigms

ETL, DevOps, Agile, Business Intelligence (BI), OLAP, Automation, Testing, Management, Database Design, REST, Oracle ODI, Serverless Architecture, Automation Engineering

Platforms

Linux, Oracle, Docker, Databricks, Jupyter Notebook, Oracle Data Integrator (ODI), BlackBerry, Android, Apache Kafka, Amazon EC2, Kubernetes, Amazon Web Services (AWS), Alteryx, Azure, AWS Lambda, Airbyte

Storage

PL/SQL, Oracle PL/SQL, Data Pipelines, Databases, Data Integration, Relational Databases, Dynamic SQL, SQL Stored Procedures, Microsoft SQL Server, PostgreSQL, Data Lakes, OLTP, PL/SQL Developer, Teradata, Oracle SQL Developer, Amazon S3 (AWS S3), Redshift, MySQL, SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), SQL Server 2016, Amazon DynamoDB

Frameworks

Data Lakehouse, Windows PowerShell, Serverless Framework, Apache Spark, Spark

Other

Development, Data Engineering, Productivity, Data Warehousing, DevOps Engineer, Engineering, Algorithms, Scripting, CSV File Processing, CSV, Data Analysis, Pipelines, Data, Technical Writing, API Integration, Consulting, Data Architecture, Data Management, Data Transformation, Technical Architecture, Monitoring, Writing & Editing, Actuarial, Data Warehouse Design, Entity Relationships, Machine Learning, ETL Tools, Big Data, Data Governance, Process Design, Process Flow Design, Jira Administration, Data Migration, Data Modeling, Regular Expressions, APIs, Data-level Security, Data Visualization, ELT, Data Structures, Big Data Architecture, Industrial Internet of Things (IIoT), Performance Tuning, Metadata, Architecture, PL/SQL Tuning, Deep Learning, Artificial Intelligence (AI), Reinforcement Learning, Data Science, Robotics, Languages, CI/CD Pipelines, Process Flows, Migration, Team Management, Deployment, Mobile App Development, JavaScript Libraries, Maps, HTML5 Geolocation, Computer Science, Programming, Genetic Algorithms, Artificial Neural Networks (ANN), Business Management, Lead Marketing, Human Resources (HR), Finance, TradingView, Mobile Apps, Web Scraping, Informatica, Jira Administrator, Quality Auditing, Quality Control (QC), Dashboards, BI Reports, Reports, QA Automation, Data Quality, Quality Improvement, High Code Quality, Serverless, Data Analytics, Azure Data Factory, Azure Databricks, Azure Data Lake, Azure Data Explorer, Amazon RDS, Data Ethics, Computer Literacy
