Orkun Temiz, Developer in Ankara, Turkey

Orkun Temiz

Verified Expert in Engineering

Data Science Developer

Ankara, Turkey
Toptal Member Since
December 7, 2021

Orkun is a lead database architect and data engineer who creates database models, develops back-end scripts, and analyzes data. He specializes in the PL/SQL and Python programming languages, SQL, and Tableau for data retrieval, analysis, and visualization, and is experienced in query optimization, database optimization, and AWS service selection. Orkun is an artificial intelligence enthusiast with experience developing and implementing parallel programming, machine learning, and deep learning models.






Preferred Environment

PL/SQL, Python, Oracle Database, Tableau, Excel VBA, SQL, Data Visualization, PostgreSQL, Amazon Web Services (AWS), Database Architecture

The most amazing...

...project I've developed is an integration system between Oracle and a third-party master data-management software that involves heavy back-end development.

Work Experience

Lead Database Architect

2022 - PRESENT
Treehouse Technology Group
  • Worked on Salesforce, Salesforce Marketing Cloud, ABC Financial, and Sage integrations, and developed back-end procedures in SQL and PL/SQL for the reporting layer.
  • Selected and optimized AWS services according to the scheduled jobs and queries running in the databases. Monitored AWS metrics closely and applied the related query and configuration optimizations, so that services are selected cost-efficiently.
  • Created Data Build Tool (dbt) templates of the ETL process to streamline the process and make it efficient and understandable for all junior developers.
Technologies: Amazon Web Services (AWS), Database Architecture, Python, SQL, PL/SQL, SQL Server DBA, PostgreSQL, SOAP, REST APIs, Query Optimization

Senior Business Analyst

2017 - 2021
  • Converted business requirements into IT requirements.
  • Developed an ERM and a database model for the related IT requirements.
  • Developed test procedures and measured the performance of PL/SQL procedures. Tracked the metrics of the developed software.
Technologies: Business Intelligence (BI), Data Analytics, Information Systems, Tableau, Database Design, Entity-relationships Model (ERM), Microsoft Power BI, Database Modeling, Agile

Senior Enterprise Applications Developer | Database Developer

2017 - 2021
  • Developed integration scripts to integrate an ERP and 3rd-party software, using REST APIs and Oracle SOA Suite as integration tools. Developed PL/SQL scripts to trigger necessary events on an Oracle database.
  • Built a machine learning model to analyze corporate data using Python.
  • Created data visualization solutions for mid-level managers and C-level executives, using Tableau as a data analysis and reporting tool. Prepared data required for visualization using SQL queries on an Oracle database.
  • Set up applications on Oracle E-Business Suite (EBS) using Oracle Forms, Oracle Workflow, Oracle Personalization, XML Publisher, OA Framework, and so on.
  • Utilized Oracle APIs for DML operations on an Oracle database.
  • Developed database views with SQL to be used in informative user reports.
Technologies: Database Modeling, SQL, PL/SQL, Oracle E-Business Suite (EBS), Oracle Workflow, Oracle Forms, Oracle SOA Suite, SoapUI, Oracle Application Framework (OAF), Oracle Database, XML Publisher, Python, REST APIs, Web Services, Tableau, Tableau Server, PostgreSQL, Microsoft Access, Excel 365, Microsoft SQL Server, SQL Server Management Studio (SSMS)

Oracle ERP | P6 Primavera Integration Project

An integration script between Oracle ERP and the third-party Primavera software.

I was the database and back-end developer on this project. I created the necessary web services and integration scripts using PL/SQL and XML. I also built Tableau dashboards to monitor the integration status in real time.

Oracle ERP | Master Data Management Software Integration Project

An integration tool that integrates Oracle ERP and third-party master data-management software.

I was the back-end developer of this project. I was responsible for developing the web services required for the integration and PL/SQL scripts, which trigger necessary events in the Oracle database.

Information Verification System Using Unstructured and Structured Evidence from Wikipedia

A Python-based script for claim extraction and verification of given claims, using Wikipedia articles as a source. Given a factual claim involving one or more entities, the script must extract evidence from sentences, table cells, table captions, and list items supporting or refuting the claim. Using this evidence, it labels the claim as "supported," "refuted given the evidence," or "not enough info (NEI)" if there isn't sufficient evidence to either support or refute it.

I developed a machine-learning model for text extraction and deep learning models to verify extracted texts. To obtain the ground truth about the claim and query enhancement, I integrated it with NoSQL and relational databases used via query requests sent from Python.
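The three-way verdict described above can be sketched as a simple aggregation over per-evidence scores. This is an illustrative stand-in only: the score pairs, thresholds, and the `label_claim` helper are hypothetical, and the real system derives these scores from deep learning models.

```python
from typing import List, Tuple

# Hypothetical thresholds for illustration; the real model's decision
# boundary is learned, not hand-set.
SUPPORT_THRESHOLD = 0.6
REFUTE_THRESHOLD = 0.6

def label_claim(evidence_scores: List[Tuple[float, float]]) -> str:
    """Aggregate (support, refute) scores from individual pieces of
    evidence into a single verdict for the claim."""
    if not evidence_scores:
        return "not enough info (NEI)"
    support = max(s for s, _ in evidence_scores)
    refute = max(r for _, r in evidence_scores)
    if support >= SUPPORT_THRESHOLD and support > refute:
        return "supported"
    if refute >= REFUTE_THRESHOLD and refute > support:
        return "refuted given the evidence"
    return "not enough info (NEI)"
```

Taking the maximum over evidence pieces means one strong piece of evidence is enough to decide the claim; weak or missing evidence falls through to NEI.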

Tableau Dashboards for Production Planning and Scheduling

A Tableau report set developed for production planning and scheduling in a job shop-based environment.

I worked both as a data analyst and database developer in this project. I created necessary raw data using SQL queries from an Oracle database and developed Tableau reports for data visualization and data storytelling.

Excel VBA Script for Material Requirement Planning

An Excel VBA script for material requirement planning that considers a project's supply and demand information. It allocates the current supply to demand by considering each demand's priority and timeline, and recommends additional supply if there is a shortage. The supply and demand information is retrieved via SQL queries from the Oracle ASCP module.
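The allocation logic above can be sketched as a greedy pass over demands sorted by priority and need date. This is a minimal Python sketch of the idea, not the VBA implementation; the `Demand` class and `allocate` helper are illustrative names.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class Demand:
    item: str
    qty: int
    priority: int   # lower number = higher priority
    need_date: str  # ISO date, e.g., "2024-01-15"
    allocated: int = 0

def allocate(supply: Dict[str, int], demands: List[Demand]) -> List[Tuple[str, int]]:
    """Allocate available supply to demands ordered by priority, then
    by need date; any unmet quantity is reported as a shortage."""
    shortages = []
    for d in sorted(demands, key=lambda d: (d.priority, d.need_date)):
        available = supply.get(d.item, 0)
        d.allocated = min(d.qty, available)
        supply[d.item] = available - d.allocated
        if d.allocated < d.qty:
            shortages.append((d.item, d.qty - d.allocated))
    return shortages
```

The shortage list is what drives the "recommend additional supply" step: each entry names an item and the quantity still needed.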

CUDA-enabled Parallel Programming Implementation of the KNN Algorithm

The k-nearest neighbors (KNN) algorithm is widely used for data classification. However, most implementations of this algorithm run on the CPU, which causes performance loss on high-volume data. This parallel programming implementation of the KNN algorithm utilizes GPU cores and produces the same output one hundred times faster than the CPU implementation.

I developed the CUDA code with a C++ platform using Visual Studio as the IDE.

This project aimed to efficiently classify a high volume of data, and the algorithm is used in a medical document classification script developed in Python.
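For reference, the classification logic that the CUDA kernels parallelize can be expressed compactly on the CPU. This is a vectorized NumPy sketch of standard KNN, not the CUDA/C++ code from the project; the `knn_predict` name is illustrative.

```python
import numpy as np

def knn_predict(train_X: np.ndarray, train_y: np.ndarray,
                query_X: np.ndarray, k: int = 3) -> np.ndarray:
    """Classify each query point by majority vote among its k nearest
    training points under Euclidean distance."""
    # Pairwise squared distances, shape (n_query, n_train); this is
    # the step the GPU version distributes across CUDA threads.
    d2 = ((query_X[:, None, :] - train_X[None, :, :]) ** 2).sum(-1)
    nearest = np.argsort(d2, axis=1)[:, :k]  # indices of k nearest
    votes = train_y[nearest]                 # their class labels
    # Majority vote per query row
    return np.array([np.bincount(row).argmax() for row in votes])
```

The distance matrix is embarrassingly parallel, which is why mapping each (query, training point) pair to a GPU thread yields such a large speedup on high-volume data.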

Fact Extraction and Verification Pipeline for COVID-19 Related User Posts Using a Zero-Shot Learning

A zero-shot fact extraction and verification pipeline that verifies user tweets related to COVID-19 against medical articles. The pipeline comprises components for preprocessing user posts, claim extraction, document retrieval, evidence selection, and verdict assignment. Unlike numerous supervised studies in the literature, the pipeline does not need to see previously labeled posts; instead, it uses the zero-shot capabilities of existing models. The system is primarily Python-based, but modules written in C++ and CUDA are imported into Python and integrated into the project. The pipeline uses state-of-the-art NLP models and text analytics methods. For the medical articles, the Kaggle CORD-19 dataset (https://www.kaggle.com/allen-institute-for-ai/CORD-19-research-challenge) and the PubMed and medRxiv archives are used.
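The first two stages of such a pipeline, preprocessing and document retrieval, can be sketched in a few lines. This is a simplified illustration: the real system uses zero-shot NLP models rather than the keyword-overlap retrieval shown here, and the function names are assumptions.

```python
import re
from typing import List

def preprocess(post: str) -> str:
    """Strip URLs, mentions, and hashtags from a raw user post."""
    post = re.sub(r"https?://\S+|[@#]\w+", "", post)
    return " ".join(post.split())

def retrieve_documents(claim: str, corpus: List[str], top_k: int = 2) -> List[str]:
    """Rank articles by keyword overlap with the claim. A stand-in
    for the stronger retrieval models the actual pipeline uses."""
    terms = set(claim.lower().split())
    return sorted(corpus,
                  key=lambda doc: len(terms & set(doc.lower().split())),
                  reverse=True)[:top_k]
```

The retrieved documents then feed the evidence-selection and verdict-assignment stages, where the zero-shot models do the actual reasoning.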

Trading App for Automatic Crypto Trading

Python-based scripts that perform automatic crypto trading on well-known crypto platforms. The scripts monitor websites, Twitter, and certain mathematical metrics and, when the triggering conditions are met, take automatic actions on the platform. The GitHub repository is currently private; I will make it public in the coming days.
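A trigger based on "certain mathematical metrics" could, for example, look like a moving-average crossover. The actual triggering conditions are private, so the `CrossoverTrigger` class below is purely a hypothetical stand-in showing the monitor-then-act pattern.

```python
from collections import deque
from typing import Optional

class CrossoverTrigger:
    """Emit BUY when the short moving average crosses above the long
    one, SELL on the opposite cross. Illustrative only."""

    def __init__(self, short: int = 3, long: int = 5):
        self.prices: deque = deque(maxlen=long)
        self.short_n = short
        self.prev_diff: Optional[float] = None

    def update(self, price: float) -> Optional[str]:
        self.prices.append(price)
        if len(self.prices) < self.prices.maxlen:
            return None  # not enough history yet
        short_ma = sum(list(self.prices)[-self.short_n:]) / self.short_n
        long_ma = sum(self.prices) / len(self.prices)
        diff = short_ma - long_ma
        signal = None
        if self.prev_diff is not None:
            if self.prev_diff <= 0 < diff:
                signal = "BUY"
            elif self.prev_diff >= 0 > diff:
                signal = "SELL"
        self.prev_diff = diff
        return signal
```

In a live bot, the returned signal would be forwarded to the exchange's API; here it is just a string so the logic can be tested in isolation.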

A Real-time Emotion Recognition Model

A deep-learning model trained on the FER-2013 dataset, which contains 35,877 labeled 48x48 images belonging to seven emotion categories. The image classification architecture is developed in Python on the TensorFlow library.
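A classifier of this shape, 48x48 grayscale inputs mapped to seven emotion classes, can be sketched with the Keras API. The layer sizes below are illustrative assumptions, not the project's actual trained architecture.

```python
import tensorflow as tf

def build_model() -> tf.keras.Model:
    """A minimal FER-2013-style CNN: 48x48 grayscale in, seven
    emotion-class probabilities out. Layer widths are illustrative."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(48, 48, 1)),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(7, activation="softmax"),  # 7 emotions
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

For real-time use, frames from a camera would be cropped to faces, resized to 48x48, and passed through `model.predict` on each tick.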


Chess Game Implementation in Java

A Java object-oriented programming implementation of a chess game. Currently, the application does not have a GUI; instead, the chessboard and pieces are printed as strings to the application's console.

I am presently adding a suitable GUI.
2019 - 2021

Master's Degree in Information Systems

Middle East Technical University - Ankara, Turkey

2012 - 2017

Bachelor's Degree in Industrial Engineering

Middle East Technical University - Ankara, Turkey


Oracle API, Wikipedia API, Binance API, REST APIs, Twitter API, Google APIs, Requests, Beautiful Soup, PyTorch, TensorFlow


Tableau, SoapUI, Oracle E-Business Suite (EBS), Oracle Workflow, Oracle Forms, Oracle SOA Suite, Microsoft Access, MATLAB, Microsoft Power BI, Visual Studio


Oracle Application Framework (OAF)


Python, SQL, XML, R, C++, Excel VBA, GAMS, Java


Database Design, Business Intelligence (BI), Agile, Dynamic Programming, Parallel Programming, Object-oriented Programming (OOP), Linear Programming


Oracle Database, Oracle, NVIDIA CUDA, Amazon Web Services (AWS)


PL/SQL, Database Modeling, XML Publisher, Relational Databases, JSON, SQL Server Management Studio (SSMS), Microsoft SQL Server, PostgreSQL, NoSQL, Database Architecture, SQL Server DBA


Web Services, Data Visualization, Data Analytics, Data Mining, Information Systems, Excel 365, APIs, Natural Language Processing (NLP), Natural Language Understanding (NLU), Information Retrieval, Hugging Face, Data Preprocessing, Generative Pre-trained Transformers (GPT), Machine Learning, Deep Learning, Text Analytics, Tableau Server, Artificial Intelligence (AI), Data Engineering, HTTP Request Methods, Image Recognition, Optimization, Entity-relationships Model (ERM), SOAP, Query Optimization
