Burak Uyar, Developer in Istanbul, Turkey

Verified Expert in Engineering

Bio

Burak is a full-stack data developer with extensive experience as a data team manager, senior data engineer, senior data analyst, and data scientist. He is also well-versed in business stakeholder communication, requirements analysis, task prioritization and allocation, and cost and budget optimization for technical tools and services.

Portfolio

Bright on Analytics Ltd
Tableau, SQL, PostgreSQL, ClickHouse, DigitalOcean, Linux, Tableau Desktop...
Toptal
Python, SQL, Google Cloud Platform (GCP), Apache Airflow, Google Cloud Composer...
Scoutium
Python, Pandas, Amazon Web Services (AWS), NoSQL, SQL, Amazon Athena, AWS Glue...

Experience

  • Python - 12 years
  • SQL - 10 years
  • Data Pipelines - 9 years
  • APIs - 9 years
  • Databases - 9 years
  • Tableau - 6 years
  • Amazon Web Services (AWS) - 4 years
  • Google Cloud Platform (GCP) - 3 years

Availability

Part-time

Preferred Environment

Python, Google Cloud Platform (GCP), Amazon Web Services (AWS), SQL, CI/CD Pipelines, Dashboards, ETL, Git, Data Pipelines, ClickHouse

The most amazing...

...projects I've contributed to involved designing, building, and transforming data infrastructures into scalable and easy-to-maintain systems.

Work Experience

Tableau Developer

2024 - 2025
Bright on Analytics Ltd
  • Developed a BI solution for visualizing the sales-related data of the client's competitors, enabling the client to gain insights and achieve a higher ROI.
  • Collected the requirements and ordered them by priority and dependency. In the process, identified a potential gap between the current and ideal tech stacks and prepared a demo to present to the client.
  • Created and presented solution options to the client, collaborated to finalize the optimal approach, and split the project into two phases for Agile delivery.
  • Delivered the initial version of the Tableau dashboard on Tableau Cloud in the first phase, confirming that interactive dashboards would enable the client to improve their ROI.
  • Integrated ClickHouse as an OLAP layer between PostgreSQL and Tableau in the second phase, aligning the project with best practices and making all data queryable with very short processing times.
Technologies: Tableau, SQL, PostgreSQL, ClickHouse, DigitalOcean, Linux, Tableau Desktop, IPython Notebook, Jupyter Notebook, Dashboard Design, OLAP, Tableau Embedded Analytics, Business Intelligence (BI), YAML, Data Lakes, Data Analysis, Data Build Tool (dbt), Star Schema, Relational Databases

Senior Data Engineer

2022 - 2024
Toptal
  • Designed and implemented data pipelines from both SQL and NoSQL sources, as well as GCS and GBQ, using Airflow (Cloud Composer) and Luigi.
  • Developed a microservice for improved data quality checks. The app aimed to prevent database-breaking changes from the source microservices to the data warehouse.
  • Contributed to developing a framework for Google Cloud Composer. The main aim of the framework was to define the constraints of the data ingestion and loading processes for efficient development and maintenance.
  • Collaborated with business stakeholders across multiple departments on several critical business projects, including ones involving financial and other sensitive data.
Technologies: Python, SQL, Google Cloud Platform (GCP), Apache Airflow, Google Cloud Composer, Luigi, Flask, APIs, Snowflake, Databases, Data Quality, Data Governance, Unit Testing, ETL, Git, GitHub, Jira, Confluence, Data Engineering, Data Warehousing, Google BigQuery, API Integration, SFTP, Web Dashboards, Data Modeling, Postman, Docker Compose, Web Scraping, NoSQL, Pandas, Data Pipelines, Programming, Algorithms, Data Structures, CI/CD Pipelines, Dashboard Design, Excel formulas, JSON, OpenAI API, XML, XML Schema, ETL Tools, HTML, Tableau Desktop, Scraping, BigQuery, Looker, Business Analysis, Data Architecture, Data Visualization, Artificial Intelligence (AI), Apache Kafka, Data, Data Strategy, Debugging, Reporting, Terraform, Troubleshooting, Google Sheets, Large Data Sets, Parquet, Dask, Amazon EC2, Database Optimization, Database Design, PostgreSQL, NumPy, DevOps, PySpark, Infrastructure as Code (IaC), YAML, Data Scraping, Data Lakes, Data Analysis, Google Cloud Functions, Fivetran, Docker, Kubernetes, Star Schema, Relational Databases

Chief Information Officer

2019 - 2022
Scoutium
  • Led R&D with AI approaches on first-party data, mathematically modeling the proprietary data and building data products, including crowdsourcing design and data product delivery.
  • Implemented BI processes and designed a self-service data analytics approach and infrastructure.
  • Designed and implemented data integration processes covering internal relational and NoSQL sources, with additional data ingested from multiple APIs and scraped from public websites.
Technologies: Python, Pandas, Amazon Web Services (AWS), NoSQL, SQL, Amazon Athena, AWS Glue, Cron, Kubernetes, Machine Learning, APIs, Web Scraping, Data Analytics, Data Visualization, Data Products, Git, GitHub, ETL, Jira, Data Engineering, Tableau, Business Intelligence (BI), Dashboards, Reports, Data Warehousing, API Integration, SFTP, Web Dashboards, Data Modeling, Postman, Docker Compose, Databases, Data Pipelines, Programming, Algorithms, Data Structures, CI/CD Pipelines, Dashboard Design, Excel formulas, Excel 365, JSON, XML, XML Schema, ETL Tools, HTML, HTML5, Tableau Desktop, Tableau Server, Tableau API, Looker Studio, Scraping, Redshift, AWS Lambda, Amazon Redshift, Amazon S3 (AWS S3), MongoDB, Business Analysis, Data Architecture, Data, Data Strategy, Debugging, Reporting, Troubleshooting, Google Sheets, Large Data Sets, Parquet, Amazon EC2, Database Optimization, Database Design, PostgreSQL, NumPy, DevOps, Tableau Embedded Analytics, Infrastructure as Code (IaC), YAML, Data Scraping, Data Lakes, Data Analysis, Firebase, User Behavioral Analytics (UBA), Docker, Star Schema, Relational Databases, Amazon QuickSight

Data Engineering Manager

2016 - 2019
GroupM
  • Designed, implemented, and analyzed custom attribution models for various clients, including customer journeys in online and offline touchpoints.
  • Provided automation solutions for reporting processes using Python, SQL, Tableau, and Datorama.
  • Leveraged data providers' APIs for data integration and data model creation.
  • Delivered technical consultancy for creative solutions based on multiplatform advertising projects.
Technologies: Python, Pandas, Machine Learning, Excel VBA, APIs, Web Scraping, Cloud Storage, BigQuery, SQL, Tableau, Datorama, Advertising Technology (Adtech), Algorithms, Dashboards, Data Engineering, ETL, Business Intelligence (BI), Google Analytics, Reports, Data Warehousing, Google BigQuery, API Integration, SFTP, Web Dashboards, Data Modeling, Postman, Databases, Programming, Data Structures, Dashboard Design, Excel formulas, Microsoft Power BI, Excel 365, JSON, XML, XML Schema, ETL Tools, HTML, HTML5, JavaScript, Tableau Desktop, Tableau Server, Tableau API, Looker Studio, Scraping, Business Analysis, Data Architecture, Data Visualization, Data, Data Strategy, Excel Macros, Microsoft Excel, Debugging, Reporting, Troubleshooting, Large Data Sets, Database Design, NumPy, DevOps, Tableau Embedded Analytics, Data Analysis, User Behavioral Analytics (UBA), Relational Databases, Research Analysis

Music Information Retrieval (MIR) Researcher

2013 - 2016
CompMusic
  • Defined computational research problems on the Turkish makam music corpus, which is publicly available through CompMusic.
  • Created a desktop application for self-tutored rhythm ("usul") training.
  • Published multiple papers at international conferences, available on Google Scholar.
Technologies: Python, Machine Learning, Digital Signal Processing, Audio Processing, Music Publishing, Data Engineering, Data Collection, Data Analytics, Data Science, Music Information Retrieval (MIR), Pandas, Programming, Algorithms, Data Structures, Excel formulas, JSON, XML, XML Schema, JavaScript, Data Visualization, Data, Data Strategy, Excel Macros, Microsoft Excel, Debugging, Troubleshooting, NumPy, Data Analysis, Research Analysis

Experience

Microservice for Improved Data Quality Checks

This project tackled the challenge of changing source databases, which can break a data warehouse that periodically loads data from them. As a solution, we developed a microservice that integrates into the CI configuration of those source applications and checks the database structure in their test environments. Those structures are sent to the microservice and compared with the schema structure the data warehouse expects. If any part of the new schema in the test environment is unexpected, the microservice fails the check, blocking the pull request until the conflict is fixed. The service has been live in production since it was developed and has caught many breaking changes before deployment.
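
As a rough illustration of the idea rather than the actual service, a minimal schema-compatibility endpoint in Flask (which this role's stack included) could look like the sketch below. The EXPECTED mapping, the table and column names, and the /check route are all hypothetical:

from flask import Flask, jsonify, request

app = Flask(__name__)

# Expected column types per table, as the warehouse loader assumes them.
# The real service would derive this from warehouse metadata, not a literal.
EXPECTED = {
    "users": {"id": "bigint", "email": "text", "created_at": "timestamp"},
}

@app.post("/check")
def check_schema():
    """Compare the schema found in a source app's test environment
    against the expected one and fail on any breaking difference."""
    proposed = request.get_json()  # {"table": {"column": "type", ...}}
    conflicts = []
    for table, columns in EXPECTED.items():
        for column, expected_type in columns.items():
            actual_type = proposed.get(table, {}).get(column)
            if actual_type != expected_type:
                conflicts.append(
                    f"{table}.{column}: expected {expected_type}, got {actual_type}"
                )
    if conflicts:
        # A CI step treats a non-2xx response as a failed check,
        # which blocks the pull request until the conflict is resolved.
        return jsonify({"status": "fail", "conflicts": conflicts}), 409
    return jsonify({"status": "ok"}), 200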

End-to-end DWH and BI Solution

I designed and implemented a DWH in the cloud and BI solutions on top of it to enable self-service analytics. I was the head of data in the company and managed 2-4 team members. My role was to receive the brief, prepare the solution, and ensure that our Data team delivered the results.

The company had different data sources, all of which needed to be used for different purposes, including self-service analytics, training and testing ML models, and feeding results back into the application database.

The sources included PostgreSQL, MongoDB, APIs, web-scraping data, spreadsheets, and flat files.

The solution included taking the brief from the internal teams, designing the structure, planning the implementation phases, preparing the individual tasks, and allocating resources from our data team.

The solution mainly involved Python, SQL, NoSQL, Crontab, AWS S3, AWS Glue, AWS Athena, AWS IAM, Tableau, and Tableau Bridge.
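
To give a sense of how the Athena piece of such a stack is typically driven from Python, here is a hedged sketch using boto3. The region, Glue database, result bucket, and table names are placeholders rather than the company's actual resources:

import time

import boto3

athena = boto3.client("athena", region_name="eu-west-1")

def run_athena_query(sql: str) -> str:
    """Start a query over S3 data cataloged by Glue and poll until it finishes."""
    execution = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": "analytics"},  # hypothetical Glue database
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )
    query_id = execution["QueryExecutionId"]
    while True:
        status = athena.get_query_execution(QueryExecutionId=query_id)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)
    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query ended in state {state}")
    return query_id

query_id = run_athena_query("SELECT event_date, COUNT(*) FROM events GROUP BY 1")
rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]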

End-to-end Business Intelligence Solution

An end-to-end business intelligence solution enabling self-service analytics for the client.

The client had their data in a PostgreSQL database and wanted to use Tableau for self-service analytics to improve the efficiency of their investments and, hence, their ROI.

The project was split into two phases. The first was to create a PoC without an OLAP, use smaller data, and deliver the first version on Tableau Cloud. The second was to improve the solution with a proper OLAP, data pipeline, and ETL setup.

The first phase was delivered quickly using PostgreSQL directly as the source, Tableau extracts, and Tableau Cloud.

For the second phase, ClickHouse was selected as the OLAP. The implementation used the MaterializedPostgreSQL database engine to replicate the relevant source tables from Postgres to ClickHouse. On top of the raw replicated tables, materialized views were built to create analytics layers. That approach gave the client an easy-to-manage infrastructure and lower maintenance effort in later stages. With this implementation, near real-time data flowed into ClickHouse, and from there into Tableau Cloud, via a Tableau Bridge instance running on Linux.
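
The shape of that replication setup can be sketched from Python via the clickhouse-connect client. All hosts, credentials, and table names below are placeholders; note that MaterializedPostgreSQL is an experimental engine whose settings vary across ClickHouse versions, and the interplay between materialized views and replicated tables has version-specific caveats:

import clickhouse_connect

# Hypothetical connection details; the experimental engine must be
# enabled via a session setting before the database can be created.
client = clickhouse_connect.get_client(
    host="clickhouse.internal",
    username="default",
    settings={"allow_experimental_database_materialized_postgresql": 1},
)

# Continuously replicate selected Postgres tables into ClickHouse.
client.command("""
    CREATE DATABASE pg_replica
    ENGINE = MaterializedPostgreSQL('postgres.internal:5432', 'appdb', 'replica_user', 'secret')
    SETTINGS materialized_postgresql_tables_list = 'orders,customers'
""")

# An analytics layer built on top of the raw replicated tables.
client.command("CREATE DATABASE IF NOT EXISTS analytics")
client.command("""
    CREATE MATERIALIZED VIEW analytics.daily_revenue
    ENGINE = SummingMergeTree ORDER BY day AS
    SELECT toDate(created_at) AS day, sum(amount) AS revenue
    FROM pg_replica.orders
    GROUP BY day
""")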

Web-scraping Data Pipeline on Cloud Services

A containerized web-scraping service was designed from scratch and integrated into GCP Cloud Composer (Airflow). The process runs on a Kubernetes cluster and is fault-tolerant: if it fails at any point, it can determine where it left off and resume processing from there. GCS, BigQuery, Cloud Composer, Cloud Vision, and Secret Manager are some of the services used in this solution.
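
That fault tolerance comes down to checkpointing: persist progress after each unit of work so a restarted task resumes where it left off. Below is a minimal sketch of the pattern against GCS; the bucket, blob path, and URL are hypothetical, and the Airflow wiring is omitted:

import json

import requests
from google.cloud import storage

BUCKET = "example-scraper-state"  # hypothetical bucket
CHECKPOINT = "checkpoints/listing_scraper.json"

def load_checkpoint(bucket) -> int:
    """Return the last fully processed page, or 0 on the first run."""
    blob = bucket.blob(CHECKPOINT)
    if not blob.exists():
        return 0
    return json.loads(blob.download_as_text())["last_page"]

def scrape(pages: int) -> None:
    bucket = storage.Client().bucket(BUCKET)
    start = load_checkpoint(bucket) + 1
    for page in range(start, pages + 1):
        html = requests.get(f"https://example.com/listings?page={page}", timeout=30).text
        ...  # parse and write results to GCS/BigQuery (omitted)
        # Commit progress only after the page is fully processed, so a
        # crash mid-page causes a retry of that page rather than a skip.
        bucket.blob(CHECKPOINT).upload_from_string(json.dumps({"last_page": page}))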

Education

2013 - 2016

Master's Degree in Audio Technologies and Sound Computing

Bahcesehir University - Istanbul, Turkey

2006 - 2011

Bachelor's Degree in Computer Engineering

Bogazici University - Istanbul, Turkey

Skills

Libraries/APIs

Pandas, NumPy, PySpark, OpenAI API, Tableau API, Dask, Luigi

Tools

Apache Airflow, BigQuery, Tableau, Git, Tableau Desktop, Microsoft Excel, Google Sheets, AWS Glue, Postman, Microsoft Power BI, Tableau Embedded Analytics, Amazon QuickSight, GitHub, Google Cloud Composer, Amazon Athena, Cron, Datorama, Docker Compose, Jira, Confluence, Google Analytics, AWS IAM, Looker, Terraform, Trello, IPython Notebook, Google Compute Engine (GCE)

Languages

Python, SQL, XML, YAML, Snowflake, HTML, HTML5, JavaScript, Excel VBA, Python 3, Java

Frameworks

Spark, Apache Spark, Flask, Selenium

Paradigms

ETL, Business Intelligence (BI), Database Design, User Behavioral Analytics (UBA), Unit Testing, Agile Project Management, Agile, DevOps, OLAP

Platforms

Docker, Google Cloud Platform (GCP), Amazon Web Services (AWS), Kubernetes, AWS Lambda, Amazon EC2, Azure, Databricks, Firebase, MacOS, Linux, Apache Kafka, DigitalOcean, Jupyter Notebook

Storage

Data Pipelines, Databases, PostgreSQL, Amazon S3 (AWS S3), JSON, XML Schema, Data Lakes, Relational Databases, ClickHouse, Redshift, MongoDB, NoSQL, Google Cloud Storage, Google Cloud

Other

Data Structures, Data Engineering, APIs, Web Scraping, Data Visualization, Dashboards, Data Warehousing, Google BigQuery, API Integration, SFTP, Web Dashboards, Data Modeling, Dashboard Design, ETL Tools, Scraping, Business Analysis, Data Architecture, Data, Data Strategy, Excel Macros, Debugging, Reporting, Troubleshooting, Large Data Sets, Parquet, Database Optimization, Data Scraping, Data Analysis, Star Schema, Research Analysis, Excel formulas, Excel 365, Tableau Server, Looker Studio, Amazon Redshift, Artificial Intelligence (AI), Azure Databricks, Google Cloud Functions, Fivetran, CI/CD Pipelines, Algorithms, Time Complexity Analysis, Software Engineering, Programming, Web Technologies, Signal Processing, Digital Signal Processing, Data Analytics, Data Collection, Audio Processing, Data Science, Machine Learning, Data Quality, Data Governance, Data Products, Cloud Storage, Advertising Technology (Adtech), Music Publishing, GitHub Actions, Metadata, Computer Engineering, Music Information Retrieval (MIR), Reports, Team Management, Stakeholder Management, IT Project Management, Data Warehouse Design, Infrastructure as Code (IaC), GSM, Google Artifact Registry, Data Build Tool (dbt)
