Selahattin Hancer

Verified Expert in Engineering

Data Engineer and Developer

Location
Austin, TX, United States
Toptal Member Since
December 8, 2022

Selahattin is a data engineer, database administrator, and software developer with over 15 years of industry experience. Throughout his career, he has managed mission-critical databases and built data platforms. He is passionate about modern technology stacks and uses them to deliver products that help businesses.

Portfolio

Wenzel Spine
Docker, Kubernetes, Python, SQL Server DBA, Shell Scripting, REST APIs, SQL...
Ercot
Oracle, Linux, Shell Scripting, Upgrades, SQL Server DBA, SQL Performance...
Statera Spina
Apache Airflow, Data Build Tool (dbt), Data Pipelines, Amazon S3 (AWS S3)...

Experience

Availability

Full-time

Preferred Environment

Data Build Tool (dbt), PostgreSQL, SQL, Python, Apache Airflow, Snowflake, Shell Scripting, Amazon Web Services (AWS), Data Pipelines, Database Administration (DBA)

The most amazing...

...projects I've worked on involved building data and DevOps pipelines on AWS, managing large databases, and writing tools for monitoring and patching.

Work Experience

Data Engineer

2021 - PRESENT
Wenzel Spine
  • Created and maintained ETL processes and ELT data pipelines from various sources.
  • Utilized Python decorators, the data build tool (dbt), and APIs, deploying them to Docker hosts and Kubernetes (a minimal decorator sketch follows the technology list below).
  • Developed REST APIs and deployed them using Kubernetes.
  • Designed interactive and dynamic dashboards using Tableau to visualize complex data sets and KPIs effectively, enabling stakeholders to identify trends and insights quickly.
  • Generated advanced Tableau dashboards with quick, context, and global filters, parameters, sets, level of detail (LOD) expressions, advanced maps, and calculated fields.
Technologies: Docker, Kubernetes, Python, SQL Server DBA, Shell Scripting, REST APIs, SQL, ETL, Apache Airflow, Data Build Tool (dbt), Airbyte, MySQL, Tableau, ETL Tools, Microsoft SQL Server, SQL DML, SQL Performance, Performance Tuning, Data Queries, DevOps, CI/CD Pipelines, Bash, Windows PowerShell, Data Engineering
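
The decorator pattern mentioned above is a common way to add cross-cutting logging and retry behavior to ETL steps. A minimal sketch of that idea, assuming Python 3 and only the standard library; the task name and retry settings are hypothetical:

    import functools
    import logging
    import time

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("etl")

    def etl_task(retries=3, delay=5):
        # Decorator factory: wraps a pipeline step with logging and retries.
        def decorator(func):
            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                for attempt in range(1, retries + 1):
                    try:
                        log.info("starting %s (attempt %d)", func.__name__, attempt)
                        result = func(*args, **kwargs)
                        log.info("finished %s", func.__name__)
                        return result
                    except Exception:
                        log.exception("%s failed", func.__name__)
                        if attempt == retries:
                            raise
                        time.sleep(delay)
            return wrapper
        return decorator

    @etl_task(retries=2, delay=10)
    def load_orders():
        # Hypothetical step: extract from a source system and load the warehouse.
        pass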

Database Administrator

2020 - 2021
Ercot
  • Upgraded and managed an extensive data warehouse database.
  • Provided SQL tuning support to a developer during the upgrade process.
  • Conducted a development and test database refresh and wrote a shell script to automate the refresh process.
  • Performed periodic Grid Infrastructure (GI) patching for over 200 databases during each patching cycle.
  • Monitored databases and eliminated downtime by creating Linux and Unix shell and PL/SQL scripts (a Python analogue of one such check is sketched after the technology list).
  • Planned and completed disaster recovery and switchover testing for databases.
  • Used Oracle Real Application Testing before critical database changes, including migrations, upgrades, major patches, and deployments.
  • Developed a database and system monitoring tool using PowerShell and Transact-SQL.
Technologies: Oracle, Linux, Shell Scripting, Upgrades, SQL Server DBA, SQL Performance, Performance Tuning, Data Queries, Bash
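
The monitoring scripts in this role were shell, PL/SQL, and PowerShell, and are not included in the profile. Purely as an illustrative analogue, here is what one such check might look like in Python with the python-oracledb driver; the credentials, threshold, and alerting mechanism are placeholders:

    import oracledb  # pip install oracledb

    # Placeholder credentials; a real script would read these from a vault.
    conn = oracledb.connect(user="monitor", password="***", dsn="dbhost/orclpdb1")

    # Flag tablespaces above 90% usage -- the same idea as the shell/PL/SQL checks.
    QUERY = """
        SELECT tablespace_name, ROUND(used_percent, 1)
        FROM dba_tablespace_usage_metrics
        WHERE used_percent > 90
        ORDER BY used_percent DESC
    """

    with conn.cursor() as cur:
        for name, used_pct in cur.execute(QUERY):
            print(f"ALERT: tablespace {name} is {used_pct}% full")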

Data Engineer

2018 - 2019
Statera Spina
  • Built and orchestrated data pipelines using Apache Airflow (see the DAG sketch after the technology list).
  • Introduced dbt to the ETL system for data transformation and built dbt models.
  • Developed and maintained data pipelines that ingested data from various sources, such as Amazon S3 buckets, third-party REST APIs, and SQL and NoSQL databases.
Technologies: Apache Airflow, Data Build Tool (dbt), Data Pipelines, Amazon S3 (AWS S3), Docker, REST APIs, SQL, ETL, Amazon Web Services (AWS), Python, Fivetran, T-SQL (Transact-SQL), SQL DML, SQL Performance, Performance Tuning, Data Queries, Data Engineering, PyCharm
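
A minimal sketch of how an Airflow DAG might chain ingestion and a dbt run, as described above. The DAG ID, paths, and schedule are hypothetical, and the syntax assumes Airflow 2.4 or later:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_elt",                # hypothetical name
        start_date=datetime(2023, 1, 1),
        schedule="@daily",                 # `schedule_interval` on older Airflow 2.x
        catchup=False,
    ) as dag:
        ingest = BashOperator(
            task_id="ingest_sources",
            bash_command="python /opt/pipelines/ingest.py",  # placeholder script
        )
        transform = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt",
        )
        ingest >> transform  # run dbt models only after ingestion succeeds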

Data Engineer

2017 - 2018
Kredya
  • Utilized AWS Lambda, Amazon DynamoDB, Snowflake, Amazon RDS, Amazon S3, pandas, and Python.
  • Built a DevOps pipeline on AWS using GitHub, Jenkins, Ansible, Docker, and Kubernetes.
  • Designed and implemented data models for a data warehouse and data marts.
  • Developed, maintained, and monitored ETL processes and ELT data pipelines.
  • Provided SQL tuning support for the production database.
  • Migrated the data warehouse from PostgreSQL to Snowflake (a minimal load sketch follows the technology list).
Technologies: Snowflake, Python, AWS Lambda, Amazon S3 (AWS S3), Docker, Kubernetes, ETL, SQL, PostgreSQL, SQL Performance
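
One step in a PostgreSQL-to-Snowflake migration is bulk-loading exported tables from S3. A minimal sketch of that step with the Snowflake Python connector; the account, stage, and table names are placeholders, not the ones used on the project:

    import snowflake.connector  # pip install snowflake-connector-python

    conn = snowflake.connector.connect(
        user="etl_user",        # placeholders throughout
        password="***",
        account="xy12345",
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="RAW",
    )

    # Bulk-load table exports staged in S3 (an external stage) into Snowflake.
    copy_sql = """
        COPY INTO raw.orders
        FROM @raw.s3_export_stage/orders/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"')
    """

    cur = conn.cursor()
    try:
        cur.execute(copy_sql)
        print(cur.fetchone())  # per-file load status summary
    finally:
        cur.close()
        conn.close()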

Database Administrator

2011 - 2015
TEB
  • Supported and maintained over 200 Oracle databases in different environments, including production, test, development, and Oracle Active Data Guard.
  • Migrated databases between Linux and Unix platforms in both directions.
  • Cloned, refreshed, and restored Oracle databases for development and testing.
  • Created multiple Linux and Unix shell and PL/SQL scripts to monitor Oracle databases and proactively eliminate downtime.
  • Installed and configured integrated and classic GoldenGate for data replication.
  • Configured, installed, and implemented PostgreSQL, SQL Server, and Cassandra databases.
  • Set up PostgreSQL for high availability and replication with a hot standby.
  • Planned and implemented Oracle and PostgreSQL backup strategies.
  • Implemented alert monitoring for CPU usage, disk space, contention, and high wait events.
Technologies: Oracle, PostgreSQL, SQL Server DBA, Linux, Oracle GoldenGate, RMAN, ASM, Shell Scripting, SQL Performance, Performance Tuning, Data Queries, Bash

Data Engineer

2008 - 2011
TEB
  • Wrote advanced queries, PL/SQL packages, functions, triggers, SQL scripts, and views.
  • Developed ETL scripts to load data from multiple sources into the data warehouse and analyzed, cleaned, transformed, and loaded data using ODI.
  • Utilized erwin for the warehouse's logical and physical database modeling.
Technologies: Python, PL/SQL, SQL, ETL, Data Engineering

Software Developer

2005 - 2008
Obase
  • Created internal tools using Python, SQL, and Oracle.
  • Analyzed, designed, and developed client applications using Oracle Forms and Reports.
  • Used Python to extract data from comma-separated values (CSV) files, JSON, and Microsoft Excel (example below).
Technologies: Python, APIs, Oracle
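
The profile says only that Python was used for these extractions. A minimal modern sketch of that kind of task using pandas (pandas is an assumption here, and the file paths are illustrative):

    import pandas as pd

    # Each source lands in a DataFrame for cleaning before loading to Oracle.
    orders = pd.read_csv("exports/orders.csv")
    events = pd.read_json("exports/events.json")
    budget = pd.read_excel("exports/budget.xlsx")  # requires openpyxl

    print(orders.shape, events.shape, budget.shape)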

Modern Data Platform and CI/CD

Built a modern data platform and DevOps pipeline from scratch for a client using popular tech stacks. For data storage and processing, I used Snowflake for cloud data warehousing and Amazon S3 as the data lake for data science workloads.

Next, I covered data ingestion using custom Python code with pandas to pull data from several sources, Airbyte and Fivetran for managed source connectors, and Apache Airflow for orchestration (a minimal ingestion sketch follows). For data transformation and modeling, I used dbt to transform the data after it was loaded into the warehouse.
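
A minimal sketch of the custom-Python ingestion path described above: pull from a REST source, normalize it with pandas, and land the raw extract in the S3 data lake. The endpoint, bucket, and key are hypothetical:

    import io

    import boto3
    import pandas as pd
    import requests

    # Hypothetical third-party REST source.
    resp = requests.get("https://api.example.com/v1/orders", timeout=30)
    resp.raise_for_status()
    df = pd.json_normalize(resp.json())

    # Land the raw extract in the data lake as Parquet (requires pyarrow).
    buf = io.BytesIO()
    df.to_parquet(buf, index=False)
    boto3.client("s3").put_object(
        Bucket="example-data-lake",
        Key="raw/orders/2023-01-01.parquet",
        Body=buf.getvalue(),
    )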

The DevOps pipeline was built on AWS using Ansible, Jenkins, Docker, and Kubernetes and ran on Amazon EC2 instances. Finally, I deployed the REST APIs on Kubernetes and Dockerized dbt.

Languages

SQL, Python, Bash, T-SQL (Transact-SQL), SQL DML, JavaScript, Snowflake

Frameworks

ASM, Windows PowerShell

Libraries/APIs

REST APIs, Pandas

Tools

RMAN, Git, Tableau, PyCharm, Apache Airflow, Oracle GoldenGate, Jenkins, Ansible, GitHub

Paradigms

ETL, DevOps

Platforms

Linux, Airbyte, Docker, Oracle, AWS Lambda, Amazon Web Services (AWS), Kubernetes, Amazon EC2

Storage

PostgreSQL, Database Management, PL/SQL, SQL Server DBA, Data Pipelines, Database Administration (DBA), SQL Performance, Microsoft SQL Server, Amazon S3 (AWS S3), MySQL, Relational Databases

Other

Data Build Tool (dbt), Data Warehousing, Shell Scripting, ELT, Upgrades, Data Engineering, CI/CD Pipelines, Performance Tuning, Data Queries, Data Visualization, Visualization, Software Deployment, System Design, APIs, Cloud, ETL Tools, SSH, Fivetran

Education

1999 - 2004

Bachelor's Degree in Computer Engineering

Sakarya University - Serdivan, Turkey

Certifications

OCTOBER 2020 - PRESENT

Oracle Certified Professional (Cloud)

Oracle
