Baris Ergun, Developer in Guildford, United Kingdom

Baris Ergun

Verified Expert in Engineering

Big Data Developer

Guildford, United Kingdom

Toptal member since October 26, 2020

Bio

Baris is a qualified and experienced software developer and technical architect with over 18 years of development and technical leadership experience. He has a proven track record of leading development teams and architecting high-performance applications for a wide range of IT and software needs. Baris combines strong technical and interpersonal skills, with over ten years of commercial Java experience and five years of Scala. He has worked for SMBs, enterprises, and Fortune 500 companies.

Portfolio

Hitachi UK Europe
SBT, Amazon Web Services (AWS), Apache Kafka, Amazon Kinesis, Docker...
IQVIA
SBT, Apache Hive, Apache Spark, Scala, Hadoop, Cloudera
Elsevier UK
Amazon Web Services (AWS), Scala, Amazon Elastic MapReduce (EMR), AWS Lambda

Experience

Availability

Part-time

Preferred Environment

Amazon Web Services (AWS), Cloudera, Linux RHEL/CentOS, Docker, Ubuntu Linux, MacOS

The most amazing...

...thing I've done is help a pharmaceutical client by delivering a complex distributed computing algorithm that was selected as the most performant and correct implementation.

Work Experience

Senior Scala Developer

2020 - PRESENT
Hitachi UK Europe
  • Developed an application for Hitachi Europe using Akka Streams, AWS Kinesis, and AWS DynamoDB for an energy-industry product.
  • Mentored junior developers in developing applications with Scala and Akka Streams.
  • Contributed to biweekly sprints, following Agile practices.
  • Developed an application for Hitachi Europe using Akka HTTP and MySQL for asset management in an energy-industry product.
Technologies: SBT, Amazon Web Services (AWS), Apache Kafka, Amazon Kinesis, Docker, Akka Actors, Akka Streams, Akka HTTP, Scala

Big Data Engineer

2018 - 2020
IQVIA
  • Introduced the fastest, most reliable algorithm among three candidate implementations by developers from three different teams working on the same problem: a disease detection project for IQVIA UK in the healthcare industry. Used Scala, Apache Spark, Cloudera, and Hadoop.
  • Migrated a complex big data ETL pipeline from a Microsoft SQL-based pipeline to IQVIA USA's Hadoop-based P360G platform for pharmaceutical industry clients in France. Used Scala, Apache Spark, Cloudera, and Hadoop.
  • Followed and implemented Agile practices in biweekly sprints.
Technologies: SBT, Apache Hive, Apache Spark, Scala, Hadoop, Cloudera

Big Data Engineer

2018 - 2018
Elsevier UK
  • Developed distributed algorithms for detecting illegally published papers in the scientific community for Elsevier, a publishing-industry client.
  • Contributed to code-pipeline integration with an environment-based branching strategy. Used Scala, Apache Spark, AWS, and Amazon EMR.
  • Helped the DevOps team improve code-integration helper applications developed with Scala.
Technologies: Amazon Web Services (AWS), Scala, Amazon Elastic MapReduce (EMR), AWS Lambda

Big Data Engineer

2017 - 2018
ShopDirect UK
  • Conducted data ingestion of ShopDirect's general business data into a data lake in the AWS environment for a specific project. Used Scala, Apache Spark, AWS, Amazon EMR, and Apache Kafka.
  • Delivered a data pipeline project for price matching data analytics for four different brands of ShopDirect using Scala as a programming language and AWS Lambda, EMR (Apache Spark), Athena, and Kinesis, which also included a GDPR layer.
  • Contributed to the Ansible-based infrastructure as code (IaC) project for deploying data pipelines to the AWS cloud.
Technologies: Amazon Kinesis, Scala, Apache Spark, Amazon Web Services (AWS)

Staff Software Engineer

2015 - 2016
Seagate UK
  • Developed non-blocking I/O threading routines to speed up tests triggered by the factory quality and performance monitoring application.
  • Developed and maintained integration Bash scripts for Seagate's storage box products on Linux platforms in the computer hardware industry.
  • Contributed to two-week sprints as part of the black team working on Storage Box Systems, Seagate's test solutions.
Technologies: C++, Perl, C, Scala

Software Architect

2010 - 2015
Telenity, Inc.
  • Led the development of SmartTrail, a location-based web application for visually tracking clients' resources on a map. Any smartphone or basic feature phone could be used to track entities, with geofencing capabilities.
  • Contributed to the open-source DataNucleus geospatial module; integrated and revived it for three different databases: PostgreSQL/PostGIS, MySQL, and Oracle.
  • Used Scala and Apache Spark to generate real-time reports of critical locations for our Telenity branch in India, enabling them to run location-targeted promotions in real time.
  • Took part in core software architecting activities for the company's core platforms, united in one product called the Canvas Platform.
  • Managed a team of four to five engineers on all projects as their lead developer and mentor.
  • Gained experience in Scala, Java, and Linux, along with application benchmarking, Bash, and Spark-with-Hadoop expertise for analyzing real-time customer location data.
  • Developed a USSD application that let feature phone users chat over USSD.
  • All products were developed for major GSM mobile operators in the telecommunications industry.
Technologies: RESTEasy, Akka HTTP, Apache Camel, MySQL, Oracle RDBMS, PostGIS, DataNucleus, AngularJS, PostgreSQL, OpenMQ, ActiveMQ, Ubuntu Linux, Linux RHEL/CentOS, Linux, Scala

Lead Software Developer

2007 - 2010
AirTies Wireless Networks
  • Built a cross-platform application called Home Network Manager with C++ and Qt to manage wireless and wired home network devices, with automatic troubleshooting capabilities.
  • Built an automated test framework with Java, increasing firmware release capacity from one to two or three releases per month.
  • Freed the test team to focus on manual functional and performance tests by automating repetitive test runs with the framework.
  • Won government funding by successfully implementing a reusable test automation framework for AirTies that could be used across various products.
  • Led a team of four to five engineers as a hands-on lead developer.
  • All products were developed for the wireless equipment manufacturing industry.
Technologies: Qt, C++

Founder | Developer

2002 - 2007
Balya Software, Ltd.
  • Designed and implemented a core Windows service, based on C# and unmanaged C++ ATL COM objects, that recorded video from various network camera products on a single platform by transcoding images from JPEG and MJPEG to MPEG-4 and storing them.
  • Developed motion detection software based on FFT algorithms.
  • Built a web application with ASP.NET to remotely manage and view the network camera recording software.
  • Developed a .NET application in C# to locally manage and view the network camera recording software.
  • Hired new employees and university undergraduates to help develop and test the applications above.
  • Acted as founder of this incubation startup and served as a developer on different projects.
  • Received four years of funding from the Turkish government and an office in a leading incubation center at Istanbul Technical University (ITU) after preparing the project and a proof of concept (POC) of the product.
  • Sold the product, and provided ongoing support, to dozens of medium-sized companies and many more small businesses.
  • This project served the security surveillance systems industry.
Technologies: ASP.NET, ATL, C++, C#.NET

Network Engineer

1997 - 1999
Siemens Turkey
  • Deployed, provisioned, and maintained the first cable modem system in Turkey over HFC CATV networks in the Izmit CATV network.
  • Attended various courses on network technologies and TCP and UDP network protocols.
  • Helped select the most appropriate CATV modem technologies for Siemens HFC CATV networks.
Technologies: HP OpenView, Solaris, IP Networks, TCP/IP

Projects

Disease Detection for IQVIA

This is the most exciting project I've worked on in recent years.

I worked on a disease detection project to analyze and report the candidates most likely to be affected by a specific disease or condition. I introduced the fastest and most reliable algorithm among three developers in three different teams working on the same problem. I achieved this by thinking outside the box and implementing the solution according to functional programming principles, such as designing the aggregation to satisfy the associativity and identity laws so it could be parallelized safely.

The project was handed over to me by two permanent employees who were leaving the company. In nine months, I built the complete code pipeline and solved the most complex parts of the project, which was required to bring the performance up. Measured against the same reference data, my overall solution was at least 20 times faster than any other.

I heavily used functional programming concepts, algorithmic thinking, and comprehensive testing strategies for this project.
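To illustrate the associativity and identity laws mentioned above, here is a minimal sketch (all names and data are hypothetical, not from the actual project): an aggregation that forms a monoid, an associative combine plus an identity element, can be computed independently per partition and merged in any order, which is what makes it safe to distribute.

```scala
// Hypothetical sketch: a per-key event count that forms a monoid,
// so per-partition partial results can be merged in any grouping
// and still agree with a single sequential pass.
object MonoidAggregation {
  // combine is associative, and Map.empty is its identity element.
  def combine(a: Map[String, Int], b: Map[String, Int]): Map[String, Int] =
    b.foldLeft(a) { case (acc, (k, v)) => acc.updated(k, acc.getOrElse(k, 0) + v) }

  // The per-partition aggregation: count occurrences of each key.
  def countEvents(events: Seq[String]): Map[String, Int] =
    events.groupBy(identity).map { case (k, v) => k -> v.size }

  def main(args: Array[String]): Unit = {
    val events = Seq("p1", "p2", "p1", "p3", "p2", "p1")
    // Simulate distributed execution: split into partitions, aggregate
    // each partition locally, then merge the partial results.
    val partials = events.grouped(2).toSeq.map(countEvents)
    val distributed = partials.foldLeft(Map.empty[String, Int])(combine)
    // The laws guarantee agreement with the sequential computation.
    assert(distributed == countEvents(events))
    println(distributed)
  }
}
```

The same shape is what Spark's `reduceByKey`-style operators rely on: as long as the combine function is associative (and has an identity), the partitioning of the data cannot change the answer.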

Legacy Data Pipeline Migration to Apache Spark and Hadoop-based Generic Framework

I migrated a complex big data ETL pipeline from an MSSQL-based pipeline to IQVIA's Hadoop-based P360G platform for pharmaceutical clients in France. In this project, basic extracts were implemented with Impala queries and complex extracts were handled with Spark jobs. I delivered and productionized the first two phases (R1 and R2) of the project successfully. There was no up-to-date documentation for the legacy project, and since all complex logic was handled in SQL, the solutions that needed migration were intricate.

I overcame all the difficulties by:
- Reverse engineering the legacy code both by interpreting it and pulling features out of the resulting User Acceptance Test data with an input-output analysis.
- Keeping up-to-date documentation of new Spark and Impala jobs.
- Communicating with all clients frequently and efficiently.
- Writing property, unit, and integration tests, implemented with random data generators, both unconstrained and with imposed conditions.
- Reimplementing complex extracts as Spark jobs that combined Dataset SQL with per-partition Scala algorithms.
- Conducting a meticulous UAT discrepancy analysis.
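The property-testing approach described above can be sketched as follows, using only the standard library (the names, data shape, and invariants here are hypothetical stand-ins; the real project validated migrated Spark and Impala jobs against legacy outputs and used dedicated generator tooling): generate many random inputs, optionally under imposed conditions, and assert invariants that the migrated implementation must preserve.

```scala
import scala.util.Random

// Hypothetical sketch of property-based checking with random data
// generators, reduced to a pure function for illustration.
object MigrationPropertyCheck {
  // "Migrated" implementation under test: total amount per product.
  def totalsPerProduct(rows: Seq[(String, Int)]): Map[String, Int] =
    rows.groupBy(_._1).map { case (k, v) => k -> v.map(_._2).sum }

  // Generator with an imposed condition: amounts are non-negative.
  def genRows(rng: Random, n: Int): Seq[(String, Int)] =
    Seq.fill(n)((s"product-${rng.nextInt(5)}", rng.nextInt(100)))

  def main(args: Array[String]): Unit = {
    val rng = new Random(42) // fixed seed keeps failures reproducible
    for (_ <- 1 to 100) {
      val rows = genRows(rng, rng.nextInt(50))
      val totals = totalsPerProduct(rows)
      // Invariant: grouping must preserve the grand total.
      assert(totals.values.sum == rows.map(_._2).sum)
      // Invariant: exactly the input products appear in the output.
      assert(rows.map(_._1).toSet == totals.keySet)
    }
    println("all properties held")
  }
}
```

In practice a library such as ScalaCheck supplies the generators, shrinking, and reporting, but the core idea is the same: invariants checked over randomized inputs catch migration discrepancies that hand-picked fixtures miss.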

Data Pipeline for a Big Retail Client

I ingested ShopDirect's general business data into a data lake in the AWS environment. The major achievement was driving the whole data-pipeline integration end to end across different data storage zones, which included recognizing the need, unstated in the requirements, to snapshot the mutable database sources. Another challenge was joining big data with a smaller but still large data source using custom partitioning and grouping techniques in Apache Spark.
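One common custom partitioning technique for such skewed joins is key salting, sketched below with plain collections (all names and data are hypothetical; the real job used Apache Spark partitioners): the hot key on the large side is split across several salted buckets, and the smaller side is replicated once per salt so the join still matches.

```scala
import scala.util.Random

// Hypothetical sketch of key salting to spread a skewed join key
// across partitions, simulated here without a Spark cluster.
object SaltedJoin {
  val Salts = 4

  // Salt the skewed side's key so one hot key spreads over Salts buckets.
  def salt(key: String, rng: Random): String = s"$key#${rng.nextInt(Salts)}"

  // Replicate the small side once per salt so every salted key matches.
  def replicate(small: Map[String, String]): Map[String, String] =
    (for ((k, name) <- small.toSeq; s <- 0 until Salts)
      yield s"$k#$s" -> name).toMap

  def main(args: Array[String]): Unit = {
    val rng = new Random(0)
    // Large, skewed side: almost all rows share the hot key "uk".
    val big = Seq.fill(1000)(("uk", rng.nextInt(10))) ++ Seq(("fr", 1), ("de", 2))
    val small = Map("uk" -> "United Kingdom", "fr" -> "France", "de" -> "Germany")

    val lookup = replicate(small)
    // Join on the salted key; the work for "uk" is now spread over
    // Salts buckets instead of landing in a single partition.
    val joined = big.map { case (k, v) => (lookup(salt(k, rng)), v) }

    assert(joined.size == big.size) // the join preserves every big-side row
  }
}
```

In Spark terms the same idea applies before a `join` or `groupByKey`: salting trades a small amount of replication on the small side for even partition sizes on the large side.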
Education

1993 - 1998

Bachelor's Degree in Electrical Engineering

Yildiz Technical University - Istanbul

Certifications

FEBRUARY 2017 - PRESENT

Team Mentoring

Coursera, Inc.

JUNE 2015 - PRESENT

Oracle Certified Professional

Oracle

FEBRUARY 2015 - PRESENT

Oracle Certified Associate

Oracle

Skills

Libraries/APIs

Akka Streams, RESTEasy, ATL

Tools

Cloudera, SBT, DataNucleus, Kafka Streams, ActiveMQ, Amazon Elastic MapReduce (EMR), Apache Impala

Languages

Scala, Java, JavaScript, C, Perl, C++, C#.NET

Frameworks

Apache Spark, Hadoop, AngularJS, Apache Camel, Qt, ASP.NET

Paradigms

Agile, Web Architecture, Scrum

Platforms

Ubuntu Linux, Linux, MacOS, Apache Kafka, Linux RHEL/CentOS, Amazon Web Services (AWS), AWS Lambda, Docker, Solaris

Storage

PostgreSQL, Oracle RDBMS, MySQL, Apache Hive, PostGIS

Other

Engineering, Team Mentoring, Big Data, Big Data Architecture, Architecture, Waterfall Methodology, Akka HTTP, Akka Actors, SAP Cross-Application Time Sheet (CATS), Amazon Kinesis, OpenMQ, TCP/IP, IP Networks, HP OpenView
