
Piotr Pietruszka

Verified Expert in Engineering

Database Developer

Location
Kraków, Poland
Toptal Member Since
February 5, 2021

Piotr is a database developer with 12 years of experience in business intelligence projects as a back- and front-end developer. He designed and developed SQL ETL jobs to migrate a financial system from Oracle to SAP at the European Space Agency. Piotr excels in Oracle databases, SQL, ETL process development, and the creation of high-quality reports. For over three years, he has worked on big data projects, building data pipelines with Apache Spark.

Portfolio

Teamstand
Amazon Web Services (AWS), Data Modeling
Brown Brothers Harriman
Big Data, Hadoop, Spark, SQL, Java, Spring, Solr, Cloudera, OpenShift, Docker...
Brown Brothers Harriman
IBM Cognos, SQL, Data Visualization

Experience

Availability

Part-time

Preferred Environment

Ab Initio, Talend, IBM Cognos, Qlik Sense, Oracle, Amazon Web Services (AWS), Spark SQL

The most amazing...

...project I've worked on was the LivingWallet application, which tracks private investments and produces useful insights into investor asset portfolios.

Work Experience

Configuration Specialist

2021 - 2022
Teamstand
  • Configured multiple database tenants to serve customer needs in particular business areas (project management, task collaboration); a schema-per-tenant sketch follows this entry.
  • Communicated with product development leads to turn business requirements into a working application configuration.
  • Supported SQL code deployments to the acceptance and production environments.
Technologies: Amazon Web Services (AWS), Data Modeling
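
A common way to configure tenant isolation of this kind is a schema-per-tenant layout. The sketch below is an assumption about the general shape, not Teamstand's actual configuration; it uses PostgreSQL-style SQL, and all object names (tenant_acme, acme_app) are illustrative.

    -- Hypothetical schema-per-tenant provisioning; all names are illustrative.
    CREATE SCHEMA tenant_acme;

    CREATE TABLE tenant_acme.projects (
        project_id BIGSERIAL PRIMARY KEY,
        name       VARCHAR(200) NOT NULL,
        created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
    );

    -- The tenant's application role can only reach its own schema.
    CREATE ROLE acme_app LOGIN;
    GRANT USAGE ON SCHEMA tenant_acme TO acme_app;
    GRANT SELECT, INSERT, UPDATE, DELETE
        ON ALL TABLES IN SCHEMA tenant_acme TO acme_app;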

Data Engineer

2018 - 2020
Brown Brothers Harriman
  • Developed a self-service data lake platform that enabled multiple company departments to onboard and analyze their data. The solution was based on Hadoop and Apache Spark running on an on-premises Cloudera platform.
  • Assisted with POC projects to evaluate and compare big data tools (Sqoop, Kafka Connect, Spark) in terms of data ingestion capabilities. Loaded over 120 GB of corporate data into an Apache Ignite database and ran complex SQL queries to test performance (a representative sketch follows this entry).
  • Deployed over 20 Java and Python applications to a Red Hat OpenShift cluster. Developed a generic CI/CD pipeline using Jenkins and an OpenShift plugin to onboard new applications.
Technologies: Big Data, Hadoop, Spark, SQL, Java, Spring, Solr, Cloudera, OpenShift, Docker, Data Pipelines, Data Architecture, Snowflake
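
As an illustration of the ingestion-benchmark style used in the POCs, here is a minimal Spark SQL sketch; the table and column names are assumptions, not the actual corporate schema.

    -- Materialize raw staging data as Parquet in the lake (names illustrative).
    CREATE TABLE lake.trades_raw
    USING PARQUET
    AS SELECT * FROM staging.trades_csv;

    -- A representative "complex query" for timing comparisons across engines.
    SELECT trade_date,
           counterparty,
           SUM(notional) AS total_notional
    FROM   lake.trades_raw
    WHERE  trade_date BETWEEN DATE '2019-01-01' AND DATE '2019-12-31'
    GROUP  BY trade_date, counterparty
    ORDER  BY total_notional DESC
    LIMIT  100;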

BI Developer

2016 - 2018
Brown Brothers Harriman
  • Developed financial reports for the Custody and Foreign Exchange departments. Transaction and balance statements were sent to clients daily. The transaction data volume exceeded 1.5 billion records, which was challenging from a performance perspective (see the sketch after this entry).
  • Cooperated with the Infrastructure team to deliver reports to production systems while maintaining the company's SDLC best practices.
  • Produced Qlik Sense dashboards and data presentations for high-level management providing insight into company P&L analysis.
Technologies: IBM Cognos, SQL, Data Visualization
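
At that volume, the usual tactic is to partition the transaction table by booking date so a daily statement extract touches a single partition. This is a hedged Oracle-style sketch with illustrative names, not the bank's actual schema.

    -- Daily transaction statement extract; partition pruning on booking_date
    -- keeps the scan to one day's data (all names are illustrative).
    SELECT account_id,
           transaction_ref,
           trade_amount,
           currency
    FROM   custody.transactions
    WHERE  booking_date = DATE '2018-03-01'
    ORDER  BY account_id, transaction_ref;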

Software Engineer

2014 - 2015
EPAM Systems
  • Developed PL/SQL ETL processes to load a data mart for compliance purposes at a financial institution (a sketch follows this entry). Worked in a cross-functional Scrum team to deliver the implementation, test scripts, documentation, and deployment scripts.
  • Participated in implementations at the client site to deploy new releases to production. This included assistance during UAT and providing test evidence and required documentation before each release.
  • Created high-level interactive dashboards for top management in QlikView 11. The analytics platform was used to check corporate data on over 67,000 employees in 50 countries for breaches of pre-defined compliance policies.
Technologies: QlikView 11, Oracle PL/SQL, ETL, Data Engineering
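
The data mart loads were PL/SQL of roughly the following shape. This is a minimal sketch with hypothetical object names (mart, stg, dim schemas), not the client's actual code.

    -- Illustrative set-based load of a compliance data mart.
    BEGIN
      INSERT /*+ APPEND */ INTO mart.fact_policy_breach
             (employee_id, policy_id, breach_date)
      SELECT e.employee_id,
             p.policy_id,
             s.event_date
      FROM   stg.compliance_events s
      JOIN   dim.employees e ON e.source_id   = s.employee_source_id
      JOIN   dim.policies  p ON p.policy_code = s.policy_code
      WHERE  s.load_date = TRUNC(SYSDATE);
      COMMIT;
    END;
    /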

Senior Software Engineer

2010 - 2014
Informatica
  • Participated in a full-lifecycle data warehouse project to build a BI platform for a large European chemical company. This included a data warehouse design targeted at the reporting requirements of the sales, stock, and forecasting areas.
  • Migrated legacy ETL scripts to the IBM DataStage ETL tool. Several SQL scripts populating transaction tables were converted into DataStage jobs, and an additional auditing and monitoring layer was added in the process.
  • Produced high-quality reports for the sales department in the MicroStrategy reporting tool. The sales fact table contained over 400 million records; therefore, some of the self-service analysis was based on an OLAP cube refreshed daily.
  • Developed Ab Initio jobs to decode/encode POS transaction files from delimited to XML format according to the client specification. The ETL job processed over 10,000 files daily from multiple stores and applied complex business and data-cleaning rules (an illustrative SQL/XML rendering follows this entry).
Technologies: MicroStrategy, IBM InfoSphere (DataStage), Ab Initio, ETL, Data Engineering, Star Schema
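
The actual transformation was an Ab Initio graph; as an illustration only, the same delimited-to-XML mapping can be expressed in Oracle SQL/XML. The field names are assumptions, not the client's specification.

    -- Render staged POS records as XML (illustrative schema, not the client spec).
    SELECT XMLSERIALIZE(CONTENT
             XMLELEMENT("transaction",
               XMLFOREST(t.store_id AS "storeId",
                         t.pos_time AS "timestamp",
                         t.amount   AS "amount",
                         t.currency AS "currency"))
             AS CLOB INDENT) AS xml_record
    FROM   stg.pos_transactions t;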

Senior Software Engineer

2007 - 2009
Accenture
  • Assisted in maintaining an energy company's data warehouse, monitored the execution of Oracle Warehouse Builder ETL jobs, and produced data quality scripts to verify DWH integrity (sketched after this entry). Applied a performance-tuning fix that reduced load time from four hours to 1.5 hours.
  • Participated in an SOA integration project covering CRM and provisioning systems for one of the largest internet, mobile, and TV providers in the Netherlands. The implementation significantly reduced the client's service delivery time.
  • Designed and developed SQL ETL jobs to migrate a financial system from Oracle to SAP at the European Space Agency. The project involved the design of complex ETL rules and close cooperation with SAP experts to meet functional requirements.
Technologies: SQL, Service-oriented Architecture (SOA), Oracle Warehouse Builder (OWB), ETL, Data Engineering
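
Data quality scripts of this kind typically assert referential integrity between fact and dimension tables. Below is a minimal sketch under assumed table names, not the client's actual schema.

    -- Flag fact rows whose customer key has no matching dimension row.
    SELECT f.customer_key,
           COUNT(*) AS orphan_rows
    FROM   dwh.fact_energy_usage f
    LEFT   JOIN dwh.dim_customer d
           ON d.customer_key = f.customer_key
    WHERE  d.customer_key IS NULL
    GROUP  BY f.customer_key;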

LivingWallet

https://github.com/pete-parsley/livingwallet
LivingWallet is a private investment and finance tracking application. It lets users manually enter the investment and spending transactions they make; optionally, batch jobs can load multiple transactions from banking systems and stock markets. All pricing information is computed automatically from the currency exchange rates on the transaction date. Daily balances are calculated by a Spark job reading data from a time-series database (a sketch follows). The application provides a web UI based on the Thymeleaf template engine and enables interactive data analysis using Qlik Sense.
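
A minimal Spark SQL sketch of the daily-balance computation, assuming transactions exported from the time-series store into a queryable view; the object names and layout are illustrative, not the project's actual schema.

    -- Running balance per asset, priced with the exchange rate valid on the
    -- transaction date (wallet.transactions and wallet.fx_rates are assumed).
    SELECT t.asset_id,
           t.txn_date,
           SUM(t.quantity * t.unit_price * r.rate_to_base)
             OVER (PARTITION BY t.asset_id ORDER BY t.txn_date) AS daily_balance
    FROM   wallet.transactions t
    JOIN   wallet.fx_rates r
           ON r.currency  = t.currency
          AND r.rate_date = t.txn_date;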

DataSandbox

Datasandbox.pl is a technical blog I write to share knowledge and to document and digest what I learn in the data engineering field. There, I explore new tools and describe the projects I am working on in my free time.

Crypto Market Web Scraping App

Created a Java application to collect cryptocurrency data from web pages and REST APIs and load it into HDFS. The application was scheduled to run daily and ingested data for over 1,500 cryptocurrencies. The data was then loaded into a Hive database and analyzed with SQL queries (an illustrative example follows); the data visualization layer was built with Apache Zeppelin notebooks.
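
Once in HDFS, the data was queried through Hive. Below is an illustrative table definition and query of the kind used; the schema, partition layout, and HDFS location are assumptions.

    -- External Hive table over the ingested files (location and schema assumed).
    CREATE EXTERNAL TABLE IF NOT EXISTS crypto.daily_quotes (
        symbol     STRING,
        price_usd  DOUBLE,
        market_cap DOUBLE
    )
    PARTITIONED BY (snapshot_date STRING)
    STORED AS PARQUET
    LOCATION '/data/crypto/daily_quotes';

    -- Example analysis: top coins by peak price over a period.
    SELECT symbol, MAX(price_usd) AS peak_price
    FROM   crypto.daily_quotes
    WHERE  snapshot_date >= '2018-01-01'
    GROUP  BY symbol
    ORDER  BY peak_price DESC
    LIMIT  20;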

Languages

SQL, Java, Java 8, Snowflake

Paradigms

ETL, Database Development, Database Design, Service-oriented Architecture (SOA)

Storage

Relational Databases, Databases, Data Pipelines, Oracle PL/SQL, InfluxDB, HDFS, Apache Hive

Other

BI Reports, Data Engineering, Big Data, MicroStrategy, Software Engineering, Data Visualization, Data Architecture, Data Modeling, Star Schema, Data Analytics, Spring Boot

Frameworks

Spark, Hadoop, Spring

Tools

Qlik Sense, IBM Cognos, Ab Initio, Talend ETL, IBM InfoSphere (DataStage), Oracle Warehouse Builder (OWB), Spark SQL, Grafana, Solr, Cloudera

Platforms

Oracle, Docker, Talend, Red Hat OpenShift, QlikView 11, Amazon Web Services (AWS)

Libraries/APIs

REST APIs

Education

2006 - 2006

Exchange Program (Thesis Project on Artificial Intelligence) in Computer Engineering

Université de Technologie de Belfort-Montbéliard - Belfort, France

2001 - 2006

Master's Degree in Information Systems Engineering

AGH University of Science and Technology - Kraków, Poland

Certifications

FEBRUARY 2019 - PRESENT

Big Data Certification

Edureka

JANUARY 2015 - PRESENT

Oracle SQL Expert

Oracle
