Filip Hoffmann

Verified Expert in Engineering

Back-end Developer

Location
Warsaw, Poland
Toptal Member Since
September 21, 2022

Filip is a senior back-end developer with over five years of professional experience. He has built back ends with Flask, Django, and Pyramid, specializing in data-intensive applications. In his work, Filip follows best practices like test-driven development (TDD) and domain-driven design (DDD) to deliver reliable and maintainable software.

Portfolio

ATC Research
Python, Apache Airflow, Web Scraping, ETL, PostgreSQL, Docker...
Snowflake
TypeScript, Snowflake, Big Data, API Integration, Integration, Databases, SQL...
Polidea
Python, Bazel, Docker, Java, Google Cloud Platform (GCP), Kubernetes, Back-end...

Experience

Availability

Part-time

Preferred Environment

MacOS, Slack, Python 3, Flask, SQL, Snowflake, PostgreSQL, Docker, Linux, SQLAlchemy

The most amazing...

...thing I've developed is a universal website testing tool that reduces the need for manual front-end testing of websites.

Work Experience

Python Developer

2022 - 2023
ATC Research
  • Created and maintained a data scraping infrastructure in the form of Apache Airflow DAGs deployed on Amazon Managed Workflows for Apache Airflow (see the sketch after this entry).
  • Improved application testability by creating more than 100 new tests, covering new and existing code.
  • Introduced several good practices into the project, including automatic Black formatting checks in a GitHub CI pipeline and isolation of SQLAlchemy entities from the business code.
Technologies: Python, Apache Airflow, Web Scraping, ETL, PostgreSQL, Docker, Amazon S3 (AWS S3), AWS Lambda, Data Warehousing, Back-end Development, Back-end Performance, Back-end Architecture, Data Analytics, PIP, Amazon EC2
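
As an illustration of the scraping DAGs mentioned in this entry, here is a minimal sketch in Airflow's TaskFlow style. The endpoint, table name, connection ID, and task split are assumptions made for the example, not ATC Research's actual pipeline.

    # Hypothetical scraping DAG in the TaskFlow style (Airflow 2.4+).
    # The endpoint, table, and connection name below are assumptions.
    from datetime import datetime

    import requests
    from airflow.decorators import dag, task
    from airflow.providers.postgres.hooks.postgres import PostgresHook

    @dag(schedule="@daily", start_date=datetime(2022, 1, 1), catchup=False)
    def scrape_listings():
        @task
        def extract() -> list[dict]:
            # Pull raw records from an assumed JSON endpoint.
            resp = requests.get("https://example.com/api/listings", timeout=30)
            resp.raise_for_status()
            return resp.json()

        @task
        def load(records: list[dict]) -> None:
            # Write into PostgreSQL through an Airflow connection;
            # the connection ID "warehouse" is an assumption.
            hook = PostgresHook(postgres_conn_id="warehouse")
            hook.insert_rows(
                table="listings_raw",
                rows=[(r["id"], r["title"]) for r in records],
            )

        load(extract())

    scrape_listings()

On MWAA, a file like this placed in the environment's DAGs folder on Amazon S3 is picked up automatically.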

Software Developer

2021 - 2022
Snowflake
  • Developed and tested features for the Snowflake platform.
  • Tracked and fixed bugs reported by clients via Slack.
  • Helped with the recruitment and onboarding of new team members.
Technologies: TypeScript, Snowflake, Big Data, API Integration, Integration, Databases, SQL, JavaScript, OLAP, Object-oriented Programming (OOP), Domain-driven Design (DDD), Pytest, NPM, Jest, Kubernetes, Node.js, Back-end Development, Back-end Performance, Data Warehousing, Back-end Architecture, Data Analytics, Software Packaging, PIP

Python Developer

2020 - 2021
Polidea
  • Developed scraper-based website testing tools that reduced the need for manual website testing.
  • Found and fixed bugs in the Apache Knox rewrite rules.
  • Contributed to the development of a Thrift-to-gRPC proxy.
  • Contributed to the Apache Airflow open source project by writing an Azure Files to Google Cloud Storage (GCS) transfer operator.
Technologies: Python, Bazel, Docker, Java, Google Cloud Platform (GCP), Kubernetes, Back-end, Object-oriented Programming (OOP), Groovy, Groovy Scripting, Spock Framework, Spock, Apache Thrift, gRPC, Python 3, MacOS, Web Scraping, Web Crawlers, Scraping, Scrapy, Apache Airflow, Back-end Development, Back-end Performance, ETL, Back-end Architecture, Open Source, PIP

Python Developer

2019 - 2020
Growbots
  • Wrote Airflow DAGs to move data from PostgreSQL to BigQuery.
  • Helped maintain the data-gathering infrastructure.
  • Improved the test suite speed by 50% by reusing the existing PostgreSQL Docker image instead of creating a new one from scratch every time (see the fixture sketch after this entry).
Technologies: Python, Google BigQuery, Google Cloud Platform (GCP), Nomad, Docker, Django, Flask, PySpark, Apache Airflow, PostgreSQL, Pub/Sub, Back-end, RESTful Microservices, Microservices, SQL, Big Data, OLAP, OLTP, Makefile, Grafana, Domain-driven Design (DDD), Object-relational Mapping (ORM), SQLAlchemy, Object-oriented Programming (OOP), Python 3, Pytest, Linux, Bash Script, Web Scraping, REST, Web Crawlers, Scraping, APIs, RabbitMQ, REST APIs, Spark, Apache Spark, GitLab CI/CD, CI/CD Pipelines, Microservices Architecture, Back-end Development, Back-end Performance, Back-end Architecture, Data Analytics, PIP, BigQuery
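
One plausible shape of the test-suite speedup from the last bullet, sketched as pytest fixtures: keep a single PostgreSQL container alive for the whole session and merely reset state between tests, rather than provisioning a fresh database every time. The container name, image tag, port, and table are assumptions, not Growbots' actual setup.

    # Illustrative pytest fixtures: one PostgreSQL container per test
    # session, cheap cleanup between tests. All names are assumptions.
    import subprocess
    import time

    import psycopg2
    import pytest

    DSN = "dbname=test user=postgres password=test host=localhost port=55432"

    @pytest.fixture(scope="session")
    def pg_container():
        # Start one disposable container for the entire test run.
        subprocess.run(
            ["docker", "run", "-d", "--rm", "--name", "test-pg",
             "-e", "POSTGRES_PASSWORD=test", "-e", "POSTGRES_DB=test",
             "-p", "55432:5432", "postgres:13"],
            check=True,
        )
        # Wait until the server accepts connections.
        for _ in range(30):
            try:
                psycopg2.connect(DSN).close()
                break
            except psycopg2.OperationalError:
                time.sleep(1)
        yield
        subprocess.run(["docker", "stop", "test-pg"], check=True)

    @pytest.fixture
    def db(pg_container):
        # Reuse the running server; truncating tables between tests is
        # far cheaper than booting a new container for each one.
        conn = psycopg2.connect(DSN)
        yield conn
        with conn.cursor() as cur:
            cur.execute("TRUNCATE TABLE events")  # assumed table name
        conn.commit()
        conn.close()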

Python Developer

2017 - 2019
Daftcode
  • Created an SMS-responding microservice from scratch based on the Pyramid framework (a minimal sketch follows this entry).
  • Maintained and wrote technical documentation for the mobile subscription and SMS-sending APIs.
  • Contributed to the overhaul of a production server procedure that sped up the recovery of a failed EC2 instance by 75%.
Technologies: Python, Pyramid, Ansible, NGINX, Docker, PostgreSQL, Grafana, Terraform, APIs, Back-end, SQL, Amazon Web Services (AWS), OLTP, Object-oriented Programming (OOP), Object-relational Mapping (ORM), API Integration, Integration, Python 3, SQLAlchemy, Pytest, Linux, Bash Script, REST, REST APIs, GitLab CI/CD, CI/CD Pipelines, Back-end Development, Back-end Performance, Amazon S3 (AWS S3), Back-end Architecture, PIP, Amazon EC2
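
A minimal sketch of an SMS-responding Pyramid service, assuming a hypothetical /sms webhook that receives the sender and message body as JSON; the real service's routes, gateway integration, and business rules are outside the scope of this profile.

    # Minimal Pyramid app: an assumed /sms webhook answering an
    # incoming text message. Route and payload shape are hypothetical.
    from wsgiref.simple_server import make_server

    from pyramid.config import Configurator

    def reply_to_sms(request):
        # The SMS gateway is assumed to POST the sender and body as JSON.
        sender = request.json_body["from"]
        body = request.json_body["body"].strip().lower()
        reply = "You have been unsubscribed." if body == "stop" else "Thanks!"
        return {"to": sender, "message": reply}

    if __name__ == "__main__":
        with Configurator() as config:
            config.add_route("sms", "/sms")
            config.add_view(reply_to_sms, route_name="sms",
                            renderer="json", request_method="POST")
            app = config.make_wsgi_app()
        make_server("0.0.0.0", 6543, app).serve_forever()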

Tech Support

2016 - 2017
Dealavo
  • Created an app that automated the previously manual client file verification process.
  • Wrote and maintained configuration scripts for static website data scrapers.
  • Wrote custom scrapers to extract product data from dynamic websites (see the sketch after this entry).
  • Onboarded, tutored, and supervised new team members.
Technologies: Scrapy, Web Crawlers, Scraping, Web Scraping, Selenium, Linux, Python, Python 2, Flask, Regex, XPath, PostgreSQL, SQLAlchemy, PIP
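
A small sketch of the dynamic-website scraping approach from this role: Selenium renders the JavaScript, then XPath extracts the product fields. The URL and XPath expressions are hypothetical; real product pages would need their own selectors.

    # Illustrative dynamic-page scraper: Selenium + XPath.
    # The URL and XPath selectors below are placeholders.
    from selenium import webdriver
    from selenium.webdriver.common.by import By
    from selenium.webdriver.support import expected_conditions as EC
    from selenium.webdriver.support.ui import WebDriverWait

    def scrape_product(url: str) -> dict:
        options = webdriver.ChromeOptions()
        options.add_argument("--headless=new")
        driver = webdriver.Chrome(options=options)
        try:
            driver.get(url)
            # Wait for the JavaScript-rendered price element to appear.
            WebDriverWait(driver, 10).until(
                EC.presence_of_element_located(
                    (By.XPATH, "//span[@class='price']"))
            )
            return {
                "name": driver.find_element(By.XPATH, "//h1").text,
                "price": driver.find_element(
                    By.XPATH, "//span[@class='price']").text,
            }
        finally:
            driver.quit()

    if __name__ == "__main__":
        print(scrape_product("https://example.com/product/123"))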

Activity Registrator

https://github.com/pnh-activity-registrator
Pilkanahali (PNH) is a Polish web platform that gathers sports enthusiasts and enables them to team up and play soccer, basketball, volleyball, and other team sports. Users must register for every single game, and at popular locations, many people compete for the spots. I love playing football and was frustrated that signing up for a game in my neighborhood was always difficult due to the number of interested players. Therefore, I wrote an application that waits for new sign-ups to be posted and registers me automatically.

The application was built, pushed to a Docker Hub repository, and run on a GCP instance created with a Terraform script. It queried the PNH portal for new events within a specified date range and location and, if there were available spots, signed the user up. A simplified sketch of the polling logic follows.
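
The PNH endpoints, payload fields, and authentication in this sketch are assumptions, since the portal's API is not documented here.

    # Simplified polling loop in the spirit of the registrator.
    # Base URL, endpoints, and payload fields are assumptions.
    import time

    import requests

    API = "https://example-pnh.pl/api"  # placeholder base URL

    def find_open_game(session, location, date_from, date_to):
        resp = session.get(f"{API}/events", params={
            "location": location, "from": date_from, "to": date_to,
        }, timeout=10)
        resp.raise_for_status()
        for event in resp.json():
            if event.get("free_spots", 0) > 0:
                return event
        return None

    def run(location, date_from, date_to, poll_seconds=60):
        with requests.Session() as session:
            while True:
                game = find_open_game(session, location, date_from, date_to)
                if game is not None:
                    session.post(f"{API}/events/{game['id']}/signup",
                                 timeout=10).raise_for_status()
                    print(f"Signed up for event {game['id']}")
                    return
                time.sleep(poll_seconds)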

Failover Procedure for Subscription Processing API

The subscription processing API was deployed as a Pyramid application on two EC2 instances sitting behind a load balancer. My team had experienced some random EC2 instance crashes before, so another senior developer and I created a failover procedure that destroyed the malfunctioning server and set up a new one bound to the same elastic IP address the crashed server had used. We described the failover instance in Terraform but left its code commented out. In case of a failure, the on-duty developer commented out the code for the crashed instance, uncommented the code for the failover instance, reran the Terraform script, and deployed the application to the new server with the standard deployment procedure. The whole procedure was documented and easy to follow, reducing the possibility of human error during a server crash.

In this project, I was responsible for describing the failover infrastructure in the Terraform scripts and writing the manual for the on-duty developer.

The procedure proved very useful, as one of our servers crashed the day after we finished it. We got it back online in around 10 minutes.

Scraper-based Website Testing Tool

The client needed a universal tool that would automatically verify that all of the clickable links on their websites return a 200 status code, so they could check that a website still worked correctly after back-end reworks. Two other developers and I created a Python application that used Scrapy and pytest to recursively go through all of the URLs on a website, starting from the root URL, and to return a test report mapping each URL to its response code. As some of the client's websites were difficult to scrape fully automatically, we made test cases extensible with a setup scenario: a set of requests sent to the website before Scrapy started its run, preparing the website and its content (for example, creating a new user and submitting some data).
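
A condensed sketch of the crawl component of such a tool, written as a Scrapy spider; the class and field names are illustrative, not the client tool's actual code.

    # Follow every same-site link from the root URL and record each
    # URL's response status. Names here are illustrative only.
    from urllib.parse import urlparse

    import scrapy

    class LinkCheckSpider(scrapy.Spider):
        name = "linkcheck"
        # Let non-200 responses reach parse() so they end up in the report.
        custom_settings = {"HTTPERROR_ALLOW_ALL": True}

        def __init__(self, root_url, **kwargs):
            super().__init__(**kwargs)
            self.start_urls = [root_url]
            self.allowed_domains = [urlparse(root_url).netloc]

        def parse(self, response):
            # Emit the URL -> status pair the test report is built from.
            yield {"url": response.url, "status": response.status}
            # Recurse into links, but only on HTML pages.
            if b"text/html" in response.headers.get("Content-Type", b""):
                for href in response.css("a::attr(href)").getall():
                    yield response.follow(href, callback=self.parse)

Running something like scrapy runspider link_check.py -a root_url=https://example.com -O report.json produces the URL-to-status mapping, and pytest can then assert that every recorded status is 200.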

In this project, I was responsible for designing the whole application from scratch, implementing the scraper, creating the Bazel build script, and helping my colleagues whenever necessary.

Libraries/APIs

SQLAlchemy, Requests, REST APIs, Node.js, PySpark, Asyncio, Python Asyncio

Tools

Pytest, Bazel, Apache Airflow, Terraform, NPM, Ansible, NGINX, Grafana, Makefile, RabbitMQ, GitLab CI/CD, BigQuery

Frameworks

Flask, Pyramid, Scrapy, Jest, Selenium, Django, Spock Framework, Spock, Apache Thrift, gRPC, Spark, Apache Spark

Languages

SQL, Snowflake, TypeScript, Python, Python 3, JavaScript, Java, Bash Script, Groovy, Python 2, Regex, XPath

Paradigms

Object-relational Mapping (ORM), Object-oriented Programming (OOP), REST, Microservices, OLAP, Microservices Architecture, ETL, Back-end Architecture

Platforms

Docker, Linux, Amazon Web Services (AWS), Amazon EC2, MacOS, Google Cloud Platform (GCP), Kubernetes, AWS Lambda

Storage

PostgreSQL, Databases, OLTP, Amazon S3 (AWS S3)

Other

APIs, Back-end, API Integration, Web Scraping, Scraping, Back-end Development, Back-end Performance, RESTful Microservices, Web Crawlers, Data Warehousing, PIP, Google BigQuery, Nomad, Pub/Sub, Networks, Big Data, Integration, Domain-driven Design (DDD), Groovy Scripting, CI/CD Pipelines, Data Analytics, Open Source, Software Packaging
