Yuriy Markiv, Developer in Warlubie, Poland

Yuriy Markiv

Verified Expert in Engineering

Bio

Yuriy is a database developer and data engineer with 11 years of experience. He specializes in relational database concepts, estimation, implementation planning, development, integration, and deployment. He is proficient in Python, Flask, REST API, SQL, PL/SQL, Transact-SQL, Oracle, SQL Server, PostgreSQL, MySQL, Snowflake, Redshift, Hive, SAP, ETL, reverse ETL, and Unix. Yuriy is experienced in cloud computing, data modeling and quality, automated testing, data migration, and BI.

Portfolio

A German Startup
Python, Python 3, Flask, REST APIs, JavaScript, Jinja, HTML, SQLAlchemy, Pandas...
TCS / SAP Labs
Python 3, Python, Bash, Bash Script, MySQL, SQLite, Linux, DB, Data Migration...
Dressler Consulting
Sisense, Periscope, Redshift, Amazon EC2, Flask, Python 3, REST APIs...

Experience

  • SQL - 8 years
  • PL/SQL - 8 years
  • API Integration - 7 years
  • MySQL - 5 years
  • PostgreSQL - 3 years
  • Python 3 - 3 years
  • Database Migration - 3 years
  • Flask - 1 year

Availability

Part-time

Preferred Environment

Python 3, Amazon Web Services (AWS), Snowflake, PostgreSQL, Databricks, REST APIs, MySQL, Linux, Databases, Docker

The most amazing...

...thing I've coded is a risk management system for internet banking built on an Oracle database, with REST API integration using Flask and Python.

Work Experience

Senior Data and Database Engineer

2023 - PRESENT
A German Startup
  • Led the end-to-end Python, FastAPI, and Bash back-end development while managing JavaScript, Jinja2, and HTML front-end development, resulting in a highly efficient and user-friendly interface for the predictive sales software.
  • Constructed effective cloud connectors, fetching data from various sources, mapping the data, and uploading it into the REST API, thereby streamlining data integration processes.
  • Employed robust CI/CD methodologies with GitHub Actions to automate the software development process, improving development efficiency and product stability.
  • Created a Python app that simplifies the user experience by managing complex tasks such as SSH access to web apps. It combines SSH client and lightweight browser functionality, runs on minimal resources, and can connect to the cloud for heavier tasks.
Technologies: Python, Python 3, Flask, REST APIs, JavaScript, Jinja, HTML, SQLAlchemy, Pandas, SQL, MySQL, Oracle, SAP HANA, Linux, GitHub, CI/CD Pipelines, GitHub Actions, Azure, VMware, VirtualBox, Docker, Docker Compose, SAP, SSH Tunneling, PyQt 5, Database Optimization, Cloud Platforms, Dashboards, Microsoft Azure, MacOS, ETL Pipelines, Backup & Recovery, Data Extraction, CPG data, Data Management, Azure Virtual Machines, Azure Databricks, Cloud, ETL, FastAPI, REST, HTTP REST, Business Requirements, Data Transformation, Unit Testing, Pytest
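The cloud-connector work above boils down to a fetch, map, upload loop. Below is a minimal sketch of the mapping step; the field names and payload envelope are invented for illustration, not the startup's actual schema or API.

```python
# Minimal sketch of a connector's mapping step: source rows are renamed
# and coerced before being posted to a REST API. The FIELD_MAP keys are
# hypothetical SAP-style column names used only for this example.
from typing import Any

FIELD_MAP = {"KUNNR": "customer_id", "MATNR": "product_id", "MENGE": "quantity"}

def map_record(raw: dict[str, Any]) -> dict[str, Any]:
    """Rename known source fields and drop everything else."""
    out = {api_name: raw[src] for src, api_name in FIELD_MAP.items() if src in raw}
    if "quantity" in out:
        out["quantity"] = int(out["quantity"])  # assume the API expects integers
    return out

def build_payload(rows: list[dict[str, Any]]) -> dict[str, Any]:
    """Wrap mapped rows in the envelope the (hypothetical) API expects."""
    return {"records": [map_record(r) for r in rows]}
```

Keeping the mapping in one table like `FIELD_MAP` makes adding a new source column a one-line change instead of a code change scattered across the connector.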

Senior Data and Database Engineer

2022 - 2022
TCS / SAP Labs
  • Developed an integrated Python/Bash framework that significantly enhanced the efficiency of post-migration validation procedures for database servers.
  • Leveraged advanced GNU Linux tools to maintain, troubleshoot, and enhance system performance, showcasing robust technical acumen.
  • Streamlined database operations through a blend of Bash and Python scripting, substantially improving the performance and reliability of database operations.
Technologies: Python 3, Python, Bash, Bash Script, MySQL, SQLite, Linux, DB, Data Migration, Data Validation, Database Optimization, Cloud Platforms, Microsoft Azure, Data Extraction, Data Management, Azure Databricks, Cloud, Data Transformation, Unit Testing, Pytest
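Post-migration validation of the kind described above usually reduces to comparing row counts and an order-independent fingerprint of each table between the old and new servers. A self-contained sketch, assuming rows arrive as iterables (real code would stream them from two database cursors):

```python
# Sketch of a post-migration table check: count rows and XOR the
# SHA-256 hashes of each row so the result is order-independent.
import hashlib

def table_fingerprint(rows):
    """Return (row count, order-independent digest of the rows)."""
    count, digest = 0, 0
    for row in rows:
        h = hashlib.sha256(repr(tuple(row)).encode()).digest()
        digest ^= int.from_bytes(h, "big")  # XOR is commutative, so order doesn't matter
        count += 1
    return count, digest

def validate_migration(source_rows, target_rows):
    """True when both sides hold the same rows, in any order."""
    return table_fingerprint(source_rows) == table_fingerprint(target_rows)
```

Note that XOR cancels out duplicate row pairs, so a production check would also compare per-column aggregates or sorted samples.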

Senior Database Administrator | Data Engineer

2020 - 2022
Dressler Consulting
  • Consolidated data from microservices into the DWH, created BI dashboards, and tuned DWH objects to enable caching.
  • Created users, set access rights on different environments, and performed Amazon Relational Database Service replication.
  • Handled performance optimization and looked for missing indexes to optimize SQL queries.
  • Checked data quality and integrity to ensure we provide valid data for business users.
  • Created database functions for financial forecasting that business users could use out of the box.
  • Migrated from Redshift/Periscope into Snowflake/Power BI.
  • Created a custom Python-based ETL process to handle deleted records under key-based replication and run cross-instance data quality checks (PostgreSQL vs. Redshift and PostgreSQL vs. Snowflake).
  • Built a data pipeline alert process based on Python for notifying users about various events in the database, integrated with Mailjet (email service) and Slack.
  • Worked with Databricks and created about 60 data pipelines to fetch, transform, check, and ingest data. Worked with Snowflake and supported its data ingestion.
  • Created a basic web API based on Flask and Google Cloud. Worked on a HubSpot REST API integration using Flask/Python.
Technologies: Periscope, Sisense, Redshift, Amazon EC2, Flask, Python 3, REST APIs, API Integration, PostgreSQL, Snowflake, ETL, Talend Stitch, Data Warehousing, Database Architecture, Linux, Bash Script, Databricks, Jupyter Notebook, Pandas, Google Cloud, Microsoft Power BI, Business Intelligence (BI), BI Reports, Site Reliability Engineering (SRE), Data Engineering, Azure, Stored Procedure, SQL Stored Procedures, SQL DML, Data Queries, Machine Learning Operations (MLOps), Data Pipelines, Apache Hive, PySpark, Data Build Tool (dbt), CI/CD Pipelines, ELT, Data Lakes, Shell Scripting, Relational Databases, Database Backups, RDBMS, Ubuntu, Git, Google Sheets API, PL/pgSQL, Spark SQL, Spark, Database Administration (DBA), Data, Reporting, Databases, Query Optimization, ETL Tools, Scripting Languages, Scala, User-defined Functions (UDF), Data Modeling, Data Architecture, Amazon S3 (AWS S3), Report Development, Email, HTML, IMAP, HubSpot, HubSpot CRM, RESTful Microservices, Google Cloud Platform (GCP), Architecture, Database Optimization, Cloud Platforms, Dashboards, ETL Pipelines, Performance Tuning, Backup & Recovery, Data Extraction, Data Management, Azure Databricks, Cloud, REST, HTTP REST, Business Requirements, Data Transformation, Pytest, Real Estate
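Key-based (incremental) replication copies new and changed rows by a watermark column, so hard deletes on the source are never replicated; a periodic primary-key comparison catches them. A minimal sketch of that comparison, with the key sets passed in as plain iterables (a real job would fetch them with two database cursors):

```python
# Sketch of deleted-record detection for key-based replication:
# compare the primary keys present on the source against those on the
# replica and report the drift in both directions.

def find_replication_drift(source_keys, replica_keys):
    """Return keys deleted on the source but still in the replica,
    and keys present on the source but missing from the replica."""
    src, rep = set(source_keys), set(replica_keys)
    return {
        "stale_in_replica": sorted(rep - src),    # candidates for deletion
        "missing_in_replica": sorted(src - rep),  # candidates for backfill
    }
```

The same two-set comparison doubles as a cheap cross-instance data quality check when run table by table.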

SQL Developer

2020 - 2020
Eleks
  • Worked extensively on synthetic data and data obfuscation. Populated large numbers of tables with synthetic and obfuscated data—millions of records—while keeping some records real so that the company's clients could continue training without exposing personal data.
  • Created advanced batch scripts to handle data sync.
  • Developed automated testing based on tSQLt to see data integrity on each release.
Technologies: SQL Server DBA, SQL Server Integration Services (SSIS), XML, Windows PowerShell, Regex, tSQLt, Python, T-SQL (Transact-SQL), SQL, SQL DML, Relational Databases, RDBMS, Database Administration (DBA), Databases, Query Optimization, Scripting Languages, User-defined Functions (UDF), Data Architecture, Email, Database Optimization, Cloud Platforms, Microsoft Azure, Performance Tuning, Data Management, Leadership, Azure Virtual Machines, Cloud, Business Requirements, Unit Testing
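Obfuscation of the kind described above typically needs to be deterministic (so the same input always maps to the same fake value and joins keep working) and format-preserving (so downstream validation still passes). A small sketch under those assumptions; the salt and alphabet are illustrative:

```python
# Sketch of deterministic, format-preserving obfuscation: a salted hash
# drives a position-by-position replacement that keeps digits as digits,
# letters as letters, and punctuation untouched. One-way, not encryption.
import hashlib
import string

ALPHABET = string.ascii_uppercase

def obfuscate(value: str, salt: str = "demo-salt") -> str:
    """Replace letters/digits per position, keeping separators intact."""
    digest = hashlib.sha256((salt + value).encode()).digest()
    out = []
    for i, ch in enumerate(value):
        b = digest[i % len(digest)]
        if ch.isdigit():
            out.append(str(b % 10))
        elif ch.isalpha():
            out.append(ALPHABET[b % 26])
        else:
            out.append(ch)  # keep '-' etc. so formats stay valid
    return "".join(out)
```

Because the mapping is salted, the same salt must be reused across tables for foreign keys to stay consistent.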

Senior SQL and Data Warehouse Developer

2019 - 2019
Zealic Solutions
  • Used SQL Server to perform backups, restores, and checks, interpret audit tables, create users, and grant access rights. Used PostgreSQL on RDS and a self-hosted instance to enable and configure logical decoding for audit usage.
  • Investigated using PySpark, Parquet, SQLite, Amazon Redshift, and Amazon S3. Generated Parquet files using PySpark from different database types, checked results using SQLite locally, uploaded them into Amazon, and made queries.
  • Supported MySQL and basic Perl scripting. Uploaded CSV files into a remote MySQL API using Perl, with data validation.
  • Integrated data and prepared back end to store SendGrid API data in PostgreSQL.
  • Contributed to data quality and cleaning, deduplicating, comparing records, and more.
  • Migrated custom scripts from SQL Server to PostgreSQL, used mysqldump to move data from legacy MySQL to production MySQL, and used pgloader to move data from legacy MySQL to staging PostgreSQL.
  • Administered Linux infrastructure, obtained HTTPS certificate from Let's Encrypt, configured SSH tunnels to access databases, and configured virtual hosting for API and secure file transfer protocol (SFTP).
  • Performed Bash scripting: automated uploading CSV data into a database via SFTP, published report extracts to a password-protected web directory, and validated the date of the last fetch.
Technologies: SQL Server DBA, PostgreSQL, Amazon Web Services (AWS), SSL Certificates, Bash Script, Reports, SQL DML, Data Integration, Cloud Migration, Quality Assurance (QA), Amazon RDS, Data Migration, Relational Databases, Database Backups, RDBMS, Ubuntu, PL/pgSQL, Database Administration (DBA), Data, Reporting, Databases, Query Optimization, Scripting Languages, User-defined Functions (UDF), Data Modeling, Data Architecture, Report Development, Email, HTML, Database Optimization, Cloud Platforms, Dashboards, ETL Pipelines, MySQL, Performance Tuning, Backup & Recovery, Data Extraction, Data Management, Cloud, ETL, REST, HTTP REST, Business Requirements, Data Transformation
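The CSV upload pipelines above hinge on validating rows before loading so a bad record does not fail the job mid-load. A minimal sketch of that validation step; the required column names and date format are assumptions for the example:

```python
# Sketch of pre-load CSV validation: split rows into loadable ones and
# rejected ones, with the CSV line number attached to each error.
import csv
import io
from datetime import datetime

REQUIRED = ("id", "email", "created_at")  # illustrative columns

def validate_csv(text: str):
    """Return (good_rows, errors); line numbers count the header as line 1."""
    good, errors = [], []
    for lineno, row in enumerate(csv.DictReader(io.StringIO(text)), start=2):
        missing = [c for c in REQUIRED if not (row.get(c) or "").strip()]
        if missing:
            errors.append((lineno, f"missing {missing}"))
            continue
        try:
            datetime.strptime(row["created_at"], "%Y-%m-%d")
        except ValueError:
            errors.append((lineno, "bad created_at"))
            continue
        good.append(row)
    return good, errors
```

Loading only `good` and reporting `errors` back keeps the SFTP-to-database pipeline idempotent and debuggable.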

SQL Developer | Data Integrator

2016 - 2019
NDA
  • Integrated data from various sources, including REST APIs and SOAP, into data warehouse storage of different types.
  • Created SQL and REST connectors for ETL and reverse ETL data pipelines.
  • Analyzed REST and SOAP APIs to estimate integration effort, including Adform, Affilinet, Google Sheets, Pingdom, Intercom, Xero, Typeform, NewVoiceMedia, Searchmetrics, Facebook, and Splunk.
  • Requested credentials and refresh tokens, entered data through the API and UI to verify final results when no sandbox was present, monitored deprecated controls, and upgraded connectors when needed.
  • Created and updated documentation to cover SQL and REST connectors' functionality.
Technologies: Crucible, Jira, VPS/VDS, Windows, Unix, Apache Maven, JSON, XML, SQL, HTTP, Web Services, MemSQL, Redshift, Oracle, Microsoft SQL Server, MySQL, PostgreSQL, APIs, T-SQL (Transact-SQL), SQL DML, Swagger, Postman, JSON API, CSV File Processing, CI/CD Pipelines, Relational Databases, RDBMS, Ubuntu, Git, Google Sheets API, Data, Databases, ETL Tools, Scripting Languages, User-defined Functions (UDF), Email, Database Optimization, Cloud Platforms, ETL Pipelines, Data Extraction, Data Management, Cloud, ETL, REST, HTTP REST, Business Requirements, Data Transformation, Unit Testing
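Most of the REST connectors listed above share one core loop: keep requesting pages until the API stops returning a cursor. A sketch of that loop with the HTTP call injected as a callable, so the logic is testable offline; a real connector would pass in an authenticated HTTP client instead:

```python
# Sketch of cursor-based pagination for a REST connector. fetch_page is
# any callable with the (assumed) contract:
#     fetch_page(cursor) -> (items, next_cursor_or_None)

def fetch_all(fetch_page, limit=100):
    """Collect items across pages until the cursor runs out or limit is hit."""
    items, cursor = [], None
    while True:
        page, cursor = fetch_page(cursor)
        items.extend(page)
        if cursor is None or len(items) >= limit:
            return items[:limit]
```

Keeping pagination separate from HTTP made it possible to reuse the same loop across very different vendor APIs by swapping only the `fetch_page` adapter.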

Oracle PL/SQL Developer

2014 - 2016
PLS Logistics
  • Built and maintained Oracle database objects required by the business.
  • Provided L3 support to business users to make sure their issues were covered.
  • Created Oracle database objects documentation to ensure all features were described.
  • Maintained the data warehouse to ensure all data from transactional instances appeared in the reporting layer.
  • Created new and supported existing reports so that users received up-to-date information in a convenient form.
  • Ensured data quality and performed validation of related pipelines.
  • Created Unix shell scripting for various processes.
Technologies: Java, Unix Shell Scripting, Microsoft Excel, Data Warehousing, Data Warehouse Design, Documentation, DB, PL/SQL, SQL, Oracle, Reports, Operations, SQL DML, SQL DDL, Scaled Agile Framework (SAFe), Scrum, Relational Databases, RDBMS, Oracle PL/SQL, PL/SQL Developer, Ubuntu, Git, Database Administration (DBA), Reporting, Databases, Query Optimization, Scripting Languages, User-defined Functions (UDF), Report Development, Email, Database Optimization, Dashboards, ETL Pipelines, Performance Tuning, Data Extraction, CPG data, Data Management, Data Transformation

Database Software Engineer

2011 - 2014
Volksbank
  • Implemented a corporate reporting system based on Oracle Business Intelligence 11g and the Basel II and ARZ methodologies.
  • Created new and refined existing Oracle 11g packages, procedures, views, tables, indexes, triggers, jobs, and functions, and performed performance tuning.
  • Implemented, improved, and supported data warehouse according to business needs.
  • Provided data aggregation in Oracle OLAP according to requirements.
  • Generated various reports using SAP BusinessObjects, Oracle Business Intelligence, CS B2, FastReport .NET, and Excel VBA.
  • Created simple applications with C# to select data from Oracle and save it in Excel format based on customers' needs.
  • Supported the front-end system, including printing forms and table settings.
  • Developed and supported a fraud prevention alarm system for internet banking, including email and SMS notifications.
  • Implemented a client risk evaluation system based on information about clients' operations.
  • Gathered data from 3rd-party text files using Excel VBA, built a consolidated summary, and created related graphs to visualize the result.
Technologies: C#, Visual Basic for Applications (VBA), Microsoft Excel, FastReport, SAP BusinessObjects Data Service (BODS), Oracle Business Intelligence Enterprise Edition 11g (OBIEE), T-SQL (Transact-SQL), Microsoft SQL Server, PL/SQL, SQL, Oracle, Finance, Oracle Database, Relational Databases, RDBMS, Oracle PL/SQL, PL/SQL Developer, Reporting, Databases, Query Optimization, Scripting Languages, User-defined Functions (UDF), Data Modeling, Data Architecture, Visual Basic, Report Development, Email, SAP, Database Optimization, Dashboards, Microsoft Word, Performance Tuning, Data Extraction, Data Management, Business Requirements, Key Performance Indicators (KPIs)
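Anti-fraud and risk evaluation systems of the kind built here commonly aggregate weighted rules over each transaction and alert when a threshold is crossed. A toy illustration of that pattern; the rules, weights, and threshold below are invented for the example, not the bank's actual logic:

```python
# Toy rule-based transaction risk scoring: each fired rule adds its
# weight, and crossing the threshold raises an alert. All values here
# are hypothetical.
RULES = [
    ("large_amount", lambda tx: tx["amount"] > 10_000, 40),
    ("night_time", lambda tx: tx["hour"] < 6, 20),
    ("new_payee", lambda tx: tx["payee_known"] is False, 30),
]
ALERT_THRESHOLD = 60

def score_transaction(tx):
    """Evaluate every rule once; return score, fired rules, alert flag."""
    fired = [(name, weight) for name, check, weight in RULES if check(tx)]
    score = sum(weight for _, weight in fired)
    return {"score": score, "rules": [n for n, _ in fired], "alert": score >= ALERT_THRESHOLD}
```

Keeping rules as data rather than code lets the security department tune weights without a redeploy, which is why such systems are usually table-driven in the database.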

Database and Reporting Specialist

2007 - 2011
Ukreximbank
  • Wrote SQL requests for analysis to meet business requirements.
  • Tackled reporting requirements by fetching data from an Oracle database into Excel and Access and creating multiple separate Excel sheets based on various categories. Demonstrated a deep understanding of both Excel and database architecture.
  • Prepared technical tasks for creating new and upgrading existing banking software based on Oracle.
  • Created management accounting parameters to meet business requirements.
Technologies: Banking & Finance, Visual Basic for Applications (VBA), Microsoft Access, Microsoft Excel, Oracle, Data Analysis, Database Analytics, Finance, Relational Databases, RDBMS, Oracle PL/SQL, PL/SQL Developer, Reporting, Databases, Scripting Languages, Visual Basic, Report Development, Email, Microsoft Word, Key Performance Indicators (KPIs)

Lead Analyst

2006 - 2007
Public Joint Stock Company Bank Finance and Credit
  • Prepared technical tasks for creating new and upgrading existing banking software based on Oracle.
  • Developed scoring of financial indicators for Central Bank.
  • Controlled reserves and the foreign exchange position.
  • Provided calculation of capitalization needs, including forecasting.
Technologies: Banking & Finance, Visual Basic for Applications (VBA), Microsoft Access, Microsoft Excel, Oracle, Finance, Analytics, Oracle PL/SQL, Reporting, Databases, Report Development, Email, Microsoft Word, Leadership, Key Performance Indicators (KPIs)

Analyst

2004 - 2006
Pravex bank
  • Prepared technical tasks for creating new and refining existing banking software based on Oracle.
  • Set up scoring of financial indicators, including Basel II.
  • Oversaw control of reserves and the foreign exchange position.
  • Modeled business processes according to management requirements.
Technologies: Software, Microsoft Excel, Oracle, Finance, Analytics, Report Development, Email, Microsoft Word, Leadership, Key Performance Indicators (KPIs)

Manager | Local Network Administrator

2002 - 2004
Duox
  • Managed local computer network operations (switch, Ethernet wiring, network cards, uplink to the ISP).
  • Installed and supported a Unix server (proxy, POP3, SMTP, shared drive via Samba).
  • Configured and supported scanner, printer, and fax.
  • Provided oral and written translation between English, Polish, basic Slovak, and basic Czech.
Technologies: Squid, Samba, POP3, SMTP, TCP/IP, Windows, FreeBSD, Linux, Email, HTML, Microsoft Word

Personal VPS Distributed Network

Personal self-development project (occasionally)

Duration: Aug 2015 - present

Description: Running a personal distributed VPS network (including Raspberry Pi devices) based on Ubuntu: virtual hosting for personal sites and projects (Apache2), TT-RSS (RSS aggregator), analytics (Piwik), messenger (Ejabberd), SOCKS5 proxy via SSH, mail server (Exim4), webmail (SquirrelMail), database server (MySQL), process supervision (Monit), distributed file storage (Syncthing), IPv6 implementation, etc.

Internet Banking Anti-fraud (Enjoy Security)

Freelance project (part-time)

Duration: Jul 2013 - Jul 2014

Company (employer): Enjoy Security

Clients (customers): Various banks (NDA)

Project description: Developed and supported a fraud prevention alarm system for internet banking.

Role: Developer

Responsibilities: Technical task clarification, development using SQL and PL/SQL, writing documentation, and supporting customers.

Technologies and methodologies used by the employee in the project: Oracle SQL and PL/SQL.

Company's Unix-like Server

Duration: Jan 2003 - Feb 2004

Company (employer): Duox

Client (customer): Company employees

Project description: Set up Unix server for company purposes based on FreeBSD, including proxy/squid, email POP3 and SMTP, and Samba/shared drive.

Role: Server administrator

Responsibilities: Setting up, monitoring, administrating rights

Technologies and methodologies used by the employee in the project: Unix shell scripting, FreeBSD

Prepared Technical Tasks for Creating New and Refining Existing Banking Software Based on Oracle (Finance and Credit Bank)

Duration: Sep 2006 - Feb 2007

Company (employer): Finance and Credit Bank

Client (customer): Economic norms department, bank management

Project description: Prepared technical tasks for creating new and refining existing bank software based on Oracle (e.g. foreign currency position controlling, credit norms planning, etc.)

Role: BA/QA

Responsibilities: Modeling of calculations in Excel, preparing technical tasks and negotiation with all involved people, filing settings, quality control

Technologies and methodologies used by the employee in the project: Oracle, math, banking norms.

Prepared Technical Tasks for Creating New and Refining Existing Banking Software Based on Oracle (Ukreximbank)

Duration: Apr 2008 - Feb 2011

Company (employer): Ukreximbank

Client (customer): Management accounting department, bank management

Project description: Prepared technical tasks for creating new and refining existing bank software based on Oracle (e.g. fixing automatic accounting parameters, credits management accounting check, etc.)

Role: BA/QA

Responsibilities: Modeling of calculations in Excel, preparing technical tasks and negotiation with all involved people, filing settings, quality control

Technologies and methodologies used by the employee in the project: Oracle, math, banking norms.

Client Risk Evaluation System Based on Information About Client's Operations (Volksbank)

Duration: Sep 2011 - May 2014

Company (employer): Volksbank

Client (customer): Risks department, bank management

Project description: Created and supported client risk evaluation system based on information about the client's operations.

Role: Developer
Responsibilities: Discussed technical tasks, developed using PL/SQL, displayed results as SAP BusinessObjects and CS/B2 reports and settings, and supported customers

Technologies and methodologies used by the employee in the project: Oracle SQL and PL/SQL, SAP Business Objects universe and reports, CS/B2 (proprietary banking software)

Implementation of BI/Reporting System Based on OBIEE (Oracle BI) (Volksbank)

Duration: Oct 2011 - June 2014

Company (employer): Volksbank

Client (customer): Various departments, bank management

Project description: Implementation of BI/reporting system based on OBIEE (Oracle BI)

Role: Developer

Responsibilities: Analyzed existing reporting systems, discussed technical tasks, set up the system from scratch, developed using PL/SQL, migrated existing reports from other systems, and supported customers

Technologies and methodologies used by the employee in the project: Oracle SQL and PL/SQL, OBIEE

Reporting System Maintenance (PLS Logistics Services)

Duration: Nov 2014 - Aug 2016

Company (employer): PLS Logistics Services

Client (customer): Various departments, company management

Project description: Reporting system maintenance

Role: Developer

Responsibilities: Created new and supported existing reports, analyzed and debugged, ensured data quality and validation, and worked with the DWH

Technologies and methodologies used by the employee in the project: Oracle SQL and PL/SQL, Cognos, Excel, Unix shell scripting

Data Integration

Duration: Sep 2016 - Mar 2019

Company (employer): Product company (NDA)

Client (customer): Various enterprise companies

Project description: Integrate data from various sources, including APIs (REST, SOAP) into DWH storage of different types

Role: Developer

Responsibilities: Analyzed APIs (REST/SOAP) to estimate effort; requested credentials, including refresh tokens; entered data via the API/UI to verify final results when no sandbox was present; monitored deprecations and upgraded connectors when needed; created and updated documentation; supported customers occasionally (main tasks were development); and deployed and tested, including in the UI.

Technologies and methodologies used by the employee in the project: WebServices, Postgres, MySQL, Oracle, SQL Server, JSON, XML

Internet Banking Anti-fraud (Volksbank)

Duration: Aug 2012 - Aug 2013

Company (employer): Volksbank

Client (customer): Security Department of Volksbank

Project description: Developed and supported a fraud prevention alarm system for internet banking, including email and SMS notifications.

Role: Developer

Responsibilities: Discussed technical tasks and clarified details with customers, developed using PL/SQL, displayed results as Oracle BI analyses, sent notifications via email and SMS, and supported customers.

Technologies and methodologies used by the employee in the project: Oracle PL/SQL, Oracle BI

Prepared Technical Tasks for Creating New and Refining Existing Banking Software Based on Oracle (Pravex Bank)

Duration: Dec 2004 - Jan 2006

Company (employer): Pravex Bank

Client (customer): Economic norms department, bank management

Project description: Prepared technical tasks for creating new and refining existing bank software based on Oracle (e.g. foreign currency position controlling, bank reserves monitoring and planning, etc.).

Role: BA/QA

Responsibilities: Modeling of calculations in Excel, preparing technical tasks and negotiation with all involved people, filing settings, quality control

Technologies and methodologies used by the employee in the project: Oracle, math, banking norms.
Education

1998 - 2003

Master's Degree in Finance

National University Lviv Polytechnic - Lviv, Ukraine

Certifications

MAY 2010 - PRESENT

ANSI SQL Programmer

Computer Training Center at Bauman MSTU

Libraries/APIs

JSON API, Xero API, PySpark, REST APIs, Pandas, Intercom API, Facebook API, Google Sheets API, Typeform.io, SQLAlchemy, PyQt 5

Tools

Postman, Pytest, Microsoft Power BI, Spark SQL, Docker Compose, Google Sheets, Squid, Microsoft Excel, Microsoft Access, FastReport, Crucible, Periscope, Oracle Business Intelligence Enterprise Edition 11g (OBIEE), Subversion (SVN), Git, Jira, Apache Maven, Splunk, Sisense, GitHub, VMware, VirtualBox, Microsoft Word

Languages

Python 3, Python, SQL, Stored Procedure, SQL DML, SQL DDL, Snowflake, T-SQL (Transact-SQL), Visual Basic, HTML, Visual Basic for Applications (VBA), C#, XML, Bash Script, Regex, PL/pgSQL, Excel VBA, Java, Scala, Bash, JavaScript

Paradigms

ETL, Database Design, REST, Unit Testing, Business Intelligence (BI), Samba, Scrum

Platforms

Databricks, Amazon Web Services (AWS), Oracle Database, HubSpot, Windows, Linux, FreeBSD, Oracle, Unix, Amazon EC2, Talend Stitch, Jupyter Notebook, Ubuntu, Azure, SAP HANA, Docker, Google Cloud Platform (GCP), MacOS

Storage

Oracle PL/SQL, PL/SQL Developer, MySQL, PL/SQL, SQL Stored Procedures, Database Administration (DBA), Databases, Database Testing, Data Lakes, Data Integration, Data Pipelines, Relational Databases, RDBMS, Apache Hive, Amazon S3 (AWS S3), Database Migration, DB, Microsoft SQL Server, Redshift, MemSQL, JSON, Google Cloud, SQL Server DBA, SQL Server Integration Services (SSIS), Database Architecture, PostgreSQL, Database Backups, SQLite, Data Validation

Frameworks

Flask, Windows PowerShell, Spark, Swagger, Scaled Agile Framework (SAFe), Jinja

Industry Expertise

Banking & Finance

Other

API Integration, Data Warehousing, APIs, Data Engineering, Data, Data Wrangling, Reporting, CRM APIs, Data Transformation, Data Architecture, Shell Scripting, Data Migration, Amazon RDS, Data Modeling, ELT, CSV File Processing, Cloud Migration, Data Queries, Finance, Scripting Languages, User-defined Functions (UDF), Report Development, Email, SAP, Database Optimization, Cloud Platforms, Dashboards, ETL Pipelines, Backup & Recovery, Data Extraction, CPG data, Data Management, Leadership, Azure Virtual Machines, Azure Databricks, Cloud, HTTP REST, Real Estate, BI Reports, Reports, Operations, Site Reliability Engineering (SRE), Data Visualization, Data-driven Dashboards, Machine Learning Operations (MLOps), Data Analysis, Database Analytics, CI/CD Pipelines, Analytics, Query Optimization, ETL Tools, IMAP, HubSpot CRM, RESTful Microservices, SSH Tunneling, Microsoft Azure, Performance Tuning, Business Requirements, Key Performance Indicators (KPIs), TCP/IP, SMTP, POP3, Software, Documentation, SAP BusinessObjects Data Service (BODS), Web Services, HTTP, VPS/VDS, Data Warehouse Design, SSL Certificates, tSQLt, Adform, NewVoiceMedia API, Unix Shell Scripting, SAP BusinessObjects (BO), Web Development, Data Build Tool (dbt), Quality Assurance (QA), GitHub Actions, Architecture, FastAPI
