Arthur Flores Duarte, Developer in Florianopolis - SC, Brazil

Arthur Flores Duarte

Software Developer

Florianopolis - SC, Brazil
Toptal Member Since
June 18, 2020

Arthur has over 15 years of experience in software development and systems analysis and seven years working with SQL and data analysis. He's experienced with Oracle, BigQuery, Snowflake, and other data sources. He also has a solid background in dbt, Python, data engineering, and business intelligence. In addition to maintaining excellent relationships with work teams and customers, Arthur is committed, adaptable, and creative.

Preferred Environment

SQL, Amazon Web Services (AWS), Snowflake, Python, Data Build Tool (dbt), Apache Airflow

The most amazing...

...project I've designed and built was a data architecture to help a startup make data-driven decisions using Fivetran, Python, dbt, Redshift, and Tableau.

Work Experience

2022 - PRESENT

Snowflake Data Engineer

Appex Group, Inc.
  • Designed and built a new data architecture for the company in collaboration with the new data team, helping automate data ingestion, modeling, and visualization.
  • Implemented dbt (data build tool) for data modeling and trained data analysts on how to use it properly.
  • Set up Fivetran connectors from several data sources to a Snowflake data warehouse.
  • Implemented Airflow DAGs (MWAA) to run and test dbt models and for custom data ingestions using API calls.
  • Built controls for data quality checks in the extracted sources using dbt tests and Airflow.
Technologies: Snowflake, Data Warehousing, Data Warehouse Design, Data Engineering, Data Build Tool (dbt), Apache Airflow, GitHub, Fivetran, Looker, Data Architecture, Data Modeling, Amazon Web Services (AWS), SQL, Python
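Data-quality checks like the dbt tests mentioned above are typically declared in a model's `schema.yml`. A minimal hypothetical example (model and column names are illustrative, not from the actual project):

```yaml
version: 2

models:
  - name: stg_orders        # hypothetical staging model
    columns:
      - name: order_id
        tests:
          - unique
          - not_null
      - name: status
        tests:
          - accepted_values:
              values: ['placed', 'shipped', 'returned']
```

Running `dbt test` (for example, from an Airflow task) then validates these constraints against the warehouse.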
2022 - 2022

Data Engineer

SimplyWise (via Toptal)
  • Designed and built a data architecture to help the company make data-driven decisions.
  • Integrated multiple data sources such as Apple Search, Google Ads, MySQL, and Amplitude into a Redshift data warehouse.
  • Configured data integration services like Fivetran and Stitch to collect different data sources.
  • Developed a Python data pipeline to read and parse JSON files, upload them to Amazon S3, and load them into Redshift tables.
  • Installed and configured dbt (data build tool) to perform data transformations through ELT data modeling.
  • Built dashboards and reports using Tableau Online.
Technologies: Amplitude, Redshift, Python, Data Build Tool (dbt), Tableau, Amazon S3 (AWS S3), Amazon EC2, Fivetran, Stitch Data, SQL, ELT, ETL, Data Pipelines, Data Engineering, Data Architecture
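A minimal sketch of the JSON-parsing step of such a pipeline, using only the standard library. The field names are invented for illustration; in production, the resulting CSV would be uploaded to S3 (e.g., with boto3) and loaded into Redshift via a COPY command.

```python
import csv
import io
import json


def json_records_to_csv(raw_lines):
    """Parse newline-delimited JSON events into a CSV string that a
    Redshift COPY command could load. Field names are hypothetical."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["event_id", "user_id", "event_type", "ts"])
    for line in raw_lines:
        rec = json.loads(line)
        writer.writerow([
            rec.get("event_id"),
            rec.get("user_id"),
            rec.get("event_type"),
            rec.get("ts"),
        ])
    return buf.getvalue()
```

The CSV intermediate keeps the Redshift load simple: `COPY events FROM 's3://bucket/key' CSV IGNOREHEADER 1 ...`.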
2020 - 2022

Analytics Engineer

205 Data Lab
  • Worked 100% remotely for a US-based company, providing analytics engineering services to San Francisco Bay Area data customers.
  • Created automated custom data reports using Python and Excel VBA scripts.
  • Extracted data for reports from Presto DB and Snowflake using complex SQL scripts.
  • Transformed and modeled raw data through ELT processes using dbt (data build tool).
  • Developed Python scripts for Prefect Cloud to automate and orchestrate data reports.
  • Integrated data between Snowflake and Salesforce to generate reports using Bulk API.
Technologies: Excel VBA, Data Build Tool (dbt), Prefect, Python, Snowflake, Salesforce, SQL, Data Modeling, BI Reports
2020 - 2020

Data Engineer

Projeto 22
  • Worked as a part-time data engineer, supporting the data squad on a new data lake project for WebMotors, a vehicle sales company.
  • Supported the data engineering team by provisioning AWS resources: EC2, S3, VPC, RDS (Aurora PostgreSQL), and Redshift.
  • Developed CloudFormation templates to automate AWS resource provisioning.
  • Deployed an Apache NiFi cluster for data ingestion processing using ZooKeeper and NiFi Registry.
  • Integrated AWS DMS (Database Migration Service), mapping tables from SQL Server and Aurora MySQL to S3 buckets as Parquet files.
  • Employed an Apache Airflow and EMR cluster to support complex data processing.
  • Built CloudWatch alarms for servers and applications monitoring.
  • Used Lambda and Boto3 to automatically stop/start EC2 and RDS instances according to schedule, reducing costs.
  • Developed Python scripts for several purposes, such as database stress tests.
Technologies: Amazon Web Services (AWS), Apache Airflow, Apache NiFi, Amazon EMR, AWS Database Migration Service (DMS), PostgreSQL, Relational Database Services (RDS), Amazon Virtual Private Cloud (VPC), Amazon S3 (AWS S3), Amazon EC2
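The scheduled stop/start logic for EC2 instances can be sketched roughly as below. The office hours, event shape, and instance IDs are assumptions for illustration, not the actual configuration; the pure scheduling helper is kept separate from the boto3 calls so it can be tested without AWS credentials.

```python
from datetime import time

# Assumed office hours during which dev instances should be running.
START_AT = time(8, 0)
STOP_AT = time(20, 0)


def desired_state(now):
    """Return 'running' or 'stopped' for a given datetime.time."""
    return "running" if START_AT <= now < STOP_AT else "stopped"


def lambda_handler(event, context):
    # boto3 is available in the Lambda runtime; imported here so the
    # scheduling logic above stays importable without boto3 installed.
    import boto3
    from datetime import datetime

    instance_ids = event.get("instance_ids", [])
    if not instance_ids:
        return
    ec2 = boto3.client("ec2")
    if desired_state(datetime.utcnow().time()) == "running":
        ec2.start_instances(InstanceIds=instance_ids)
    else:
        ec2.stop_instances(InstanceIds=instance_ids)
```

An EventBridge (CloudWatch Events) schedule rule would invoke this handler periodically with the target instance IDs in the event payload.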
2019 - 2020

Data Analyst

  • Analyzed data using complex SQL queries on Google BigQuery.
  • Presented data analytics reports for managers using dashboards from Mode Analytics.
  • Defined business metrics to support and determine company OKRs.
  • Provided monitoring and decision support reports for different areas such as growth, product, and operations.
  • Worked on geospatial analysis for scooters and generated map charts using Python and Jupyter Notebook.
  • Supported A/B tests to analyze the adoption of product features.
Technologies: Dashboards, Amplitude, Jupyter Notebook, Python, Mode Analytics, BigQuery, SQL
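A geospatial check like the scooter-parking analysis can be approximated in pure Python with the haversine formula. The coordinates, 50-meter radius, and function names here are illustrative assumptions, not the original analysis code (which used BigQuery's geospatial functions):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))


def started_at_parking(trip_start, parking_spots, radius_m=50):
    """True if a trip's start point is within radius_m of any parking spot."""
    lat, lon = trip_start
    return any(haversine_m(lat, lon, p_lat, p_lon) <= radius_m
               for p_lat, p_lon in parking_spots)
```

In BigQuery, the equivalent proximity test would use `ST_DWITHIN(ST_GEOGPOINT(lng, lat), ..., 50)` directly in SQL.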
2018 - 2019

Solution Integration Architect

Optiva Inc.
  • Worked 100% remotely as part of a global professional services team (English- and Spanish-speaking teams).
  • Integrated and configured Microsoft Dynamics CRM (DCRM) and portals for telecom customers in different countries.
  • Wrote system integration tests, user acceptance tests, test scenarios, configuration handbooks, user training, and production rollout documents.
  • Analyzed data from different CRM environments, exporting and comparing data into Excel using formulas such as VLOOKUP to troubleshoot missing configuration parameters.
Technologies: BSS, Portals, ASPX, Microsoft Dynamics CRM, Microsoft SQL Server
2011 - 2018

Data Analyst | Technical Leader

Wedo Technologies
  • Worked with large volumes of data, applying data analytics techniques to find revenue leakages and trends between telecom systems.
  • Integrated several telecom systems into the RAID ETL tool, reading different data sources (Oracle, SQL Server, Excel, CSV, ASN.1) containing telecom events and customer data.
  • Performed Oracle database and SQL performance tuning, wrote PL/SQL procedures, and developed Python and shell scripts.
  • Provided system analysis and solution design documentation, scope and architecture definition, and technical proposals for sales.
  • Designed user reports, dashboards, and KPIs to support data-driven decisions and identify revenue leakages.
  • Oversaw unit tests, integration tests, UAT, and production rollout. Completed technical and functional training for customers and teams.
  • Provided technical leadership and project management in different projects.
  • Worked on various projects for telecommunications customers in different countries, such as Brazil, Chile, and Peru.
Technologies: PL/SQL, Business Intelligence (BI), ETL, Python, Shell, Unix, RAID, Microsoft SQL Server, Oracle
2009 - 2011


  • Contributed to Delphi programming (MVC, object-oriented) for video surveillance software.
  • Integrated different kinds of devices such as IP cameras and video encoders.
  • Led network protocol integration using CGI, SOAP, HTTP, TCP, and RTSP.
  • Reverse-engineered protocols using Wireshark. Decoded video and audio using VLC libraries.
Technologies: CGI, SOAP, Network Protocols, TCP/IP, Delphi
2004 - 2007

Trainee | System Analyst

Alliance Consultoria
  • Developed software using the Uniface language from Compuware.
  • Integrated databases including Oracle, SQL Server, and DB2.
  • Contributed to development using Agile methodologies.
  • Participated in level two CMMI project implementation.
Technologies: IBM Db2, Microsoft SQL Server, Oracle, Uniface


Data Analyst for a Toptal Client

Designed a geospatial dashboard (map) for a vehicle-sharing company, displaying user adoption and behavior changes related to designated parking places.

The data was extracted using SQL and Google BigQuery's geospatial functions. The dashboard was built using Mode, Python, and Jupyter Notebook.

Data Engineer for a Fintech Company

Designed and built a data architecture to help the company make data-driven decisions. I integrated multiple data sources such as Apple Search, Google Ads, MySQL, and Amplitude into an AWS Redshift data warehouse.

Configured data integration services like Fivetran and Stitch to collect data from different sources and developed a Python pipeline to read and parse JSON files, upload them to AWS S3, and load them into Redshift tables. I installed and configured dbt (data build tool) to perform data transformation (ELT data modeling) and designed the dashboards and reports using Tableau Online.

New Data Architecture

I supported the company in building a new data architecture in two stages, automating data ingestion, consolidating all data into one data warehouse, and allowing the data analysis team to focus on the business intelligence side.

In the first stage, we made the data available for data analysts as soon as possible using Fivetran and storing it in Snowflake.

In the second stage, we built custom connectors using Airflow and implemented more organized data modeling with dbt and SQL.



Languages

Snowflake, SQL, Python, Delphi, Excel VBA


Paradigms

Business Intelligence (BI), Dimensional Modeling, ETL, Database Design


Storage

Oracle SQL, PL/SQL, Databases, Relational Databases, ANSI SQL, Oracle RDBMS, Amazon S3 (AWS S3), Microsoft SQL Server, Oracle PL/SQL, Redshift, JSON, Data Pipelines, SQL Server 2012, MySQL, PostgreSQL


Other

Data Build Tool (dbt), Dashboards, Data Visualization, Data Analysis, Analytics, Visualization, Data Modeling, Fivetran, Data Engineering, Telecom Business Support Systems (BSS), Revenue Assurance, Key Performance Indicators (KPIs), BI Reports, Data Analytics, Google BigQuery, Mode Analytics, ELT, Data Architecture, Business Intelligence (BI) Platforms, ETL Tools, Data Warehousing, Data Warehouse Design, Performance Tuning, TCP/IP, Network Protocols, Amplitude, Relational Database Services (RDS), Prefect, Shell Scripting, Dynamics CRM 365, IT Project Management, Software Engineering, Query Optimization

Libraries/APIs

Pandas, NumPy


Tools

Apache Airflow, BigQuery, Microsoft Excel, GitHub, Microsoft Dynamics CRM, Shell, Amazon Virtual Private Cloud (VPC), Microsoft Power BI, Jupyter, Tableau, Stitch Data, Looker


Platforms

Oracle, Amazon Web Services (AWS), Amazon EC2, Salesforce, Unix, Windows, Jupyter Notebook


Education

2011 - 2012

Master of Business Administration (MBA) in Project Management

Fundação Getulio Vargas (FGV) - Florianopolis, SC, Brazil

2002 - 2007

Bachelor's Degree in Computer Engineering

Universidade Metodista de São Paulo - Sao Bernardo do Campo, SP, Brazil