Arthur Flores Duarte, Developer in Florianópolis - State of Santa Catarina, Brazil

Arthur Flores Duarte

Verified Expert in Engineering

Software Developer

Location
Florianópolis - State of Santa Catarina, Brazil
Toptal Member Since
June 18, 2020

Arthur has almost 20 years of experience in IT companies, focusing on data analytics for the last 10 years, with a solid background in data engineering, data analysis, and BI. Experienced with the main data warehouses on the market, such as Snowflake, BigQuery, and Redshift, he has executed projects using dbt with SQL, Python, Airflow, and Fivetran. In addition to fostering excellent relationships with teams and customers, Arthur is committed, adaptable, and creative.

Portfolio

Parade
Redshift, Looker, Data Build Tool (dbt), Airtable, Shopify, Fivetran...
Appex Group, Inc.
Snowflake, Data Warehousing, Data Warehouse Design, Data Engineering...
SimplyWise
Amplitude, Redshift, Python, Data Build Tool (dbt), Tableau, Amazon S3 (AWS S3)...

Experience

Availability

Part-time

Preferred Environment

SQL, Amazon Web Services (AWS), Snowflake, Python, Data Build Tool (dbt), Apache Airflow

The most amazing...

...thing I've designed and built: a data architecture that helps a company make fast, data-driven decisions using Snowflake, dbt, Fivetran, Airflow, Python, and Looker.

Work Experience

Data Analytics Engineer

2023 - PRESENT
Parade
  • Developed and maintained Looker dashboards, models, and schedules.
  • Built and maintained dbt data models using dbt Cloud.
  • Built Looker automation, integrating data with Airtable and Slack.
  • Troubleshot Stitch and Fivetran data ingestions from different sources: Facebook Ads, Google Ads, TikTok Ads, Shopify, Klaviyo, Airtable, PostgreSQL, and Google Analytics.
  • Answered data ticket requests from different areas such as revenue, marketing, growth, product inventory, and influencers.
  • Managed and monitored Amazon Redshift cluster data warehouse.
Technologies: Redshift, Looker, Data Build Tool (dbt), Airtable, Shopify, Fivetran, Stitch Data, dbt Cloud, Klaviyo, Amplitude

Snowflake Data Engineer

2022 - 2023
Appex Group, Inc.
  • Designed and built a new data architecture for the company in collaboration with the new data team, helping automate data ingestion, modeling, and visualization.
  • Implemented dbt (data build tool) for data modeling and trained data analysts on how to use it properly.
  • Set up Fivetran connectors from several data sources to a Snowflake data warehouse.
  • Implemented Airflow DAGs (MWAA) to run and test dbt models and for custom data ingestions using API calls.
  • Built controls for data quality checks in the extracted sources using dbt tests and Airflow.
Technologies: Snowflake, Data Warehousing, Data Warehouse Design, Data Engineering, Data Build Tool (dbt), Apache Airflow, GitHub, Fivetran, Looker, Data Architecture, Data Modeling, Amazon Web Services (AWS), SQL, Python
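The dbt-based data quality checks described above are typically declared as schema tests. A minimal sketch of what such a configuration might look like; the model, column, and source names here are hypothetical, not the actual project's:

```yaml
# models/staging/schema.yml -- hypothetical model and column names
version: 2

models:
  - name: stg_orders
    description: "Orders landed by Fivetran, one row per order."
    columns:
      - name: order_id
        tests:
          - unique
          - not_null
      - name: status
        tests:
          - accepted_values:
              values: ['placed', 'shipped', 'returned']
```

Running `dbt test` then fails the pipeline if duplicates, nulls, or unexpected status values appear, which is how such checks can be wired into an Airflow DAG.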

Data Engineer

2022 - 2022
SimplyWise
  • Designed and built a data architecture to help the company make data-driven decisions.
  • Integrated multiple data sources such as Apple Search, Google Ads, MySQL, and Amplitude into a Redshift data warehouse.
  • Configured data integration services like Fivetran and Stitch to collect different data sources.
  • Developed a Python data pipeline to read and parse JSON files, upload them to Amazon S3, and load them into Redshift tables.
  • Installed and configured dbt (data build tool) to perform data transformation through ELT data modeling.
  • Built dashboards and reports using Tableau Online.
Technologies: Amplitude, Redshift, Python, Data Build Tool (dbt), Tableau, Amazon S3 (AWS S3), Amazon EC2, Fivetran, Stitch Data, SQL, ELT, ETL, Data Pipelines, Data Engineering, Data Architecture
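The JSON-to-S3-to-Redshift pipeline described above can be sketched roughly as follows. The bucket, table, and IAM role names are hypothetical, and the generated COPY statement would be executed over a separate Redshift connection (e.g., via psycopg2):

```python
"""Sketch of a JSON -> S3 -> Redshift loading pipeline.
Bucket, table, and IAM role names are hypothetical."""
import gzip
import io
import json


def parse_events(raw_lines):
    """Parse newline-delimited JSON, keeping only well-formed records."""
    records = []
    for line in raw_lines:
        line = line.strip()
        if not line:
            continue
        try:
            records.append(json.loads(line))
        except json.JSONDecodeError:
            continue  # skip malformed rows rather than failing the batch
    return records


def to_jsonl_gz(records):
    """Serialize records as gzipped JSON Lines, a format Redshift COPY accepts."""
    buf = io.BytesIO()
    with gzip.GzipFile(fileobj=buf, mode="wb") as gz:
        for rec in records:
            gz.write((json.dumps(rec) + "\n").encode("utf-8"))
    return buf.getvalue()


def upload_and_copy(records, bucket="my-data-lake", key="events/batch.json.gz"):
    """Upload the batch to S3, then build the Redshift COPY statement."""
    import boto3  # deferred so the parsing helpers work without AWS access

    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=to_jsonl_gz(records))
    copy_sql = (
        f"COPY analytics.events FROM 's3://{bucket}/{key}' "
        "IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-load' "
        "FORMAT AS JSON 'auto' GZIP;"
    )
    return copy_sql  # run this via a Redshift connection
```

Keeping parsing and serialization as pure functions makes the batch logic testable without AWS credentials; only `upload_and_copy` touches S3.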

Analytics Engineer

2020 - 2022
205 Data Lab
  • Worked 100% remotely for a US-based company, providing analytics engineering services to San Francisco Bay Area customers.
  • Created automated custom data reports using Python and Excel VBA scripts.
  • Extracted data for reports from Presto DB and Snowflake using complex SQL scripts.
  • Transformed and modeled raw data through ELT processes using dbt.
  • Developed Python scripts for Prefect Cloud to automate and orchestrate data reports.
  • Integrated data between Snowflake and Salesforce to generate reports using Bulk API.
Technologies: Excel VBA, Data Build Tool (dbt), Prefect, Python, Snowflake, Salesforce, Tray.io, SQL, Data Modeling, BI Reports

Data Engineer

2020 - 2020
Projeto 22
  • Worked as a part-time data engineer, supporting the data squad on a new data lake project for WebMotors, a vehicle sales company.
  • Supported the data engineering team by provisioning AWS resources: EC2, S3, VPC, RDS (Aurora PostgreSQL), and Redshift.
  • Developed CloudFormation templates to automate the provisioning of AWS resources.
  • Deployed an Apache NiFi cluster for data ingestion processing using ZooKeeper and NiFi Registry.
  • Integrated AWS DMS (Database Migration Service), mapping tables from SQL Server and Aurora MySQL to S3 buckets as Parquet files.
  • Employed an Apache Airflow and EMR cluster to support complex data processing.
  • Built CloudWatch alarms for server and application monitoring.
  • Used Lambda and Boto3 to automatically stop/start EC2 and RDS instances according to schedule, reducing costs.
  • Developed Python scripts for several purposes, such as database stress tests.
Technologies: Amazon Web Services (AWS), Apache Airflow, PostgreSQL, Relational Database Services (RDS), Amazon Virtual Private Cloud (VPC), Amazon S3 (AWS S3), Amazon EC2

Data Analyst

2019 - 2020
Spin
  • Analyzed data using complex SQL queries on Google BigQuery.
  • Presented data analytics reports for managers using dashboards from Mode Analytics.
  • Defined business metrics to support and determine company OKRs.
  • Provided monitoring and decision support reports for different areas such as growth, product, and operations.
  • Worked on geospatial analysis for scooters and generated map charts using Python and Jupyter Notebook.
  • Supported A/B tests to analyze the adoption of product features.
Technologies: Dashboards, Amplitude, Jupyter Notebook, Python, Mode Analytics, BigQuery, SQL
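Geospatial analysis like the scooter work above often reduces to point-to-point distances, e.g., how far a scooter ended up from a designated parking place. A minimal haversine sketch; the helper and coordinates are illustrative, not the actual project code:

```python
"""Minimal great-circle distance helpers for point-based geospatial analysis."""
import math


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two WGS84 points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def nearest_parking(scooter, spots):
    """Return the (lat, lon) parking spot closest to a scooter position."""
    return min(spots, key=lambda s: haversine_km(*scooter, *s))
```

In practice BigQuery's geography functions (e.g., `ST_DISTANCE`) do this server-side at scale; a local helper like this is useful for notebook-level exploration and map charts.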

Solution Integration Architect

2018 - 2019
Optiva Inc.
  • Worked 100% remotely as part of a global professional services team (English and Spanish speaking teams).
  • Integrated and configured DCRM and portals for telecom customers in different countries.
  • Wrote system integration tests, user acceptance tests, test scenarios, configuration handbooks, user training, and production rollout documents.
  • Analyzed data from different CRM environments, exporting and comparing data into Excel using formulas such as VLOOKUP to troubleshoot missing configuration parameters.
Technologies: Microsoft Dynamics CRM, Microsoft SQL Server

Data Analyst | Technical Leader

2011 - 2018
Wedo Technologies
  • Worked with large volumes of data, applying data analytics techniques to find revenue leakages and trends across telecom systems.
  • Integrated several telecom systems into the RAID ETL tool, reading different data sources (Oracle, SQL Server, Excel, CSV, ASN.1) containing telecom events and customer data.
  • Performed Oracle database and SQL performance tuning, developed PL/SQL procedures, and wrote Python and shell scripts.
  • Provided system analysis and solution design documentation, scope and architecture definition, and technical proposals for sales.
  • Designed user reports, dashboards, and KPIs to support data-driven decisions and identify revenue leakages.
  • Oversaw unit tests, integration tests, UAT, and production rollout. Completed technical and functional training for customers and teams.
  • Provided technical leadership and project management in different projects.
  • Worked on various projects for telecommunications customers in different countries, such as Brazil, Chile, and Peru.
Technologies: PL/SQL, Business Intelligence (BI), ETL, Python, Shell, Unix, Microsoft SQL Server, Oracle

Developer

2009 - 2011
Seventh
  • Contributed to Delphi programming (MVC, object-oriented) for video surveillance software.
  • Integrated different kinds of devices such as IP cameras and video encoders.
  • Led network protocol integration using CGI, SOAP, HTTP, TCP, and RTSP.
  • Reverse-engineered protocols using Wireshark. Decoded video and audio using VLC libraries.
Technologies: Network Protocols, TCP/IP, Delphi

Trainee | System Analyst

2004 - 2007
Alliance Consultoria
  • Developed software in Uniface (Compuware).
  • Integrated databases including Oracle, SQL Server, and DB2.
  • Contributed to development using Agile methodologies.
  • Participated in a CMMI Level 2 implementation project.
Technologies: Microsoft SQL Server, Oracle

Data Analyst for a Toptal Client

Designed a geospatial dashboard (map) for a vehicle sharing company, displaying user adoption and behavior changes related to designated parking places.

The data was extracted using SQL and Google BigQuery's geospatial functions. The dashboard was built using Mode, Python, and Jupyter Notebook.

Data Engineer for Fin Tech Company

Designed and built a data architecture to help the company make data-driven decisions. I integrated multiple data sources such as Apple Search, Google Ads, MySQL, and Amplitude into an AWS Redshift data warehouse.

Configured data integration services like Fivetran and Stitch to collect different data sources and developed a Python data pipeline to read and parse JSON files, upload them to Amazon S3, and load them into Redshift tables. I installed and configured dbt (data build tool) to perform data transformation (ELT data modeling) and designed the dashboards and reports using Tableau Online.

New Data Architecture

I supported the company in building a new data architecture in two stages, automating data ingestion, consolidating all the data into one data warehouse, and freeing the data analysis team to focus on business intelligence.

In the first stage, we made the data available to data analysts as quickly as possible, using Fivetran to load it into Snowflake.

In the second stage, we built custom connectors using Airflow and implemented more organized data modeling with dbt and SQL.
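A dbt model from the second stage might look like the following sketch; the source, model, and column names are hypothetical, chosen only to illustrate the staging pattern:

```sql
-- models/staging/stg_payments.sql -- hypothetical source and column names
with source as (
    select * from {{ source('fivetran_stripe', 'payments') }}
)

select
    id as payment_id,
    order_id,
    amount / 100.0 as amount_usd,  -- cents to dollars
    created_at
from source
where not _fivetran_deleted
```

Staging models like this rename and type-cast raw Fivetran tables once, so downstream marts can build on clean, consistently named columns.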

Languages

Snowflake, SQL, Python, Delphi, Excel VBA

Paradigms

Business Intelligence (BI), Dimensional Modeling, ETL, Database Design

Storage

Oracle SQL, PL/SQL, Databases, Relational Databases, ANSI SQL, Oracle RDBMS, Amazon S3 (AWS S3), Microsoft SQL Server, Oracle PL/SQL, Redshift, JSON, Data Pipelines, SQL Server 2012, MySQL, PostgreSQL

Other

Data Engineering, Data Build Tool (dbt), Dashboards, Data Visualization, Data Analysis, Analytics, Visualization, Data Modeling, Fivetran, Telecom Business Support Systems (BSS), Revenue Assurance, Key Performance Indicators (KPIs), BI Reports, Data Analytics, Google BigQuery, Mode Analytics, ELT, Data Architecture, Business Intelligence (BI) Platforms, ETL Tools, Data Warehousing, Data Warehouse Design, Performance Tuning, TCP/IP, Network Protocols, Amplitude, Relational Database Services (RDS), Prefect, Shell Scripting, Dynamics CRM 365, IT Project Management, Software Engineering, Query Optimization, Airtable, dbt Cloud

Libraries/APIs

Tray.io, Pandas, NumPy

Tools

Apache Airflow, BigQuery, Microsoft Excel, Looker, GitHub, Microsoft Dynamics CRM, Shell, Amazon Virtual Private Cloud (VPC), Microsoft Power BI, Jupyter, Tableau, Stitch Data

Platforms

Oracle, Amazon Web Services (AWS), Amazon EC2, Salesforce, Unix, Windows, Jupyter Notebook, Shopify, Klaviyo

2011 - 2012

Master of Business Administration (MBA) in Project Management

Fundação Getulio Vargas (FGV) - Florianopolis, SC, Brazil

2002 - 2007

Bachelor's Degree in Computer Engineering

Universidade Metodista de São Paulo - Sao Bernardo do Campo, SP, Brazil
