Felipe Bonzanini, Data Warehouse Design Developer in Campinas - State of São Paulo, Brazil
Felipe Bonzanini

Data Warehouse Design Developer in Campinas - State of São Paulo, Brazil

Member since August 2, 2017
For the past nine years, Felipe has worked as a data specialist. He has administered, integrated, and maintained data for various projects, specializing in business intelligence solutions. He is more than proficient in database development and administration, data modeling, ETL, and data visualization. For most of his career, Felipe worked at a well-known multinational and has the enterprise knowledge to drive scalable data solutions.

Portfolio

  • Movile
    Amazon Web Services (AWS), Google Cloud Platform (GCP), Redshift...
  • Ambev
    Microsoft SQL Server, Talend, Azure, Project Management, SQL
  • AMARO
    Looker, Stitch Data, Pentaho, DbSchema, Talend, Python, SQL, Apache Airflow...

Experience

Location

Campinas - State of São Paulo, Brazil

Availability

Part-time

Preferred Environment

Amazon Web Services (AWS), Looker, Apache Airflow, Talend, PostgreSQL, Linux

The most amazing...

...project I've built from scratch is a data warehouse that enabled financial projections with machine learning algorithms, using only open-source tools.

Employment

  • BI Specialist

    2018 - PRESENT
    Movile
    • Coordinated the team's task development using the Scrum methodology.
    • Designed new projects for the BI team.
    • Provided insights to the business team based on data analysis.
    • Created dashboards that tell a story through data.
    • Developed and managed all of the company's KPIs.
    • Provided weekly and monthly numbers for the accounting report.
    Technologies: Amazon Web Services (AWS), Google Cloud Platform (GCP), Redshift, Periscope Data, Pentaho, BigQuery, SQL
  • Data Engineer

    2018 - 2018
    Ambev
    • Developed data solutions, including data visualization and data science input datasets, for many sales projects.
    • Mapped the data sources required for data science algorithms and brought them into the SQL Server data warehouse using Talend and shell scripting.
    • Created and maintained many stored procedures and triggers to perform data manipulation and transformation inside the database.
    • Maintained all aspects of the SQL Server data warehouse on Azure, including cost and performance.
    • Ensured that the relationship with tech suppliers was clear and that the job was being done according to the requirements.
    Technologies: Microsoft SQL Server, Talend, Azure, Project Management, SQL
  • Data Engineer

    2017 - 2018
    AMARO
    • Created data models for the BI team and the business areas, mainly star schema modeling (Kimball).
    • Built and improved the ETL/ELT process using SQL, Talend, Airflow/Python, and Stitch Data.
    • Maintained an up-to-date master data management system.
    • Structured data in flat tables to enable data scientists to analyze the data.
    • Worked with the marketing team to create and automate performance marketing metrics (CAC, CPC, CLV, LTV, and more).
    • Tuned a Redshift data warehouse by tuning distribution and sort keys.
    • Researched and implemented hands-on testing of the new methods and tools to create a highly functional state-of-the-art BI stack.
    Technologies: Looker, Stitch Data, Pentaho, DbSchema, Talend, Python, SQL, Apache Airflow, AWS S3, Redshift
  • ETL Developer

    2016 - 2017
    IBM
    • Developed and maintained robust ETL jobs using IBM DataStage.
    • Maintained DDLs and created data models for new sources.
    • Managed the weekly data refresh and weekly test case jobs.
    • Understood client needs and translated them into technical requirements and, ultimately, real data.
    • Maintained and supported the development of analytical jobs that generate predictions for complex project data using IBM SPSS.
    • Managed the servers' architecture and infrastructure as well as cloud server provisioning.
    Technologies: D3.js, Talend, SQL, IBM InfoSphere (DataStage)
  • Database Developer

    2014 - 2016
    IBM
    • Developed and maintained a high business impact data warehouse as well as its data model.
    • Optimized the entire data warehouse's architecture and improved ETL job performance through query and database tuning.
    • Successfully migrated and converted the reporting data warehouse to Netezza.
    • Proposed and implemented solutions to improve system efficiencies and reduce total expenses.
    • Created complex new views to meet the data pull needs of other projects.
    Technologies: Netezza, Data Modeling, PostgreSQL, Linux, DataStage, Cognos 10, IBM Db2
  • Database Administrator

    2010 - 2014
    IBM
    • Supported more than 500 production databases as L2 DB2 support.
    • Coordinated several version upgrades and server migrations.
    • Automated the servers' security health checks with shell and SQL scripts, saving 90% of the total task time.
    • Advocated for changes as a management expert.
    • Troubleshot very complex problems for large-scale database systems.
    • Supervised a night shift while on call.
    Technologies: Cognos 10, Linux, AIX, IBM Db2
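
The automated health checks described under the database administrator role can be sketched roughly as follows. The admin view is a standard DB2 catalog view, but the threshold and the exact check are illustrative assumptions, not the actual scripts:

```sql
-- Illustrative sketch of a scripted DB2 health check (the 90% threshold is an assumption).
-- A shell wrapper would run a query like this against each database on a schedule
-- and collect the flagged results, replacing manual per-server checks.
SELECT tbsp_name,
       tbsp_utilization_percent
FROM   sysibmadm.tbsp_utilization
WHERE  tbsp_utilization_percent > 90   -- flag tablespaces above 90% used
ORDER  BY tbsp_utilization_percent DESC;
```

Automating checks of this shape across 500+ databases is what turns a day of manual review into minutes of reading a consolidated report.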

Experience

  • DWH for IBM Maintenance Contracts

    In this project, I assumed the BI architect position to create and maintain a robust data warehouse that gathers data from different sources (sales, contracts, maintenance, customers, ledgers, and more) and combines them while preserving data integrity.

    The initial version, which was already fairly complete, launched in approximately four months, and the project is ongoing with minimal maintenance needs. Since it was built in an IBM environment, I chose DB2 10.1 as the database manager, managed the ETL SQL scripts with IBM Data Server Manager, and used Cognos 11 for data visualization.

    As this is a private project, there is no link to share.

  • Data Visualization for IBM Badges Program
    https://imgur.com/a/tu4nRG6

    Once IBM had launched the IBM Open Badge Program (IBM.com/training/badges), we realized that no current app fit our needs. We needed one that would gather all employee data, match it to the badge data, and compare it to the market.

    This realization led me to start developing the project on my own, and I was able to build a nice data model and ETL scripts to handle the data transformation. I used IBM Data Server Manager to manage these transformation jobs and a MySQL database hosted in the cloud (IBM Bluemix).

    Visualization Project Details (Two Methods):
    1. With the help of a web-developer friend, I put beautiful charts on a page using the D3.js library (advanced analysis).
    2. For the second method, I used Watson Analytics (a web-based, Tableau-like tool) to create control dashboards for first-line managers.

    The target audience of this side project consisted of high-level managers so that they could plan their team development as well as assign the right people to the right projects.

    I launched this project within three months of the IBM Open Badge Program launch, and the timing was just perfect; upper executive management officially recognized my work.

    I've attached a couple of screenshots of the managers' dashboards as this data is public.

  • Performance Marketing Spend Tracking

    In this project, my team and I created a cube for the performance marketing team to control their spending and to allocate budget to the better-performing campaigns. I extracted data from many sources, such as Google Analytics, AppsFlyer, AdWords, Criteo, Bing, and Facebook, and combined them with sales data to calculate metrics like conversion rate, churn, cost per sale, LTV, CPA, and CAC across many dimensions.

    Together with the marketing team, we defined transformation rules for the UTMs on the web, and deep links on mobile, to aggregate them by channel groups (e.g., SEO, retargeting, social media, and so on).

    Most of my work consisted of extensive, interactive collaboration with the marketing team to validate this data, which led to several changes to the UTM classification to enable proper control. I had the opportunity to learn a great deal about digital marketing and to deeply understand how data analysis relates to performance marketing.

    Our new product replaced all of the worksheets and the manual work of pulling data from the sources; it also fed all of this data into an automated dashboard and a cube for slicing and dicing the data.

    Unfortunately, I cannot share a link as this is a private project.

  • Decommission, Deployment, and Upgrades of DB2 Databases

    For two years, I was in charge of various types of server management for American Express projects.

    I needed to determine whether to shut down databases due to lack of usage, which would save money, as well as whether new databases were needed for existing or new projects. I was also the focal point for database migrations and upgrades, which usually happened during weekend maintenance hours.

    In this position, I was responsible for 100+ decommissions, 20+ upgrades, and 200+ database creations. I worked as part of a big database team that handled a significant data center migration.

  • Subscription Model KPIs

    At PlayKids, one of the companies of the Movile group, I had the opportunity to restructure all indicators and KPIs so that high-level management could have a more accurate view of the business.

    I suggested a new methodology, specific to the subscription business model, that creates market KPIs without the noise you'd get from a traditional data model like those used for eCommerce. We also recalculated the active user base, churn, conversion, and trial success rate, along with other indicators.
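
As a sketch of the kind of subscription KPI described above, monthly churn can be computed from a period summary; the table and column names here are hypothetical, not from the actual project:

```sql
-- Hypothetical monthly churn calculation for a subscription model.
-- churn_pct = subscribers canceled during the month / subscribers active at month start
SELECT month_start,
       starting_subscribers,
       canceled_in_month,
       ROUND(canceled_in_month * 100.0 / NULLIF(starting_subscribers, 0), 2)
           AS churn_pct
FROM   monthly_subscription_summary
ORDER  BY month_start;
```

Conversion and trial success rate follow the same pattern, dividing the relevant cohort counts for each period instead.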

Skills

  • Languages

    SQL, Python, JavaScript
  • Tools

    Looker, IBM InfoSphere (DataStage), Stitch Data, BigQuery, Periscope Data, DbSchema, Informatica PowerCenter, Apache Airflow
  • Paradigms

    Business Intelligence (BI), ETL
  • Storage

    IBM Db2, PostgreSQL, Netezza, DataStage, Redshift, AWS S3, Microsoft SQL Server
  • Other

    Data Warehousing, Data Warehouse Design, Data Modeling, Shell Scripting, Agile Sprints, AWS, Cognos 10
  • Platforms

    Talend, Linux, AIX, Azure, Google Cloud Platform (GCP), Amazon Web Services (AWS), Pentaho
  • Libraries/APIs

    D3.js
  • Industry Expertise

    Project Management

Education

  • Specialization in Business Intelligence (Big Data)
    2016 - 2017
    DeVry Metrocamp - Campinas-SP, Brazil
  • Bachelor's degree in Information Systems
    2009 - 2013
    Pontifícia Universidade Católica - Campinas-SP, Brazil

Certifications

  • DB2 Advanced Database Administrator v10.1
    MAY 2014 - PRESENT
    IBM
  • ITIL Foundations v3
    MAY 2013 - PRESENT
    EXIN
