Neil Schwalb, Software Developer in Atlanta, GA, United States
Member since June 17, 2021
Neil is an experienced data engineer focused on helping businesses make the most of their data through informed decision-making. He specializes in ETL processes, especially the transformation step, turning raw data into curated tables so that others can focus on what matters in their work.





Location

Atlanta, GA, United States



Preferred Environment

Apache Airflow, Python, SQL, Google Cloud Platform (GCP), Linux

The most amazing...

...thing I've developed is a custom data pipeline that transforms 50+ TB of data into analytics-first tables that drive company-wide reporting and analysis.


Experience

  • Senior Data Software Engineer at Mailchimp

    2019 - PRESENT
    • Designed and built in-house BI transformation pipelines in Airflow and Google Dataflow to surface analytics-focused tables in BigQuery from which analysts, data scientists, and strategy teams derive insights.
    • Led and managed the implementation of Looker as our BI platform and continue to lead Looker engineering and modeling work, helping drive data-informed decision making.
    • Developed and implemented internal data security models and tools with Terraform, GSuite, GCP, and Looker APIs.
    • Co-led data governance efforts to centralize analytics output and content creation for senior leadership.
    Technologies: Apache Airflow, Cloud Dataflow, Google BigQuery, Google Analytics, Google Cloud Platform (GCP), Google Data Studio, Looker, Dataform, Node.js, Business Intelligence (BI), Data Analysis, ETL, SQL, Python, Reporting, BI Reporting, Tableau, DB, Data Engineering, Data Visualization, CSV, Paid Advertising
  • Data Researcher at Mailchimp

    2017 - 2019
    • Managed and executed short- and long-term data-driven research projects that informed company direction and application development.
    • Built and maintained data pipelines in Apache Airflow and Beam that consolidated, cleansed, and curated structured and unstructured data from self-hosted SQL and Elasticsearch instances into BigQuery for analysis and warehousing.
    • Used statistical algorithms such as linear and logistic regression and decision trees to extract correlations from large datasets and drive business strategy.
    • Embedded on multiple product teams to establish and maintain KPIs, forecast use of and perform A/B tests on new products, and prioritize work based on perceived impact.
    • Designed and ran A/B content experiments that led to a 2% increase in site engagement and a 9% increase in campaign creation.
    • Managed company-wide KPIs and reporting, surfaced through a custom-built website, to drive strategic direction.
    Technologies: Python, R, Google Data Studio, PostgreSQL, Apache Airflow, Google Analytics, Elasticsearch, BigQuery, Apache Beam, SQL, Data Analysis, Looker, Reporting, Data Visualization, CSV, Paid Advertising
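As background for the A/B experiments above, a two-proportion z-test of the kind commonly used to judge such results can be written with only the Python standard library. The traffic numbers below are illustrative, not the actual experiment data:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates.

    conv_a / n_a: conversions and sample size for control A.
    conv_b / n_b: conversions and sample size for variant B.
    Returns (z statistic, p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via the error function: Phi(x) = 0.5 * (1 + erf(x / sqrt(2))).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical 50/50 split: variant B converts at 9% vs. 8% for control A.
z, p = two_proportion_z_test(800, 10_000, 900, 10_000)
```

With these made-up numbers the difference is significant at the conventional 5% level; in practice the threshold and sample sizes would be fixed before the experiment runs.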
  • Senior Digital Analyst at Accenture

    2015 - 2017
    • Led the development of big data analytics projects to provide clients with unique insights into operating efficiencies and deliver value.
    • Manipulated and aggregated client data with tools such as Hive and R to feed into big data analytics platforms and BI dashboards.
    • Led both offshore and onshore development teams to deliver visualization dashboards and mobile applications.
    Technologies: R, SQL, Agile, Microsoft Power BI, Data Visualization, CSV, Business Services


Projects

  • SQL Scheduler and Dependency Manager

    Designed an analyst-friendly wrapper around Apache Airflow that enables non-engineers to upload queries to a repository and have them run automatically on a set schedule. Dependencies between queries in the repository, as well as on other core tables, are programmatically parsed and set in Airflow so that tables are refreshed with the most up-to-date data. This has enabled analysts and marketers across Mailchimp to build custom reports and tables that stay in sync with the main ETL jobs without requiring dedicated engineering resources.
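A minimal sketch of the dependency-parsing idea, assuming dependencies are inferred from FROM/JOIN clauses; the actual implementation and its table-matching rules are not shown here, and the repository contents below are invented:

```python
import re
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

def referenced_tables(sql: str) -> set:
    """Collect table names appearing after FROM or JOIN keywords."""
    return set(re.findall(r"\b(?:FROM|JOIN)\s+([\w.]+)", sql, flags=re.IGNORECASE))

def build_schedule(queries: dict) -> list:
    """Order queries so each runs after the queries that build its inputs.

    `queries` maps an output table name to the SQL that produces it.
    Referenced tables not produced by any uploaded query (e.g. core ETL
    tables) are treated as already-available upstream sources.
    """
    graph = {
        table: {dep for dep in referenced_tables(sql) if dep in queries}
        for table, sql in queries.items()
    }
    return list(TopologicalSorter(graph).static_order())

# Hypothetical analyst repository: the report depends on the signups table.
repo = {
    "daily_signups": "SELECT user_id FROM raw.events WHERE type = 'signup'",
    "signup_report": "SELECT COUNT(*) FROM daily_signups JOIN raw.accounts ON TRUE",
}
schedule = build_schedule(repo)  # 'daily_signups' is ordered before 'signup_report'
```

In a production system each entry in the resulting order would presumably become an Airflow task, with the parsed dependencies wired in as upstream tasks.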

  • BI Data Transformation Pipeline

    A bespoke data transformation pipeline that turns raw application, log, and other third-party data into analytics- and reporting-focused datasets. It processes over 50 TB of data every day and produces 30+ tables of governed, cleaned, and curated data, giving analysts and data scientists accurate and compliant inputs for all company dashboards and machine learning models. It was built using Airflow, a custom YAML template, and BigQuery.
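As a rough illustration of the YAML-template approach, here is a sketch that renders a declarative table config into a BigQuery DDL statement. The config keys and table names are invented for illustration, and the config is written as a Python dict rather than loaded from a YAML file to keep the sketch self-contained:

```python
def render_transform(config: dict) -> str:
    """Render a declarative table config into a BigQuery
    CREATE OR REPLACE TABLE statement (simplified sketch)."""
    cols = ",\n  ".join(
        f"{expr} AS {name}" for name, expr in config["columns"].items()
    )
    sql = (
        f"CREATE OR REPLACE TABLE {config['target']}\n"
        f"PARTITION BY {config['partition_by']} AS\n"
        f"SELECT\n  {cols}\nFROM {config['source']}"
    )
    if "filter" in config:
        sql += f"\nWHERE {config['filter']}"
    return sql

# Hypothetical config -- in the real pipeline one such entry per output
# table would be parsed from YAML and executed by an Airflow task.
config = {
    "target": "analytics.daily_logins",
    "source": "raw.app_events",
    "partition_by": "DATE(event_ts)",
    "filter": "event_type = 'login'",
    "columns": {
        "user_id": "user_id",
        "event_date": "DATE(event_ts)",
    },
}
print(render_transform(config))
```

Keeping the transformations declarative like this is what lets one pipeline fan out to 30+ tables: adding a table means adding a config entry, not writing new pipeline code.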


Skills

  • Languages

    SQL, Python, R, YAML
  • Tools

    Looker, BigQuery, Apache Airflow, Tableau, Google Analytics, Cloud Dataflow, Apache Beam, Microsoft Power BI
  • Paradigms

    Business Intelligence (BI), Agile, ETL, MEAN Stack
  • Platforms

    Linux, Google Cloud Platform (GCP)
  • Other

    Google BigQuery, BI Reporting, Data Visualization, CSV, Google Data Studio, Linear Regression, Data Analysis, Reporting, Data Engineering, Dataform, Machine Learning, Data Architecture, Paid Advertising, Business Services
  • Storage

    PostgreSQL, MySQL, DB, Elasticsearch, Google Cloud


Education

  • Post-Baccalaureate Degree in Computer Science
    2016 - 2017
    University of Florida - Gainesville, FL
  • Bachelor's Degree in Biological Engineering
    2010 - 2015
    University of Florida - Gainesville, FL


Certifications

  • MicroMasters in Statistics and Data Science
    MITx on edX
