Daphne Liu, Developer in Jacksonville, FL, United States

Daphne Liu

Verified Expert in Engineering

Bio

Daphne is a highly motivated big data analytics architect and SQL/Tableau developer with strong business analytics solution delivery skills and 20 years of progressively responsible OLTP/OLAP database development and architecture experience. She is a frequent seminar speaker and workshop trainer in business intelligence and analytics solutions. Daphne is experienced in collaborating with business users on data modeling and business analytics solutions.

Portfolio

CEVA Logistics
Neural Networks, Performance Tuning, Time Series Analysis, Time Series...
City of Jacksonville
MDX, Microsoft SQL Server, SQL Server Integration Services (SSIS), SSAS...
Crowley Maritime
Subversion (SVN), SQL Server Reporting Services (SSRS), T-SQL (Transact-SQL)...

Experience

Availability

Part-time

Preferred Environment

Amazon Web Services (AWS), Azure, Google Cloud, Big Data, Linux, SQL

The most amazing...

...thing about me is that I am a data prodigy. I am an expert in SQL development, data modeling, data warehouse development, data analytics, and visualization.

Work Experience

Big Data ML AI Architect

2014 - PRESENT
CEVA Logistics
  • Created a dimensional data model on MS SQL Server for supply chain analytics, covering data preparation and labeling, model feature selection, algorithm selection, and hyperparameter optimization.
  • Designed an enterprise big data analytics platform using Pentaho PDI, Cassandra, Elasticsearch, and Grafana for big data visualization.
  • Performed data engineering tasks: refreshing data to cloud storage from MS SQL Server, transforming relational OLTP data to OLAP, and running ETL jobs from MS SQL Server to NoSQL data lakes in Cassandra.
  • Implemented both Tableau and Power BI analytics visualizations for supply chain management and a freight management system.
  • Implemented and completed data modeling for an enterprise data warehouse, data lakes, and ML/AI forecast models.
  • Used Facebook Prophet (a time series forecasting library), AutoKeras classification, and Google TensorFlow to deliver an ML and AI solution for a ground logistics TMS (a forecasting sketch follows the technology list below).
  • Delivered dashboards using both Tableau and Power BI for different clients and projects.
  • Delivered a data quality solution using PostgreSQL fuzzy string matching and the Python FuzzyWuzzy library to clean data and create mapping groups for the machine learning model (a matching sketch also follows below).
  • Designed and architected a supply chain carrier advisor ML solution covering data labeling, feature selection, hyperparameter optimization, and algorithm training, delivering smart carrier-selection recommendations to the supply chain management team.
  • Deployed the supply chain carrier advisor ML model based on TensorFlow TF-Ranking, AutoKeras, and neural network algorithms, training it on over a million records and exposing the results through a training API and batch forecast references.
Technologies: Neural Networks, Performance Tuning, Time Series Analysis, Time Series, AutoKeras, Pandas, Python, Feature Selection, Data, Machine Learning, Data Architecture, OLAP, NoSQL, SQL, PostgreSQL, Microsoft SQL Server, Tableau, Hortonworks Data Platform (HDP), Grafana, Elasticsearch, Cassandra, Pentaho, Data Analysis, Big Data, Snowflake, Microsoft Excel, Amazon S3 (AWS S3), Database Design, Database Schema Design, Business Intelligence (BI), Integration, Amazon QuickSight
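
The time series bullets above rely on Facebook Prophet. As a minimal, illustrative sketch only (the input file, column names, and 30-day horizon are hypothetical placeholders, not project artifacts), a Prophet forecast of daily shipment volume looks roughly like this:

  # Prophet sketch: forecast daily shipment volume 30 days ahead.
  # Prophet expects a dataframe with columns 'ds' (date) and 'y' (value).
  import pandas as pd
  from prophet import Prophet  # older installs import from fbprophet

  history = pd.read_csv("daily_shipments.csv")   # hypothetical input file
  model = Prophet()                              # default trend and seasonality
  model.fit(history)

  future = model.make_future_dataframe(periods=30)
  forecast = model.predict(future)
  print(forecast[["ds", "yhat", "yhat_lower", "yhat_upper"]].tail())

The data quality bullet pairs PostgreSQL fuzzy string matching with the Python FuzzyWuzzy library. A sketch of the FuzzyWuzzy side, with invented carrier names standing in for the real reference data:

  # FuzzyWuzzy sketch: map a dirty source value onto a clean reference list.
  # The carrier names and the 85-point threshold are illustrative only.
  from fuzzywuzzy import fuzz, process

  reference = ["ACME Freight", "Blue Line Logistics", "Northstar Carriers"]
  raw_value = "acme frieght inc"

  best_match, score = process.extractOne(
      raw_value, reference, scorer=fuzz.token_sort_ratio
  )
  if score >= 85:
      print(f"{raw_value!r} -> {best_match!r} (score {score})")
  else:
      print(f"{raw_value!r} left unmapped for manual review")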

BI Architect

2013 - 2015
City of Jacksonville
  • Architected Microsoft business intelligence solutions using SQL Server, SSIS, and SSAS.
  • Built SSAS cubes and MDX queries.
  • Developed SSIS packages and designed a data warehouse.
  • Designed and developed a Microsoft Power BI solution.
Technologies: MDX, Microsoft SQL Server, SQL Server Integration Services (SSIS), SSAS, Microsoft Power BI, Microsoft Excel, Amazon S3 (AWS S3), Database Design, Database Schema Design, Reporting, Business Intelligence (BI), Integration

BI Architect

2012 - 2013
Crowley Maritime
  • Built a Microsoft business intelligence SSAS cube for budget vs. actual analysis.
  • Implemented SSIS ETL from Oracle and IBM Db2 sources.
  • Created the T-SQL dimensional data model for the Crowley vessel captain's log.
  • Developed SSRS reports.
  • Implemented SVN source version control.
Technologies: Subversion (SVN), SQL Server Reporting Services (SSRS), T-SQL (Transact-SQL), IBM Db2, Oracle, SQL Server Integration Services (SSIS), SSAS, Microsoft Power BI, Microsoft Excel, Database Design, Database Schema Design, Reporting, Business Intelligence (BI), Integration

Tableau Dashboard Development

https://public.tableau.com/profile/daphne.liu#!/
Tableau dashboard designs for supply chain carrier KPIs, financial management KPIs (AP vs. AR), and shipment on-time performance. Implemented Tableau actions, KPI calculated columns, LOD (level of detail) calculations, dynamic slicers, and performance tuning.

Big Data Cassandra & Solr Document Search

SolrCloud free-text search engine design for vendor EDI documents using the Solr data import module, with the source data stored in a Cassandra cluster. A Hadoop HDFS file system was implemented for Solr document index storage, with six Solr collections spread across shards and replicas. Deployed in March 2016.
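
The search engine itself was built with the Solr data import module; purely to illustrate how such a collection can be queried from Python, here is a pysolr sketch in which the URL, collection name, query, and field names are all hypothetical:

  # Illustrative pysolr query against a SolrCloud collection.
  # URL, collection, and field names are invented placeholders.
  import pysolr

  solr = pysolr.Solr("http://solr.example.com:8983/solr/edi_documents", timeout=10)

  # Free-text search over indexed EDI documents, newest first.
  results = solr.search("invoice AND carrier:ACME", rows=10, sort="doc_date desc")
  for doc in results:
      print(doc.get("id"), doc.get("doc_type"))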

Big Data Cassandra & Elasticsearch Data Warehouse

Designed and implemented a big data NoSQL Cassandra and Elasticsearch cluster solution. The Elasticsearch search engine was built on top of the Cassandra cluster, and the Pentaho PDI ETL tool moved data from relational databases into the Cassandra NoSQL clusters for the enterprise data warehouse. Started in 2017 and deployed in July 2018.
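
The ETL in this project was done with Pentaho PDI; purely as an illustration of the data flow, the following Python sketch uses the DataStax Cassandra driver and the Elasticsearch client, with the node addresses, keyspace, table, and index names all invented:

  # Illustrative Cassandra -> Elasticsearch flow (the real ETL used Pentaho PDI).
  # Hosts, keyspace, table, and index names are hypothetical placeholders.
  from cassandra.cluster import Cluster
  from elasticsearch import Elasticsearch, helpers

  session = Cluster(["10.0.0.1"]).connect("warehouse")
  es = Elasticsearch("http://10.0.0.2:9200")

  rows = session.execute("SELECT shipment_id, origin, status FROM shipments")

  actions = (
      {"_index": "shipments", "_id": row.shipment_id,
       "_source": {"origin": row.origin, "status": row.status}}
      for row in rows
  )
  helpers.bulk(es, actions)   # bulk-index the Cassandra rows into Elasticsearch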

Dimensional Data Model for Supply Chain Management and Financial Management

SCM and FM dimensional models built on top of the current SQL Server data store. These models provide internal and external customers with data sources for business analytics. The solution was developed in T-SQL, SSIS, and SQL Server 2016.
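
The models were developed in T-SQL and SSIS; purely to illustrate the dimensional split they perform, here is a pandas sketch (with invented shipment columns) that derives a carrier dimension and a fact table from a flat OLTP extract:

  # Pandas illustration of a star-schema split; the real models are T-SQL/SSIS.
  # The shipment columns below are invented for the example.
  import pandas as pd

  oltp = pd.DataFrame({
      "shipment_id": [1, 2, 3],
      "carrier_name": ["ACME", "Blue Line", "ACME"],
      "freight_cost": [120.0, 95.5, 210.0],
  })

  # Dimension: one row per distinct carrier, with a surrogate key.
  dim_carrier = oltp[["carrier_name"]].drop_duplicates().reset_index(drop=True)
  dim_carrier["carrier_key"] = dim_carrier.index + 1

  # Fact: measures keyed by the surrogate key, natural key dropped.
  fact_shipment = oltp.merge(dim_carrier, on="carrier_name")[
      ["shipment_id", "carrier_key", "freight_cost"]
  ]
  print(dim_carrier)
  print(fact_shipment)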

Supply Chain Carrier Advisor — Machine Learning Model

Carrier Advisor is a machine learning project that recommends carriers to operators in a supply chain management system.
I built the ML and AI model from OLAP data by labeling data, selecting features and algorithms, building a proof of concept with AutoML, and performing the final production deployment using AutoKeras and TensorFlow TF-Ranking. Data was transformed from OLAP into the prediction models using Python and Pentaho PDI.
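
The production model combines AutoKeras with TensorFlow TF-Ranking. As a minimal sketch of the AutoKeras classification side only (the feature matrix and the three-carrier label set below are invented toy data), the training loop looks roughly like this:

  # AutoKeras structured-data classification sketch; the TF-Ranking stage is
  # not shown, and the features/labels are random toy data, not project data.
  import numpy as np
  import autokeras as ak

  x_train = np.random.rand(200, 4).astype("float32")     # 4 shipment features
  y_train = np.random.randint(0, 3, size=200)             # 3 hypothetical carriers

  clf = ak.StructuredDataClassifier(max_trials=3, overwrite=True)
  clf.fit(x_train, y_train, epochs=5)

  # Predict carriers for new shipments and export the best Keras model found.
  preds = clf.predict(np.random.rand(5, 4).astype("float32"))
  best_model = clf.export_model()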
Education

1993 - 1995

Master's Degree in Computer Information Science & Engineering

University of Florida - Florida

Certifications

MARCH 2014 - MARCH 2016

Tableau

Libraries/APIs

Pandas, AutoKeras, TensorFlow Deep Learning Library (TFLearn)

Tools

AutoML, Tableau, Grafana, Pentaho Data Integration (Kettle), Microsoft Excel, Amazon QuickSight, H2O AutoML, Apache Solr, ARIMA, Prophet ERP, Solr, Superset, Microsoft Power BI, SSAS, Subversion (SVN)

Languages

Python, T-SQL (Transact-SQL), SQL, Snowflake, Python 3, MDX

Paradigms

OLAP, Database Design, Business Intelligence (BI)

Platforms

Dataiku, Linux, Amazon EC2, Azure, SolrCloud, Apache Kafka, Pentaho, Hortonworks Data Platform (HDP), Oracle, Amazon Web Services (AWS)

Storage

Microsoft SQL Server, OLTP, NoSQL, Elasticsearch, Amazon S3 (AWS S3), Redshift, Google Cloud, Cassandra, Druid.io, SQL Server Integration Services (SSIS), IBM Db2, SQL Server Reporting Services (SSRS), PostgreSQL

Frameworks

Hadoop, AWS HA

Other

Data Analysis, Apache Cassandra, Big Data Architecture, Data Virtualization, Data Warehouse Design, Data Modeling, Data Architecture, Big Data, Forecasting, Time Series, AWS Database Migration Service (DMS), Database Schema Design, Integration, Data Engineering, Informatica, Artificial Intelligence (AI), Classification Algorithms, Data Science, Machine Learning, Neural Networks, Agile Data Science, Linear Regression, Logistic Regression, Reporting, Feature Selection, Performance Tuning, Classification, Data, Time Series Analysis
