- Director of Business Intelligence, Refinery29, 2016 - 2017
Technologies: Machine learning, statistical modelling, PostgreSQL, SQL Server, Python, Scikit-learn, NLTK, D3.js
- Created a natural language processing solution in Python and SQL that downloads 9 million+ tweets per day from the Twitter API and identifies the trending topics relevant to our Google AdWords keywords. Published the trending topics in an interactive D3.js webpage so writers and editors know which topics to cover.
- Wrote a Python script, backed by SQL stored procedures, that downloads all of the Facebook posts and tweets from 800+ celebrities every day and extracts the language elements and shared entities. Published the common entities (themes, memes, hashtags) in an interactive D3.js website so editors and writers can incorporate celebrity content into their articles.
- Created a workflow tracking website for video production. It tracks each video from initial project budget through production and post-production, rights and clearances, finalization of the video assets, and publication on social platforms, so we know what is in the video pipeline, ensure it has proper approvals, and tie video performance back to budget.
- Architected and managed a SQL Server instance for email analysis. The server collects all email data from our email service provider (Sailthru) via their API and stores it in a data warehouse I designed. I used this data to identify which kind of emailed content each person responds to best, so we could tailor each person's emails to their best-matching content category.
- Analyzed our email recipient behavior (via Python NumPy/Pandas/scikit-learn) to identify clusters of people who respond best to particular kinds of content. We used this in our re-engagement strategy, emailing less-engaged people only their best-matching content to spur re-engagement. Our unsubscribe rate dropped by 12% among the less-engaged group and 6% overall.
- Created a custom interactive data visualization website that presents Refinery29's most engaging topics as a word cloud, shows their historical trend lines, and presents recent stories that cover each topic.
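The recipient-clustering approach above can be sketched roughly as follows. This is a minimal illustration with synthetic data and invented category names (the real pipeline ran against the Sailthru data warehouse): each recipient is represented by a vector of per-category open rates, KMeans finds behavioral clusters, and each cluster's best-matching category is the one its members open most.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical content categories; each row below is one recipient's
# open rate for each category (synthetic data, two behavior profiles
# so the clusters are recoverable).
categories = ["beauty", "fashion", "wellness", "entertainment"]
n_users = 200
profile_a = rng.beta(8, 2, size=(n_users // 2, 1)) * np.array([1.0, 0.2, 0.2, 0.3])
profile_b = rng.beta(8, 2, size=(n_users // 2, 1)) * np.array([0.2, 1.0, 0.3, 0.2])
X = np.vstack([profile_a, profile_b])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# For each cluster, the best-matching category is the one with the
# highest mean open rate among its members.
for label in range(2):
    centroid = X[kmeans.labels_ == label].mean(axis=0)
    print(f"cluster {label}: best match = {categories[int(np.argmax(centroid))]}")
```

In production, the per-user rate vectors would come from the email warehouse, and each less-engaged user would then be mailed only their cluster's best-matching category.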
- Director of Business Intelligence, TheStreet.com, 2011 - 2016
Technologies: Machine learning, statistical modelling, SQL Server, Oracle, Python, R, D3.js, NumPy/Pandas/scikit-learn, Excel VBA
- Built a set of logistic regression models (one per product) for email targeting. By making email messaging more relevant, they improved response rates by 18% while maintaining email volume. I did the analysis in R (GLM) and implemented the solution in SQL Server.
- Built regression and clustering models that identify likely fraud, letting us proactively cancel fraudulent orders and reducing our chargeback rate by 30%.
- Created an automated reporting tool for landing page testing that makes registration flow optimization quick and accurate.
- Built a set of machine learning classification algorithms (using Python scikit-learn) that identify the leads with the highest purchase likelihood for upgrade and cross-sell offers.
- Created a website (the “Telesales Dispatcher”) that presents the highest quality leads to our telesales agents each day, based on statistical models of purchase likelihood that I developed.
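The lead-scoring idea behind the "Telesales Dispatcher" can be sketched as follows. The feature names and data here are invented for illustration; the actual models and feature sets are not specified in detail above. A classifier is trained on past outcomes, and each day's leads are ranked by predicted purchase probability so agents call the best prospects first.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

# Hypothetical lead features: e.g. recent site visits, quote-page
# views, trial flag (synthetic standard-normal stand-ins here).
n = 500
X = rng.normal(size=(n, 3))
# Synthetic purchase outcome, driven mostly by the first two features.
logits = 1.5 * X[:, 0] + 1.0 * X[:, 1] - 0.5
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Score the day's leads and hand the highest-probability ones to
# telesales first.
scores = model.predict_proba(X)[:, 1]
dispatch_order = np.argsort(scores)[::-1]
top_ten = dispatch_order[:10]
```

The dispatcher website would then simply render `dispatch_order`, refreshed daily, as each agent's call queue.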
- Director of Research and Analytics, eMusic, Inc., 2007 - 2011
Technologies: Machine learning, statistical modelling, SQL Server, R, Excel VBA
- Built a suite of automated database reporting applications, using Excel (with VBA) as a client for SQL Server data, providing visibility into all of the marketing data and company key metrics, including signups, web conversion rates, email open and click rates, site usage, churn metrics, and geo-mapping.
- Built a web scraping tool that retrieves song and album prices from Amazon and iTunes, so that we could strategically price our music catalog to best comparative advantage. This tool improved our overall prices by 15%.
- Developed churn and upsell models of our customers via multivariate logistic regression, used for enhanced targeting of our member communications and offers. Improved upsell rates by 13% while improving retention by 8%.
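A multivariate logistic regression churn model of the kind described above can be sketched as follows. The features (tenure, downloads, support contacts) and data are hypothetical stand-ins; the original work used R on SQL Server data. The fitted coefficients, exponentiated into odds ratios, show each feature's multiplicative effect on churn odds, which is what drives the targeting of communications and offers.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# Hypothetical member features: tenure in months, downloads last
# month, and whether the member contacted support (synthetic data).
n = 1000
tenure = rng.integers(1, 48, n)
downloads = rng.poisson(5, n)
support = rng.integers(0, 2, n)
X = np.column_stack([tenure, downloads, support])

# Synthetic churn outcome: low usage and support contacts raise risk.
logits = -0.05 * tenure - 0.3 * downloads + 1.2 * support + 1.0
churn = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, churn)

# Odds ratios: values below 1 are protective (e.g. heavy downloaders
# churn less), values above 1 flag elevated churn risk.
odds_ratios = np.exp(model.coef_[0])
```

Members with the highest predicted churn probability would then be the ones targeted with retention offers.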
- Director of Direct Marketing and Analytics, EarthLink, Inc., 2000 - 2007
Technologies: Machine learning, statistical modelling, SQL Server, SAS, Excel VBA
- Developed, implemented, and analyzed the direct marketing strategy for EarthLink's dialup products, including both EarthLink's flagship dialup internet brand and the PeoplePC value brand. Managed a small team of data scientists and an overall annual marketing budget of $60 million.
- Optimized marketing spend across direct response TV, solo and shared mail, sponsorships, promotions, and field marketing. Generated more than 850,000 members through my channels in 2007, beating plan by 10%.
- Created the “FrontLine Strategizer”: a SQL Server database application (with Excel / VBA client) that builds aggregated monthly forecasts out of campaign-level inputs. Integrated with my real-time direct marketing response projections, this tool enabled me to react quickly to campaign performance and optimize budget allocation.
- Managed a team of marketing managers and data analysts distributed across the San Francisco and Atlanta offices, building their direct marketing skills and helping them deliver subscribers on time and under budget. Also managed a stable of marketing and media buying agencies that assisted with all of our direct marketing efforts.
- Built the “MACalyzer” – a SQL Server database application for direct response TV reporting that reduced the member acquisition costs in the television channel by 18%. This tool ties 400+ dedicated phone numbers to their associated advertising spend, and identifies the resulting orders so that we can track the ROI and member acquisition cost for each airing of our commercial.
- Developed a direct mail reporting and analysis engine in SQL Server with an Excel user-interface. Wrote all the SQL code that loads mail recipients, matches them to mail respondents, and reports response rates via an OLAP cube with more than 20 demographic and marketing dimensions. This reporting and analysis tool was in use for at least seven years after I built it.
- Built the “Churn Toaster”: An OLAP-style SQL Server application providing visibility into the churn rates during individual calendar months, allowing users to explore each month’s churn along a variety of dimensions (acquisition channel or partner, offer, customer tenure, payment method, and voluntary vs involuntary churn).
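The kind of OLAP-style rollup the "Churn Toaster" provided can be sketched in pandas (the original was a SQL Server application; the table below is synthetic and the dimension values are invented for illustration). A pivot table aggregates member and churn counts along chosen dimensions, and churn rate falls out of the ratio; swapping the index/columns arguments slices the same data along any other dimension.

```python
import pandas as pd

# Hypothetical monthly churn records, pre-aggregated by dimension.
df = pd.DataFrame({
    "month": ["2006-01"] * 4 + ["2006-02"] * 4,
    "channel": ["TV", "TV", "mail", "mail"] * 2,
    "payment": ["card", "check"] * 4,
    "voluntary": [True, False, True, True, False, True, False, True],
    "members": [120, 80, 60, 40, 110, 90, 50, 45],
    "churned": [12, 16, 3, 4, 11, 18, 5, 9],
})

# OLAP-style rollup: churn rate by month and acquisition channel.
# Changing index/columns (e.g. to "payment" or "voluntary") slices
# the cube along a different dimension.
cube = df.pivot_table(index="month", columns="channel",
                      values=["members", "churned"], aggfunc="sum")
churn_rate = cube["churned"] / cube["members"]
print(churn_rate.round(3))
```

The production version exposed the same slice-and-dice interaction through a SQL Server cube rather than in-memory pandas.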