Holly Grodsky
Verified Expert in Engineering
Query Optimization Developer
Holly's career began with a bachelor's degree in applied mathematics with an emphasis in statistics, along with minors in computer science and music. Since graduating, she has pursued database and business intelligence development. She continues to grow both intellectually and socially by running a few home businesses and by organizing and participating in tech and other meetup groups in her area.
Preferred Environment
Git, Confluence, PostgreSQL, SQL Server Management Studio (SSMS), macOS, Windows
The most amazing...
...project I've worked on was the rebranding of a startup. I built the data mart and had the chance to train the data scientists and analysts on the new environment.
Work Experience
Freelance Database and BI Consultant
Clients (via Toptal)
- Worked with clients on database and BI reporting needs.
Database and BI Consultant
Das42 Data Consulting Firm
- Worked for various clients, including Snap, Kohler, Glamsquad, Craftsy, and others that cannot be listed.
- Developed full Looker models from scratch including multiple dashboards, looks, and explores.
- Cleaned up a Looker instance with 400+ views, ten models, and 400+ explores. Customized the user/group management for more precise permission approval.
- Analyzed queries to help some companies cut over $100,000 in monthly BigQuery and Snowflake costs.
- Wrote scripts for the task automation of data ingestion from sources such as CSV or other database streams.
SQL Data Warehouse Developer
Craftsy (via ProKarma)
- Utilized Data Virtuality ETL to consolidate multiple data sources, including Amazon Redshift, NetSuite, PostgreSQL, Google Analytics (GA), Facebook, and CSV files.
- Determined how to parse JSON for ingestion of GA data.
- Learned Data Virtuality's proprietary SQL dialect, which is modeled on PostgreSQL.
- Became the resident expert of two new tools (Data Virtuality and DBeaver).
- Worked closely with Data Virtuality support as the new tool continued to mature.
- Built, improved, and maintained the virtual data warehouse following the Kimball method, using dimension and fact tables in a star schema.
- Prepared the warehouse prior to the launch of the new 2.0 website, then configured and updated it as new development occurred.
- Assisted in the logic construction for the finance and data analytics teams through the Looker BI Tool.
- Wrote documentation for end users and for ease of knowledge transfer using a Confluence wiki.
- Educated power users on the new data warehouse structure and on accessing data through Looker and DBeaver.
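One of the bullets above mentions working out how to parse JSON for ingesting GA data. A minimal sketch of that kind of flattening step, with invented field names (not the actual Google Analytics export schema):

```python
import json

# Hypothetical GA-style export record; the field names here are
# illustrative, not the real Google Analytics schema.
raw = json.dumps({
    "date": "2018-06-01",
    "sessions": [
        {"source": "google", "pageviews": 12},
        {"source": "direct", "pageviews": 7},
    ],
})

def flatten_ga_record(payload):
    """Flatten one nested JSON record into flat rows suitable
    for loading into a warehouse staging table."""
    record = json.loads(payload)
    return [
        (record["date"], session["source"], session["pageviews"])
        for session in record["sessions"]
    ]

rows = flatten_ga_record(raw)
```

Each nested session becomes one flat row, which is the shape a columnar staging table expects.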
Senior Business Intelligence Developer
Level 3 Communications
- Implemented query performance tuning (T-SQL, some PL/SQL and MySQL) and data cleansing/normalization.
- Correlated the data from 20+ data warehouses and inventory systems into a single database.
- Scheduled SSIS packages for ETL from Oracle, Access, MySQL, and flat file sources.
- Worked end-to-end on a ticket-tracking web application—planning, designing, development, and implementation.
- Worked at multiple levels to track projects, requests, bugs, and ad hoc tasks.
- Completed the build of an MS SQL relational database with keys and indexes; utilized functions and stored procedures for automated user group administration from LDAP.
- Created SSRS reports for C-level reporting and aggregation of data for end-user consumption.
- Assisted with the development of an in-house web application dashboard and workflow.
- Learned MVC with C#, HTML5, Razor, Bootstrap, JavaScript, CSS, lambda expressions, and Entity Framework.
- Implemented the integration of a pre-existing in-house web application.
- Built a secondary MS SQL database with keys and indexes.
- Improved and stabilized a Python web application.
- Converted a MySQL database into an MS SQL database with use of triggers, stored procedures, views, and user-defined functions.
- Converted Python ETL to scheduled SSIS packages.
- Regularly supported active users and produced additional functionality as needed.
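The query performance tuning mentioned above often comes down to giving the planner an index to seek on instead of forcing a full table scan. A minimal sketch using SQLite as a stand-in for SQL Server, with an invented tickets table:

```python
import sqlite3

# Invented ticket-tracking table for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE tickets (id INTEGER PRIMARY KEY, status TEXT, opened TEXT)"
)
conn.executemany(
    "INSERT INTO tickets (status, opened) VALUES (?, ?)",
    [("open" if i % 3 else "closed", f"2016-01-{i % 28 + 1:02d}")
     for i in range(1000)],
)

query = "SELECT id, opened FROM tickets WHERE status = 'open'"

# Without an index, the plan typically shows a full-scan step.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# An index on the filtered column lets the planner seek instead.
conn.execute("CREATE INDEX idx_tickets_status ON tickets (status)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
```

The last column of each plan row is the detail string; before indexing it describes a scan of `tickets`, and afterward a search using `idx_tickets_status`. The same discipline applies to T-SQL, where `SET STATISTICS IO` or the execution plan plays the role of `EXPLAIN QUERY PLAN`.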
SQL Software Engineer
Markit on Demand
- Carried out the post-production support of major financial websites.
- Investigated sources of issues (reported data, SQL code/logic, or web code).
- Fixed the logic in T-SQL or escalated web/data issues to the proper teams.
- Monitored and occasionally debugged hundreds of automated JavaScript tasks.
- Utilized internally-built automation tools for testing.
Application Support Engineer
Urban Lending Solutions
- Worked on T-SQL development in an Agile team.
- Introduced the organization to simple SSIS packages.
- Created team standards and automated T-SQL queries and stored procedures.
- Worked with C#/.NET, incorporating Microsoft Excel libraries, LINQ, and AutoHotkey.
- Wrote procedural documentation, requirement documents, UI mockups, and flow charts, and provided application and production support.
- Interacted with customers and routed requests.
- Trained all the new app support engineers.
Experience
Access Assurance Web Tool
Project Roles
· Project Manager
· Database Developer
· Report Developer Number 2
· Web Developer Number 3
We had one main web developer whose job was to combine all of our work and integrate it into our existing tools. The other developer split time roughly 80/20 between report development and web development.
The project took us about three months to complete from inception through planning, development, QA, and release.
Our end-user base consisted of about 40 users across data tracking and report consumption.
What I learned from this project was:
· Requirements gathering and weekly client-update meetings are important.
· Clients tend to change requirements as you go, especially if beforehand they didn't know exactly what they wanted.
· Test-driven development is an important skill.
· Make the project scalable, because sooner or later, you will need to add in something new.
Data Team Project and Time-tracking Tool
There were three developers in total. We discussed having a better way to track our work, and we started storing the numbers in a preliminary database. From there, the web developer would build the tool to fit into our existing internal C#/.NET web application. I wrote the initial SQL scripts for the team to experiment with and worked with our manager on finding what he needed to report on. A third developer and I also developed the SSRS reports.
I assigned tasks, with time frames, to the team members best qualified for each. We reported biweekly for progress updates and brainstorming sessions. In the end, the tool was used every day to track our work and projects. Our manager also used it for future project planning and for monthly C-level reporting.
What I learned from this project was:
· Every developer has their own way to track their work.
· Biweekly stand-ups are not enough; daily is probably better.
· Give the work to the people best suited to the task.
Craftsy Data Mart Build-out
My role was to work with the data scientists and analysts on presenting the data in a Kimball star schema (with fact and dimension modeling) in the new data mart. We used PostgreSQL and a tool called Data Virtuality to provide real-time views into our Google Analytics data and live databases.
After creating the new data mart, I then trained users on how to use it and helped analysts in building reports in Looker, a BI tool.
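The fact and dimension modeling described above can be sketched in miniature. Table names and figures here are invented, using SQLite for portability:

```python
import sqlite3

# A toy Kimball-style star schema: one fact table keyed to
# dimension tables. Names and values are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, date TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales  (
        date_key    INTEGER REFERENCES dim_date (date_key),
        product_key INTEGER REFERENCES dim_product (product_key),
        amount      REAL
    );
    INSERT INTO dim_date VALUES (1, '2018-06-01'), (2, '2018-06-02');
    INSERT INTO dim_product VALUES (1, 'class'), (2, 'kit');
    INSERT INTO fact_sales VALUES (1, 1, 29.99), (1, 2, 14.50), (2, 1, 29.99);
""")

# Analysts roll facts up along any dimension with simple joins.
totals = conn.execute("""
    SELECT p.name, ROUND(SUM(f.amount), 2)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.name
    ORDER BY p.name
""").fetchall()
```

Because every fact row joins to small, well-named dimensions, BI tools like Looker can expose the rollup above as a point-and-click explore.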
Snapchat Expense Reduction
I helped Snap cut costs by over $150,000 in queries alone and advised the team on how to keep costs down. I also provided best-practices documentation and direction on training going forward.
Things I learned from this project:
· Looker's admin explores are quite robust.
· I learned how to utilize BigQuery.
· If your only access to the database is through a reporting tool (such as Tableau, MS Report Builder, or Looker), you can get what you need, if it's available at all, only through a lot of roundabout solutions. Direct access to the database itself is a much better use of the consultant's time.
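Columnar warehouses like BigQuery and Snowflake bill largely by data scanned, so most of the savings above came from scanning less. A toy illustration of the arithmetic, with invented column sizes and an assumed flat price per terabyte (not actual billing figures):

```python
# Hypothetical per-column storage sizes for one wide event table.
TB = 1024 ** 4
PRICE_PER_TB = 5.00  # assumed on-demand $/TB scanned, for illustration
column_bytes = {"event": 2 * TB, "user_id": 1 * TB, "payload": 7 * TB}

def scan_cost(columns):
    """Cost of a query that scans the given columns in full,
    as columnar engines only read the columns referenced."""
    return sum(column_bytes[c] for c in columns) / TB * PRICE_PER_TB

select_star = scan_cost(column_bytes)      # SELECT * reads every column
pruned = scan_cost(["event", "user_id"])   # only what the report needs
```

Under these assumed numbers, pruning the unused wide column cuts the per-query cost from $50 to $15; partition filters shrink the scanned bytes the same way along the row axis.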
LAMP Application Integration and ETL Rebuild
By the time I had finished incorporating it into the MS SQL data mart (removing data redundancy as well) and recreating the ETL (of the ~20 data sources, only about 5 were unique), I had reduced the tables to about 80, including views and trigger-populated activity history tables, and cut the run time for 50 million rows from almost 10 hours to just under 3 hours.
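Collapsing redundant sources like those described above is essentially a de-duplication pass in staging. A minimal sketch with invented feed and column names, using SQLite:

```python
import sqlite3

# Two feeds carry the same customer record; only one copy
# should land in the warehouse. Names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (source TEXT, customer_id INTEGER, email TEXT)")
conn.executemany(
    "INSERT INTO staging VALUES (?, ?, ?)",
    [
        ("feed_a", 1, "a@example.com"),
        ("feed_b", 1, "a@example.com"),  # duplicate of feed_a's record
        ("feed_b", 2, "b@example.com"),
    ],
)

# Keep one copy of each distinct record, dropping the source column
# that made the rows look different.
conn.execute("""
    CREATE TABLE customers AS
    SELECT DISTINCT customer_id, email FROM staging
""")
count = conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
```

Three staged rows collapse to two customers; at scale, dropping the redundant feeds entirely (as only ~5 of the ~20 sources were unique) is what shortens the ETL run.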
Skills
Languages
SQL, T-SQL (Transact-SQL), PL/pgSQL, C#, Snowflake, Python, C#.NET
Tools
Microsoft Excel, Slack, Confluence, Lucidchart, SourceTree, Git, Looker, Google Analytics, Visual SourceSafe, Amazon CloudWatch, AWS IAM, Amazon Virtual Private Cloud (VPC), Trello, Visual Studio 2012, GitHub, BigQuery, Jira
Storage
SQL Stored Procedures, SQL Server Integration Services (SSIS), SQL Server Management Studio (SSMS), SQL Functions, MySQL, PostgreSQL, Database Modeling, DBeaver, Microsoft SQL Server, Amazon S3 (AWS S3), PL/SQL, SQL Server Reporting Services (SSRS)
Other
Music, Data Virtuality, Data Marts, Query Optimization
Platforms
macOS, Windows, Amazon Web Services (AWS), AWS Lambda, Unix
Frameworks
.NET
Paradigms
Agile, Kimball Methodology
Education
Bachelor of Science Degree in Applied Mathematics with an emphasis in Statistics (minors in Music, Computer Science)
University of Northern Colorado - Greeley, CO, USA
Participated in an Exchange Program in Computer Science and Mathematics
Stony Brook University, New York—SUNY - Stony Brook, NY, USA
Certifications
Certified Looker Consultant
Looker