Holly Grodsky

Query Optimization Developer in Cambridge, MA, United States

Member since June 18, 2020
Holly's career began with a bachelor's degree in applied statistics with minors in computer science and music. Since graduating, she has pursued database and business intelligence development. She continues to grow both intellectually and socially by running a few home businesses and organizing and participating in tech and other meetup groups in her area.

Preferred Environment

Git, Confluence, PostgreSQL, SQL Server Management Studio, MacOS, Windows

The most amazing...

...project I've worked on was the rebranding of a startup. I built the data mart and had the chance to train the data scientists and analysts on the new environment.


Experience

  • Freelance Database and BI Consultant

    2017 - PRESENT
    Clients (via Toptal)
    • Worked with clients on database and BI reporting needs.
    Technologies: PostgreSQL, Looker
  • Database and BI Consultant

    2017 - PRESENT
    Das42 Data Consulting Firm
    • Worked for various clients, including Snap, Kohler, Glamsquad, Craftsy, and others that cannot be listed.
    • Developed full Looker models from scratch including multiple dashboards, looks, and explores.
    • Cleaned up a Looker instance with 400+ views, ten models, and 400+ explores. Customized the user/group management for more precise permission approval.
    • Analyzed queries to help some companies cut over $100,000 in monthly BigQuery and Snowflake costs.
    • Wrote scripts to automate data ingestion from sources such as CSV files and other database streams.
    Technologies: Amazon Web Services (AWS), Python, Amazon CloudWatch, Amazon Virtual Private Cloud (VPC), AWS Lambda, AWS IAM, Data Virtuality, MySQL, PostgreSQL, Snowflake, BigQuery, Looker
  • SQL Data Warehouse Developer

    2016 - 2016
    Craftsy (via ProKarma)
    • Utilized Data Virtuality ETL to consolidate multiple data sources, including Amazon Redshift, NetSuite, PostgreSQL, Google Analytics (GA), Facebook, and CSV files.
    • Determined how to parse JSON for ingestion of GA data.
    • Learned Data Virtuality's proprietary SQL dialect, which is modeled on PostgreSQL.
    • Became the resident expert of two new tools (Data Virtuality and DBeaver).
    • Worked closely with Data Virtuality support while the new tool continued to grow.
    • Built, improved, and maintained the Virtual Data Warehouse following the Kimball method, using dimension and fact tables in a star schema.
    • Prepared the warehouse prior to the launch of the new 2.0 website, then configured and updated it as new development occurred.
    • Assisted in the logic construction for the finance and data analytics teams through the Looker BI Tool.
    • Wrote documentation for the end-user and for the ease of knowledge transfer using a Confluence Wiki.
    • Educated power users on the new data warehouse structure and on accessing data through Looker and DBeaver.
    Technologies: BigQuery, Google Analytics, DBeaver, Data Virtuality, PostgreSQL
  • Senior Business Intelligence Developer

    2013 - 2016
    Level 3 Communications
    • Implemented query performance tuning (T-SQL, some PL/SQL and MySQL) and data cleansing/normalization.
    • Correlated the data from 20+ data warehouses and inventory systems into a single database.
    • Scheduled SSIS packages for ETL from Oracle, Access, MySQL, and flat file sources.
    • Worked end-to-end on a ticket-tracking web application—planning, designing, development, and implementation.
    • Built multi-level tracking for projects, requests, bugs, and ad hoc tasks.
    • Completed the build of an MS SQL relational database with keys and indexes; utilized functions and stored procedures for automated user-group administration from LDAP.
    • Created SSRS reports for C-level reporting and for aggregating data for end-user consumption.
    • Assisted with the development of an in-house web application dashboard and workflow.
    • Learned MVC with C#, HTML5, Razor, Bootstrap, JavaScript, CSS, lambda expressions, and the Entity Framework.
    • Implemented the integration of a pre-existing in-house web application.
    • Built a secondary MS SQL database with keys and indexes.
    • Improved and stabilized a Python web application.
    • Converted a MySQL database into an MS SQL database with use of triggers, stored procedures, views, and user-defined functions.
    • Converted Python ETL to scheduled SSIS packages.
    • Regularly supported active users and produced additional functionality as needed.
    Technologies: Git, .NET, C#, MySQL, Python, SSRS, SQL Server Integration Services (SSIS), PL/SQL, Microsoft SQL Server
  • SQL Software Engineer

    2013 - 2013
    Markit on Demand
    • Carried out the post-production support of major financial websites.
    • Investigated sources of issues (reported data, SQL code/logic, or web code).
    • Fixed the logic in T-SQL or escalated web/data issues to the proper teams.
    • Monitored and occasionally debugged hundreds of automated JavaScript tasks.
    • Utilized internally-built automation tools for testing.
    Technologies: SQL Server Management Studio, T-SQL, Visual SourceSafe
  • Application Support Engineer

    2010 - 2013
    Urban Lending Solutions
    • Worked on T-SQL development in an Agile team.
    • Introduced the organization to simple SSIS packages.
    • Created team standards and automated T-SQL queries and stored procedures.
    • Worked with C#/.NET—incorporated Microsoft Excel libraries, LINQ, and AutoHotkeys.
    • Wrote procedural documentation, requirement documents, UI mockups, and flow charts; provided application and production support.
    • Interacted with customers and routed requests.
    • Trained all the new app support engineers.
    Technologies: Jira, SSRS, SQL Server Integration Services (SSIS), SQL Server Management Studio, T-SQL


Projects

  • Access Assurance Web Tool

    Running this group project was my largest accomplishment so far. I managed a team of three developers (myself included) to design and create a planning and budgeting tool for our clients.

    Project Roles
    · Project Manager (myself)
    · Database Developer (myself)
    · Report Developer (developer two)
    · Web Developer (developer three)

    We had one main web developer whose job was to combine all of our work and integrate it into our existing tools. The other developer split roughly 80/20 between report and web development.

    The project took us about three months to complete from inception through planning, development, QA, and release.

    Our end-user base consisted of about 40 users, spanning data tracking and report consumption.

    What I learned from this project was:
    · Requirements gathering and weekly client-update meetings are important.
    · Clients tend to change requirements as you go, especially if beforehand they didn't know exactly what they wanted.
    · Test-driven development is an important skill.
    · Make the project scalable, because sooner or later, you will need to add in something new.

  • Data Team Project and Time-tracking Tool

    I managed and developed on a team project building a ticket-tracking tool for internal requests including projects, ad hoc database queries, bug fixes, and other uses of time.

    There were three developers in total. We discussed having a better way to track our work, and we started storing the numbers in a preliminary database. From there, the web developer would build the tool to fit into our existing internal C#/.NET web application. I wrote the initial SQL scripts for the team to experiment with and worked with our manager to determine what he needed to report on. A third developer and I also developed the SSRS reports.

    I broke the work into tasks with time frames and assigned each to the best-qualified team member. We met biweekly for progress updates and brainstorming sessions. In the end, the tool was used every day to track our work and projects. Our manager also used it for future project planning and for monthly C-level reporting.

    What I learned from this project was:
    · Every developer has their own way to track their work.
    · Biweekly stand-ups are not enough; daily is probably better.
    · Give the work to the people best suited to the task.

  • Craftsy Data Mart Build-out

    I was contracted for 3.5 months to work alongside a major company rebranding and website rebuild.

    My role was to work with the data scientists and analysts on presenting the data in a Kimball star schema (with fact and dimension modeling) in the new data mart. We used PostgreSQL and a tool called Data Virtuality to provide real-time views into our Google Analytics data and live databases.

    After creating the new data mart, I then trained users on how to use it and helped analysts in building reports in Looker, a BI tool.
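    As a sketch of that fact/dimension split: a Kimball-style star schema keeps additive measures in a narrow fact table with keys out to descriptive dimension tables, so analysts can slice facts by any dimension attribute with a simple join. The miniature example below uses Python's sqlite3 for portability; all table and column names are illustrative, not the actual Craftsy schema.

```python
import sqlite3

# A miniature Kimball-style star schema: one fact table with foreign keys
# out to dimension tables. All names here are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, region TEXT);
CREATE TABLE dim_product  (product_key  INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales (
    sale_id      INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    product_key  INTEGER REFERENCES dim_product(product_key),
    sale_date    TEXT,
    amount       REAL
);
INSERT INTO dim_customer VALUES (1, 'Ada', 'East'), (2, 'Grace', 'West');
INSERT INTO dim_product  VALUES (1, 'Knitting Kit', 'Crafts'), (2, 'Class Pass', 'Education');
INSERT INTO fact_sales   VALUES
    (1, 1, 1, '2016-05-01', 40.0),
    (2, 2, 1, '2016-05-02', 40.0),
    (3, 2, 2, '2016-05-03', 120.0);
""")

# Analysts slice the facts by any dimension attribute with a star join.
rows = conn.execute("""
    SELECT p.category, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.category
    ORDER BY p.category
""").fetchall()
print(rows)  # [('Crafts', 80.0), ('Education', 120.0)]
```

    BI tools like Looker map naturally onto this shape: each explore joins one fact table to its dimensions.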

  • Snapchat Expense Reduction

    I took a deep dive into the client's Looker and BigQuery data to analyze spend and usage, reviewed best practices, and consulted with users on maintenance and ongoing cost reduction.

    I helped cut costs by over $150,000 in queries alone and showed the client how to keep costs down. I also provided best-practices documentation and guidance on future training.

    Things I learned from this project:
    · Looker's admin explores are quite robust.
    · I learned how to utilize BigQuery.
    · If your only access to the database is through a reporting tool (such as Tableau, MS Report Builder, or Looker), you can get what you need, if it's available at all, only through roundabout solutions. Direct access to the database itself is a much better use of the consultant's time.
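    For context on where such savings come from: BigQuery's on-demand model bills per byte scanned, so selecting fewer columns and pruning partitions shrinks the bill multiplicatively. A back-of-envelope sketch (the per-TiB rate, table size, and query volume below are hypothetical, not the client's actual numbers):

```python
# Back-of-envelope for BigQuery on-demand cost, which is billed per byte
# scanned. The rate, table size, and query counts below are hypothetical.
PRICE_PER_TB = 5.00  # illustrative on-demand rate, USD per TiB scanned

def monthly_cost(tb_scanned_per_query, queries_per_month, price=PRICE_PER_TB):
    """Cost of a query pattern at a flat per-TiB rate."""
    return tb_scanned_per_query * queries_per_month * price

# A dashboard doing SELECT * over a 2 TiB table, refreshed 500x/month:
naive = monthly_cost(2.0, 500)

# Selecting 3 of 30 columns and pruning to 1 of 10 date partitions
# scans roughly (3/30) * (1/10) of the bytes:
tuned = monthly_cost(2.0 * (3 / 30) * (1 / 10), 500)

print(f"${naive:,.0f} -> ${tuned:,.0f}")  # $5,000 -> $50
```

    The same arithmetic explains why the quickest wins in a cost audit are usually replacing SELECT * in dashboards and adding partition filters.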

  • LAMP Application Integration and ETL Rebuild

    I rebuilt the database and ETL process, moving from a Python and MySQL stack to SSIS and MS SQL. The original application had over 450 tables, including unnecessary staging tables.

    By the time I had finished incorporating it into the MS SQL data mart (removing data redundancy along the way) and recreating the ETL (of the ~20 data sources, only about 5 were unique), I had reduced the tables to about 80, including views and trigger-populated activity-history tables, and cut the run time for 50 million rows from almost 10 hours to just under 3 hours.
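    A trigger-populated activity-history table like the ones mentioned above can be sketched as follows. Python's sqlite3 stands in for MS SQL Server here, and the table and column names are illustrative; the pattern is the same in T-SQL.

```python
import sqlite3

# Minimal sketch of a trigger-populated activity-history table.
# sqlite3 stands in for MS SQL Server; all names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE ticket (
    ticket_id INTEGER PRIMARY KEY,
    status    TEXT NOT NULL
);
CREATE TABLE ticket_history (
    history_id INTEGER PRIMARY KEY AUTOINCREMENT,
    ticket_id  INTEGER,
    old_status TEXT,
    new_status TEXT,
    changed_at TEXT DEFAULT CURRENT_TIMESTAMP
);
-- The application never writes history rows directly; the trigger does.
CREATE TRIGGER trg_ticket_status_history
AFTER UPDATE OF status ON ticket
BEGIN
    INSERT INTO ticket_history (ticket_id, old_status, new_status)
    VALUES (OLD.ticket_id, OLD.status, NEW.status);
END;
""")

conn.execute("INSERT INTO ticket VALUES (1, 'open')")
conn.execute("UPDATE ticket SET status = 'in_progress' WHERE ticket_id = 1")
conn.execute("UPDATE ticket SET status = 'closed' WHERE ticket_id = 1")

history = conn.execute(
    "SELECT ticket_id, old_status, new_status FROM ticket_history ORDER BY history_id"
).fetchall()
print(history)  # [(1, 'open', 'in_progress'), (1, 'in_progress', 'closed')]
```

    Because the history writes live in the database rather than the ETL code, every update path (application, ad hoc fix, bulk load) is captured automatically.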


Skills

  • Languages

    SQL, T-SQL, PL/pgSQL, C#, Snowflake, Python
  • Tools

    Microsoft Excel, Slack, Confluence, Lucidchart, SourceTree, Git, Looker, Google Analytics, Visual SourceSafe, Amazon CloudWatch, AWS IAM, Amazon Virtual Private Cloud (VPC), Trello, SSRS, Visual Studio 2012, GitHub, BigQuery, Jira
  • Storage

    SQL Stored Procedures, SQL Server Integration Services (SSIS), SQL Server Management Studio, SQL Functions, MySQL, PostgreSQL, Database Modeling, DBeaver, Microsoft SQL Server, Amazon S3 (AWS S3), PL/SQL
  • Other

    Music, Data Virtuality, Data Marts, Query Optimization
  • Platforms

    MacOS, Windows, Amazon Web Services (AWS), AWS Lambda, Unix
  • Frameworks

    .NET
  • Paradigms

    Agile, Kimball Methodology


Education

  • Bachelor of Science Degree in Applied Mathematics with an emphasis in Statistics (minors in Music, Computer Science)
    2006 - 2010
    University of Northern Colorado - Greeley, CO, USA
  • Participated in an Exchange Program in Computer Science and Mathematics
    2008 - 2009
    Stony Brook University (SUNY) - Stony Brook, NY, USA


Certifications

  • Certified Looker Consultant
