Sagar Sharma

Toronto, ON, Canada
Member since July 11, 2016
Sagar is a seasoned data professional with more than seven years of experience with relational databases and three years with big data, specializing in designing and scaling data systems and processes. He is a hardworking individual with a constant desire to learn new things and to make a positive impact on the organization. Sagar possesses excellent communication skills and is a motivated team player with the ability to work independently.
Experience
  • SQL, 10 years
  • Databases, 10 years
  • Business Intelligence (BI), 8 years
  • Python, 7 years
  • SQL Server 2012, 7 years
  • MySQL, 6 years
  • Pentaho Data Integration (Kettle), 6 years
  • Hadoop, 5 years
Location
Toronto, ON, Canada
Availability
Part-time
Preferred Environment
macOS/Linux, Sublime Text, IntelliJ IDEA
The most amazing...
...project was building a data lake with Hadoop—this included a Hadoop ecosystem installation from scratch and building data pipelines to move data into the lake.
Employment
  • Data Engineer
    2017 - PRESENT
    Colorescience
    • Designed, developed, and currently maintain a reporting data warehouse built on PostgreSQL (Amazon RDS).
    • Created an ETL framework using Python and PDI to load data into a data warehouse.
    • Connected to third-party APIs (e.g., Salesforce, Sailthru, and CrowdTwist) to import data incrementally; a minimal sketch of this pattern follows this role.
    • Managed PostgreSQL (RDS) and ETL servers (EC2) using AWS Console.
    • Created reports and dashboards in Looker to provide insights on the data.
    Technologies: Python, Amazon RDS for PostgreSQL, MySQL, Salesforce
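
    Below is a minimal sketch of the incremental extract-and-load pattern described above, assuming a paginated REST endpoint with an updated_after filter; the endpoint, credentials, schema, and table names are illustrative placeholders, not the actual vendor APIs or warehouse objects.

      import datetime

      import psycopg2
      import requests

      API_URL = "https://api.example.com/v1/orders"  # illustrative endpoint, not a real vendor URL
      API_TOKEN = "..."                              # credential placeholder

      def fetch_updated_rows(since):
          """Pull only the records changed since the last successful load (incremental extract)."""
          rows, page = [], 1
          while True:
              resp = requests.get(
                  API_URL,
                  headers={"Authorization": f"Bearer {API_TOKEN}"},
                  params={"updated_after": since.isoformat(), "page": page},
                  timeout=30,
              )
              resp.raise_for_status()
              batch = resp.json().get("results", [])
              if not batch:
                  return rows
              rows.extend(batch)
              page += 1

      def load_rows(rows):
          """Upsert the extracted rows into the reporting warehouse (PostgreSQL on RDS)."""
          with psycopg2.connect("postgresql://etl_user@warehouse.example.com/reporting") as conn:
              with conn.cursor() as cur:
                  for row in rows:
                      cur.execute(
                          """
                          INSERT INTO staging.orders (id, status, updated_at)
                          VALUES (%s, %s, %s)
                          ON CONFLICT (id) DO UPDATE
                              SET status = EXCLUDED.status,
                                  updated_at = EXCLUDED.updated_at
                          """,
                          (row["id"], row["status"], row["updated_at"]),
                      )

      if __name__ == "__main__":
          # In the real framework the watermark would come from an ETL metadata table.
          watermark = datetime.datetime.utcnow() - datetime.timedelta(days=1)
          load_rows(fetch_updated_rows(watermark))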
  • Senior Business Intelligence Engineer
    2016 - 2018
    Altus Group Limited
    • Built a reporting data warehouse using Pentaho, PostgreSQL, and Informatica.
    • Designed a database schema in PostgreSQL to represent the reporting use case.
    • Created ETL tasks in Informatica to move data from the production systems into PostgreSQL.
    • Built reports and dashboards using Pentaho Report Designer and deployed them to the BI server.
    Technologies: PostgreSQL, MS SQL, Pentaho Reporting, Informatica
  • Data Engineer
    2014 - 2016
    Wave Accounting, Inc.
    • Designed, developed, and maintained big data and business intelligence solutions at Wave.
    • Designed and scheduled complex ETL workflows and jobs using Pentaho Data Integration (Kettle) to load data into the data systems.
    • Wrote custom Python scripts to access third-party APIs and download data into the data systems.
    • Developed complex SQL queries, including joins, subqueries, and common table expressions, to address ad hoc business analytics and other requirements; a query sketch follows this role.
    • Coordinated with the product and executive teams to gather and understand business requirements.
    • Built an end-to-end relational data warehouse—including infrastructure, schema design, optimization, and administration.
    • Designed and developed a Hadoop cluster using Hortonworks HDP 2.0; tasks included installing and configuring the Hadoop ecosystem and designing the HDFS layout.
    • Designed and scheduled Sqoop jobs to load data into the HDFS from the production systems.
    Technologies: Python, MS SQL, Pentaho Kettle, Sisense, PostgreSQL, Hadoop, MySQL, Hive, Sqoop, Ansible
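
    Below is a sketch of the kind of ad hoc analytics query mentioned above, using a common table expression and a join; the schema, table, and column names are assumptions for illustration, not Wave's actual warehouse objects.

      import psycopg2

      # Illustrative question: monthly active customers by plan, counting only
      # customers with at least three events in the month.
      MONTHLY_ACTIVE_SQL = """
      WITH monthly_activity AS (
          SELECT
              customer_id,
              date_trunc('month', event_time) AS activity_month,
              COUNT(*)                        AS events
          FROM analytics.events
          GROUP BY customer_id, date_trunc('month', event_time)
      )
      SELECT
          c.plan,
          ma.activity_month,
          COUNT(DISTINCT ma.customer_id) AS active_customers
      FROM monthly_activity ma
      JOIN analytics.customers c ON c.id = ma.customer_id
      WHERE ma.events >= 3
      GROUP BY c.plan, ma.activity_month
      ORDER BY ma.activity_month, c.plan;
      """

      with psycopg2.connect("postgresql://analyst@warehouse.example.com/reporting") as conn:
          with conn.cursor() as cur:
              cur.execute(MONTHLY_ACTIVE_SQL)
              for plan, month, active in cur.fetchall():
                  print(plan, month, active)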
  • Business Intelligence Developer
    2011 - 2014
    Eyereturn Marketing, Inc.
    • Designed real-time reporting solutions using SQL Server (SSIS, SSAS, and SSRS) and Pentaho business intelligence tools (MySQL, Mondrian, and Pentaho).
    • Created custom automated/scheduled reports using Eclipse BIRT and Pentaho Report Designer.
    • Built custom ETL tasks to transform data for custom reports using Kettle (Pentaho Data Integration).
    • Designed and optimized database schemas to make reporting faster and more efficient.
    • Created, maintained, and scheduled custom data processors to pull and manipulate data from HDFS using Pig, Sqoop, and Oozie (Cloudera Hadoop); a sketch of one such processor follows this role.
    Technologies: MS SQL, SSIS, SSAS, Pentaho, Hive, Hadoop, Pig, Sqoop, MySQL
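
    Below is a sketch of a scheduled HDFS data processor of the kind described above: a small Pig script aggregating ad events, launched from Python; the HDFS paths and field layout are assumptions for illustration.

      import subprocess

      # Pig script: count daily clicks per campaign from raw ad events stored in HDFS.
      PIG_SCRIPT = """
      events = LOAD '/data/raw/ad_events' USING PigStorage('\\t')
               AS (event_time:chararray, campaign_id:long, event_type:chararray);
      clicks = FILTER events BY event_type == 'click';
      by_campaign = GROUP clicks BY campaign_id;
      counts = FOREACH by_campaign GENERATE group AS campaign_id, COUNT(clicks) AS click_count;
      STORE counts INTO '/data/reports/daily_clicks' USING PigStorage('\\t');
      """

      with open("daily_clicks.pig", "w") as f:
          f.write(PIG_SCRIPT)

      # In production this would be triggered by an Oozie coordinator rather than run by hand.
      subprocess.run(["pig", "-f", "daily_clicks.pig"], check=True)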
  • Database Analyst
    2010 - 2011
    George Brown College
    • Was responsible for database administration across the organization using Blackbaud’s Raiser’s Edge.
    • Updated and maintained the alumni database using MS SQL Server.
    • Conducted data validation and verification to ensure the accuracy and quality of the data.
    • Wrote complex queries to generate reports and provide information for divisional and marketing purposes.
    • Provided support to the project managers.
    Technologies: MS SQL Server, Raiser's Edge
  • Software Engineer
    2007 - 2009
    Tata Consultancy Services
    • Provided post-implementation support and training for an enterprise-level banking application (TCS B@ncs) to 25,000+ corporate end users.
    • Handled different modules of the banking operations such as routine banking, loans and mortgages, capital markets, and foreign exchange.
    • Analyzed client business needs and translated them into functional/operational requirements.
    • Communicated successfully with a variety of people, including subject matter experts, business units, development teams, and support teams, to establish a technical vision.
    Technologies: Java, HTML, SQL, Oracle
Experience
  • Data Lake Using Hadoop (Hortonworks HDP 2.0) (Development)

    I built a data lake using Hadoop.
    My tasks included:
    - Installing and configuring Hadoop ecosystem components on the Rackspace Cloud Big Data platform.
    - Automating the above process using Ansible.
    - Designing and scheduling Sqoop jobs to import data from MySQL and PostgreSQL production tables into HDFS (a sketch of such a job follows this list).
    - Setting up Hive tables from HDFS files to enable SQL-like querying on HDFS data.
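
    Below is a sketch of one of the incremental Sqoop imports described in this list, assembled and launched from Python; the hosts, credentials, table names, and HDFS paths are placeholders for illustration.

      import subprocess

      # Incremental append import: only rows with id greater than the saved
      # last-value are pulled from the production MySQL table into HDFS.
      SQOOP_IMPORT = [
          "sqoop", "import",
          "--connect", "jdbc:mysql://prod-db.example.com:3306/app",
          "--username", "etl_user",
          "--password-file", "/user/etl/.mysql_password",
          "--table", "orders",
          "--target-dir", "/data/raw/orders",
          "--incremental", "append",
          "--check-column", "id",
          "--last-value", "0",  # in practice, read from the previous run's saved state
          "--num-mappers", "4",
      ]

      subprocess.run(SQOOP_IMPORT, check=True)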

  • Sisense Rebuild Project (Development)

    I built a data warehouse for a global animal health company. The goal was to help business users by giving them a single source of truth about their mobile application.

    The project included the following tasks:
    1) Redesigning the data model in Sisense ElastiCube, using a snowflake schema to establish the database relationships (a schema sketch follows this list).
    2) Creating a build schedule for the new data model.
    3) Creating dashboards based on the client's requirements.
    4) Setting up email schedules to deliver the dashboards to the relevant stakeholders.
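
    Below is a sketch of the snowflake-style relationships behind such a model: the fact table references a dimension, and that dimension references a further normalized dimension. The table and column names are assumptions, and sqlite3 is used only to keep the sketch self-contained; the actual model was defined inside Sisense ElastiCube.

      import sqlite3

      # fact_app_events -> dim_user -> dim_country: each level is normalized out,
      # which is what distinguishes a snowflake schema from a flat star schema.
      DDL = """
      CREATE TABLE dim_country (
          country_id  INTEGER PRIMARY KEY,
          name        TEXT NOT NULL
      );
      CREATE TABLE dim_user (
          user_id     INTEGER PRIMARY KEY,
          country_id  INTEGER REFERENCES dim_country (country_id),
          signup_date TEXT
      );
      CREATE TABLE fact_app_events (
          event_id    INTEGER PRIMARY KEY,
          user_id     INTEGER REFERENCES dim_user (user_id),
          event_type  TEXT,
          event_time  TEXT
      );
      """

      conn = sqlite3.connect(":memory:")
      conn.executescript(DDL)
      tables = [row[0] for row in conn.execute(
          "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name")]
      print("snowflake schema created:", tables)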

Skills
  • Languages
    Python, SQL, CSS, HTML, Transact-SQL, JavaScript, Java, SAS, Scala
  • Tools
    Pentaho Data Integration (Kettle), Looker, Informatica ETL, Pentaho Mondrian OLAP Engine, Sisense, Apache Sqoop
  • Paradigms
    ETL Implementation & Design, ETL, MapReduce
  • Storage
    Databases, SQL Server 2008, SQL Server 2014, SQL Server 2012, MySQL, AWS RDS, SQL Server Analysis Services (SSAS), Apache Hive, PostgreSQL, SQL Server Integration Services (SSIS), HDFS, MongoDB
  • Other
    ETL Tools, Pentaho Reports, ETL Development, Business Intelligence (BI), RESTful APIs, Semantic UI, Data Analysis, Pentaho Dashboard, Informatica
  • Frameworks
    Express.js, Foundation CSS, Materialize CSS, Bootstrap 3+, Flask, Hadoop
  • Libraries/APIs
    Node.js, NumPy, Pandas, REST API, React, DreamFactory, Passport.js
  • Platforms
    AWS EC2, Windows, macOS, Linux, Apache Pig
Education
  • Certificate in SAS Certified Base Programmer
    2011 - 2011
    SAS Institute Canada - Toronto, Canada
  • Postgraduate certificate in Strategic Relationship Marketing
    2010 - 2010
    George Brown College - Toronto, Canada
  • Bachelor's degree in Mechanical Engineering
    2003 - 2007
    YMCA University of Science and Technology - Faridabad, India