Bikas Anand, Developer in Jersey City, NJ, United States

Bikas Anand

Verified Expert in Engineering

Data Engineering Developer

Location
Jersey City, NJ, United States
Toptal Member Since
June 18, 2020

Bikas has over 20 years of data engineering, data analysis, and architecture experience. He has worked on several large data warehouse (DW) and analytics implementations, both on-premises and in the cloud, for GE, Union Bank, NBC, First Hawaiian Bank, and Bayer, covering new implementations and modernizations while supporting legacy systems. He is proficient in SQL; databases such as Snowflake, Redshift, Postgres, and Teradata; data engineering tools such as Informatica, Matillion, and dbt; and scripting languages such as Python.

Portfolio

SIG - Data Technologies Team
Oracle, Oracle PL/SQL, PySpark, Apache Hive
Kalderos Inc
Snowflake, ETL, Data Engineering, Python, Data Build Tool (dbt), Fivetran...
Pfizer - Pfizer Cloud
Data Warehouse Design, ETL, Informatica, Data Warehousing, SQL, Snowflake

Experience

Availability

Full-time

Preferred Environment

Databases, Data Engineering, Data Analysis, SQL

The most amazing...

...work I've done was the Sochi Olympics reporting for NBC Universal and fleet data management for GE Aviation.

Work Experience

Oracle Developer

2023 - 2023
SIG - Data Technologies Team
  • Developed and delivered several integrations, loading various source files into Oracle using PL/SQL and moving data from Oracle to Hive using PySpark, so that traders could consume the data and make decisions.
  • Created a POC to use the Airflow scheduler instead of the existing Tidal job scheduler.
  • Developed a generic, parameter-driven PySpark script to export data from Oracle to Hive; a sketch of this pattern follows below.
Technologies: Oracle, Oracle PL/SQL, PySpark, Apache Hive
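
A minimal sketch of the parameter-driven Oracle-to-Hive export pattern mentioned above, assuming a PySpark job submitted via spark-submit; the JDBC URL, credentials, and table names are placeholders, not details of the actual SIG job.

    # Hypothetical parameter-driven export of an Oracle table to Hive with PySpark.
    # All argument values are placeholders supplied at submit time.
    import argparse

    from pyspark.sql import SparkSession


    def main() -> None:
        parser = argparse.ArgumentParser(description="Export an Oracle table to a Hive table.")
        parser.add_argument("--jdbc-url", required=True,
                            help="e.g. jdbc:oracle:thin:@//host:1521/SERVICE (placeholder)")
        parser.add_argument("--oracle-table", required=True, help="Source Oracle table")
        parser.add_argument("--hive-table", required=True, help="Target Hive table, e.g. db.table")
        parser.add_argument("--user", required=True)
        parser.add_argument("--password", required=True)
        args = parser.parse_args()

        spark = (
            SparkSession.builder
            .appName("oracle_to_hive_export")
            .enableHiveSupport()  # writes go through the Hive metastore
            .getOrCreate()
        )

        # Read the source table over JDBC; the Oracle driver jar must be on the classpath.
        df = (
            spark.read.format("jdbc")
            .option("url", args.jdbc_url)
            .option("dbtable", args.oracle_table)
            .option("user", args.user)
            .option("password", args.password)
            .option("driver", "oracle.jdbc.OracleDriver")
            .load()
        )

        # Overwrite the target Hive table with the extracted data.
        df.write.mode("overwrite").saveAsTable(args.hive_table)
        spark.stop()


    if __name__ == "__main__":
        main()

In practice, a script like this would be invoked by the scheduler (Tidal or, per the POC, Airflow) once per table, with a different set of parameters each time.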

Data Engineer

2022 - 2022
Kalderos Inc
  • Built new integrations using Data Build Tool (dbt) and Snowflake for reporting.
  • Made several optimizations to the as-is process and improved data accuracy by implementing data quality checks (illustrated in the sketch below).
  • Designed a new model for their customer reporting.
Technologies: Snowflake, ETL, Data Engineering, Python, Data Build Tool (dbt), Fivetran, ETL Implementation & Design
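
As an illustration of the data quality checks mentioned above: the project implemented them with dbt on Snowflake, but the sketch below expresses the same idea as a standalone Python script using the Snowflake connector. The connection parameters and the REPORTING.CLAIMS table and its columns are hypothetical.

    # Illustrative data quality checks against Snowflake; the table, columns, and
    # connection details are hypothetical placeholders.
    import snowflake.connector

    CHECKS = {
        "claim_id_not_null":
            "SELECT COUNT(*) FROM REPORTING.CLAIMS WHERE CLAIM_ID IS NULL",
        "claim_id_unique": """
            SELECT COUNT(*) FROM (
                SELECT CLAIM_ID FROM REPORTING.CLAIMS
                GROUP BY CLAIM_ID HAVING COUNT(*) > 1
            ) AS dupes
        """,
    }


    def run_checks(conn) -> dict:
        """Return the number of offending rows for each named check."""
        results = {}
        cur = conn.cursor()
        try:
            for name, sql in CHECKS.items():
                cur.execute(sql)
                results[name] = cur.fetchone()[0]
        finally:
            cur.close()
        return results


    if __name__ == "__main__":
        # Placeholder credentials; a real job would read these from a secrets store.
        conn = snowflake.connector.connect(
            account="my_account",
            user="my_user",
            password="...",
            database="ANALYTICS",
        )
        failures = {name: n for name, n in run_checks(conn).items() if n > 0}
        print(failures or "All data quality checks passed.")

In the dbt implementation, checks like these would typically be declared as the built-in not_null and unique tests in the model's schema file.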

Data Modeler

2019 - 2021
Pfizer - Pfizer Cloud
  • Worked on the redesign of Pfizer's existing data warehouse on the AWS cloud.
  • Helped the data engineering team during new integration development.
  • Designed and developed the data model and integrations for Pfizer's data quality (DQ) reporting solution.
Technologies: Data Warehouse Design, ETL, Informatica, Data Warehousing, SQL, Snowflake

Data Warehouse Technology Architect

2017 - 2019
Bayer
  • Led the team for impact analysis and business coordination. Oversaw the ABBI support team for technical issues and enhancements, working with Bayer stakeholders and maintaining an excellent relationship with key members of the business as well as IT.
  • Worked closely with Bayer's project architecture team, helping define the data flow and overall architecture for their data warehouse modernization.
  • Implemented data discovery by working with the business and other stakeholders before implementing the data warehouse on a modernized platform.
  • Played a vital role in the modernization program's as-is and to-be data analysis.
Technologies: Python, Trifacta, Talend, Apache Impala, Apache Hive, HDFS, Big Data, Cloudera, Redshift, Amazon S3 (AWS S3), Data Warehousing, Data Warehouse Design, Data Engineering, Google BigQuery, Business Intelligence (BI), Data Modeling, SQL, Snowflake, Data Build Tool (dbt), ETL

Enterprise Data Warehouse Architect

2015 - 2017
First Hawaiian Bank
  • Defined initial roadmap and architecture process flow.
  • Designed and developed credit analytics for the customer-facing 360 view for the bank.
  • Designed and developed customer portfolio reporting.
  • Defined ETL strategy and planning for different reporting requirements for the bank.
  • Led the enterprise data warehouse (EDW) team to implement solutions and mentored the team members.
Technologies: AIX, Oracle, Informatica, Data Warehousing, Data Warehouse Design, Data Engineering, Data Modeling, SQL, ETL

Business Intelligence and Analytics Architect

2013 - 2015
NBC Universal
  • Led the team to integrate different systems into a big data (Pivotal HD) environment and sourced data from HDFS to the data warehouse for business intelligence (BI) reporting.
  • Led the Sochi Olympics reporting implementation and support.
  • Integrated new source data from Operative One (sales system), DFP Premium (traffic system for display ads), and Freewheel (traffic system for video ads) for single-view reporting.
Technologies: Tableau, Big Data, Apache Hive, Hadoop, Informatica, Teradata, Data Warehousing, Data Warehouse Design, Data Engineering, Business Intelligence (BI), Data Modeling, SQL, ETL

ETL Architect

2011 - 2013
Union Bank
  • Analyzed requirements and performed impact analysis.
  • Designed databases and developed new integrations to support business requirements.
  • Reported project status weekly to the Union Bank leadership team.
Technologies: Informatica, Oracle Business Intelligence Enterprise Edition 11g (OBIEE), Oracle9i, Data Warehousing, Data Warehouse Design, Data Engineering, Data Modeling, SQL, ETL

EDW Architect

2010 - 2011
GE Healthcare
  • Built business user interactions.
  • Conducted requirements and impact analyses.
  • Designed databases and ETL.
  • Documented processes and reviewed code.
Technologies: SAP BusinessObjects (BO), Cognos ReportNet, Teradata, Solaris, Oracle Business Intelligence Enterprise Edition 11g (OBIEE), Informatica, Data Warehousing, Data Warehouse Design, Data Engineering, Business Intelligence (BI), Data Modeling, SQL, ETL

Technical Project Manager

2009 - 2010
GE Aviation
  • Led the implementation of the Proactive Fleet Management project.
Technologies: Cognos ReportNet, Teradata, Solaris, Informatica, Data Warehousing, Data Warehouse Design, Data Engineering, Business Intelligence (BI), Data Modeling, SQL, ETL

ETL Lead

2004 - 2009
GE Healthcare
  • Designed and led the development for a service invoicing project to send electronic statements to customers.
  • Designed and developed Deal Analyzer, a project that provided the ability to retrieve historical quotes and customer history and allowed users to analyze and decide on quotes based on historical deal data, margin analysis, and other functions.
Technologies: Informatica, Teradata, Oracle, Solaris, Data Warehousing, Data Warehouse Design, Data Engineering, Business Intelligence (BI), Data Modeling, SQL, ETL

Bayer - Data Warehouse Modernization

Bayer Pharmaceuticals' data warehouse modernization program, in which the existing on-premises data warehouse and reporting solution is being migrated to the AWS cloud and Cloudera big data infrastructure for a shorter time to market. The modernized data warehouse will also support existing BI and operational reporting as well as provide better data science and analytical capabilities.

First Hawaiian Bank - EDW

System integration with an Oracle data warehouse and reporting solutions, implemented to automate management and regulatory reporting, including CCAR, IHC, and call reports.

NBC Universal Advertising Sales and Analytics Technology

Analytics Technology: Integrated different systems into the big data (Pivotal HD) environment and sourced data from HDFS to the data warehouse for BI reporting, meeting the growing need to extract value from data assets.

Sochi Olympics: Implemented event reporting that generated daily reports and sent them to key business stakeholders and ad planners.

BI Ad Sales Reporting: Responsible for design and development of data integration for sales and traffic systems.

Union Bank - Finance Reporting and CCAR Projects

Implemented a centralized, automated financial reporting platform for the regulatory, S.E.C., BTMU, and D.E.C. reporting groups. The platform improved accuracy, processing time, and auditability; reduced risk by improving reporting controls, quality, and transparency; and effectively enabled future-state reporting under IFRS and the integration of acquisitions.

GE Healthcare - BI COE

Architecture for data integration and reporting for GE Healthcare's business intelligence solution, one of the world's largest data warehouses running on Teradata. It supports more than 175 source systems, including CRM, ERP, and legacy systems.

GE Aviation - Proactive Fleet Management

Proactive Fleet Management was developed to give insight into customer fleet maintenance, events, and utilization for GE Aviation's fleet data management.

These reports combined a summary report showing an event count for each event type with drill-downs per event type that produced different charts, including trend lines for the customer, comparison group, and fleet; Pareto charts for component event drivers; reliability growth charts for forecasting; and trend lines for reliability growth.

GE Healthcare - BI Reporting

Service Statement Invoicing: Service statements for customers were generated monthly and included invoices for contracts, hourly billed service, OSM, and back bills and credits, both at the line-item level and the individual invoice detail level.

Deal Analyzer: The Deal Analyzer provided facilities that allowed the pricing team to compare and analyze a deal based on margin, historical data, and other aspects of a given portfolio.

C360: Consolidated customer data comprehensively to provide a more holistic knowledge and understanding of customers, and supplied the data needed to help GEHC improve its customer relationships and make smarter, more customer-focused business decisions.

Languages

SQL, Snowflake, Python

Tools

Tableau, Amazon Elastic MapReduce (EMR), Cloudera, Spark SQL, Matillion ETL for Redshift, Apache Airflow, Amazon QuickSight, CVS, Oracle Business Intelligence Enterprise Edition 11g (OBIEE), Cognos ReportNet, Apache Impala

Paradigms

ETL, Business Intelligence (BI), Agile, ETL Implementation & Design

Platforms

Oracle, Amazon EC2, Talend, AWS Lambda, Amazon Web Services (AWS), MacOS, Unix, AIX, Solaris, Linux

Storage

Teradata, Redshift, Amazon S3 (AWS S3), Databases, Apache Hive, HDFS, PostgreSQL, Oracle9i, Oracle PL/SQL

Other

Data Warehousing, Data Architecture, Informatica, Data Warehouse Design, Data Engineering, Data Modeling, ETL Tools, Data Analysis, Google BigQuery, Data Build Tool (dbt), Big Data, SAP BusinessObjects (BO), Trifacta, Fivetran

Frameworks

Apache Spark, Hadoop

Libraries/APIs

Spark Streaming, PySpark

Education

1998 - 2002

Bachelor's Degree in Computer Engineering

Vinoba Bhave University - Hazaribagh, India

Certifications

November 2019 - November 2022

AWS Certified Solutions Architect Associate

AWS
