Verified Expert in Engineering
Data Quality Developer
Alexander has over 20 years of experience in data warehousing across different environments and roles (maintenance, development, administration, testing, SWOT analysis), using several tools (from hand-coded solutions to advanced ETL tools) and paradigms. He's experienced with historization in data warehouses and has professional experience in banking and insurance.
JasperReports, Pentaho Data Integration (Kettle), Ab Initio, SAS Data Integration (DI) Studio, SAS
The most amazing...
...project was giving business users a self-service, regulatory-compliant, and reliable process for adjusting data in several data marts, freeing up IT resources.
Bausparkasse Schwäbisch Hall
- Developed and maintained risk data marts and added new data sources.
- Managed and monitored the load process of test systems.
- Replaced manual data sources/processes with ETL processes.
- Provided consulting on BCBS 239 and data quality. Developed a data dictionary and data lineage tool for a legacy SAS data warehouse.
- Conducted a preliminary study on choosing a data quality tool.
- Consulted on a legacy data warehouse addressing BCBS 239.
Developer and Consultant
Provinzial NordWest, D-Münster
- Selected and de-identified test data for all non-production environments. Built a framework for the metadata-driven selection and de-identification of production data, covering more than 1,000 sources from different mainframe storage systems.
- Stabilized processes, integrated source data, migrated data and batch programs to SAS 9.3, and optimized run processing to support the Business Intelligence Center.
- Consulted on the architecture of a Solvency II (SII) data mart and implemented ETL programs for the Solvency II project.
- Implemented generic auditing and handled acceptance of the application for adjusting data in data marts, using Fast Edit as the data input tool.
- Designed and implemented an application for calculating bonuses for field service. Used SAS EG for calculation and JasperReports for reporting.
Developer and Consultant
Bayern Card Services, München
- Developed form letters with JasperReports, to be called from an existing application.
- Reviewed and evaluated options and best practices for using Jasper (e.g., externalizing standard literals to lower maintenance effort).
- Taught internal staff to use Jaspersoft Studio to maintain existing form letters and build new ones.
Informationsfabrik GmbH, D-Münster
- Established a test process in a data warehouse project for a bank.
- Built a framework for automated tests of data with the option to enhance this as a data quality framework.
- Consulted with developers to write automated and manual tests.
- Integrated automated tests into the loading process using Azure Data Factory pipelines.
Business Analyst and ETL Developer
Bausparkasse Schwäbisch Hall (Building and Loan Association), D-Schwäbisch Hall
- Developed and implemented data quality rules with the data owners.
- Improved communication between IT and business concerning the DQ framework.
- Helped to stabilize the DQ framework and made it more flexible.
W&W (Building and Loan Association and Insurance), D-Ludwigsburg
- Reviewed and refined the system architecture for a marketing data mart.
- Created performance tests of DB2 BLU specifics and a PoC for the performance of a Data Vault data model.
- Implemented a performant component for historization.
BLB (Federal State Bank), D-München
- Migrated a legacy program from SAS Base to SAS DI. Reengineered and documented the logic. Executed parallel tests.
- Reengineered a given generic ruleset and adapted it for use in new subject areas.
- Implemented the enhanced generic ruleset.
Vattenfall (Power Supplier), D-Hamburg
- Provided consulting and support during the implementation of data marts for risk management concerning design, tool usage, administration, and run.
- Implemented historization using the SCD transformation and a customer-specific historization transformation.
- Supported business units in testing the data mart.
Gmünder Ersatzkasse (Health Insurance), D-Schwäbisch Gmünd
- Built a data mart for marketing data, with a focus on ETL development.
- Completed data modeling (star schema) of the dimensional data mart.
- Optimized the performance of Pentaho DI jobs and transformations.
Provinzial NordWest, D-Münster (Insurance)
- Extended and standardized the existing DWH systems.
- Helped design and set up processes to build a new DWH system (architecture, historization, adjustment, etc.).
- Built a historization component with up to three historization dimensions, depending on the business area, focusing on performance optimization between SAS and DB2.
LBBW (Federal State Bank), D-Stuttgart
- Consulted, designed, and implemented a data mart for centralized regulatory reporting (AWV).
- Wrote IT concepts for the integration of source systems and the transformation of data.
Versorgungswerk der Architekten, D-Stuttgart
- Migrated an existing reporting system (Cobol, Prescribe) to JasperReports.
- Led a feasibility study to check whether the migration could be done within the given budget.
- Migrated existing reports to JasperReports and built a generic Java wrapper for batch reporting.
LBBW (Federal State Bank), D-Stuttgart
- Designed and developed ETL processes in the credit risk data mart (GLLP, IFRS reporting).
- Implemented a generic process for self-service data adjustment in the data mart.
- Designed and implemented a dynamic job scheduling driven by delivery of extracts from source systems.
Fast Control - Tool for Automatic Data Testing (http://www.ara-tec.com/fast-control)
A tool for describing data and test cases and running them automatically. The first implementation was SAS-based (SAS EG) and used Microsoft Excel for data entry; the second implementation used Java, Tomcat, and SQL, with Fast Edit as the front end.
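As an illustration, here is a minimal sketch of the kind of automated data test such a framework executes. The macro interface and all table and column names are illustrative assumptions, not the actual Fast Control code:

/* Run one data test: the check query returns the offending rows,
   so an empty result means the test passes. */
%macro run_data_test(test_id=, check_sql=);
  proc sql noprint;
    create table work.failures as &check_sql.;
  quit;
  %if &sqlobs. = 0 %then %put NOTE: Test &test_id. passed.;
  %else %put WARNING: Test &test_id. failed - &sqlobs. offending row(s).;
%mend run_data_test;

/* Example test: every contract must reference an existing customer. */
%run_data_test(test_id=T001,
  check_sql=%str(select c.contract_id
                   from mart.contract c
                   left join mart.customer k
                     on c.customer_id = k.customer_id
                  where k.customer_id is null));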
Adjustment of Data
Uses Apparo Fast Edit for data entry and a generic SAS macro for the implementation.
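A minimal sketch of what such a generic adjustment macro could look like, assuming the Fast Edit table stores one correction per row (target table, key column and value, column to change, new value, all as character); these names are illustrative assumptions, not the production implementation:

data _null_;
  /* Assumed Fast Edit output: one correction per row with the target
     table, key column/value, column to change, and new value. */
  set adj.adjustments;
  /* Generate and run one UPDATE statement per adjustment row. */
  call execute(catx(' ',
    'proc sql; update', target_table,
    'set', col_name, '=', quote(strip(new_value)),
    'where', key_col, '=', quote(strip(key_value)),
    '; quit;'));
run;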
Historization, Write, and Extract Module for the Staging Layer of a DWH
Find a way to store daily delta data in the staging layer of a data warehouse and historize it.
- Historization has to be implemented in several variants, depending on the source system (no historization or 1 to 3 dimensions).
- Each source system consists of between 1 and 20 tables/entities.
- DB2 is used as the storage layer; ETL is done with SAS Foundation.
- Build a metadata-driven, generic historization and storage macro in SAS that handles the following (a minimal sketch follows this list):
  - Historization (or no historization) in a central historization table per source system.
  - Optimization of historization and reads/writes between SAS and DB2 using push-down techniques (DB2 temporary tables, push-down SQL).
  - Writing delta rows into DB2 using the DB2 LOAD facility.
- Build a metadata-driven, generic extraction macro in SAS that handles:
  - Extracting all tables of a source system from the storage layer for a given time interval.
  - Adding historization information.
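To make the historization step concrete, here is a minimal sketch of one SCD2-style pass pushed down to DB2 via SAS/ACCESS explicit pass-through. The schema, table, and column names (stg.delta_contract, dwh.hist_contract, contract_id, attr1) and the connection details are illustrative assumptions; the actual module derives them from metadata:

proc sql;
  connect to db2 (database=dwhdb);   /* connection details assumed */

  /* Close the currently valid version of every key present in the delta. */
  execute (
    update dwh.hist_contract h
       set valid_to = current date - 1 day
     where h.valid_to = date('9999-12-31')
       and exists (select 1
                     from stg.delta_contract d
                    where d.contract_id = h.contract_id)
  ) by db2;

  /* Insert the delta rows as the new current versions. */
  execute (
    insert into dwh.hist_contract (contract_id, attr1, valid_from, valid_to)
    select contract_id, attr1, current date, date('9999-12-31')
      from stg.delta_contract
  ) by db2;

  disconnect from db2;
quit;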
SAS, SQL, Java, T-SQL (Transact-SQL)
ETL, Test Automation, Azure DevOps
Data Warehouse Design, Data Warehousing, Data Modeling, COBOL Batch, Data Quality, SAS Macros, Base SAS, Pipelines, Azure Data Factory, JointJS, Ab Initio Graphical Development Environment (GDE)
Pentaho Data Integration (Kettle), Microsoft Power BI, SAS Data Integration (DI) Studio, Sybase PowerDesigner, SAS Business Intelligence (BI), SAS Enterprise Guide, Erwin, Ab Initio, Automic (UC4)
IBM Db2, Oracle RDBMS, Apache Hive, Microsoft SQL Server, Elasticsearch
Oracle Database, Linux, Oracle, AIX, Pentaho, z/OS, Azure
Master's Degree in Business Administration
Fachhochschule für Wirtschaft - Pforzheim, Germany
Professional Scrum Master I
SAS Certified Data Integration Developer for SAS9
ISTQB Certified Tester Foundation Level