Alexander Hauskrecht, Developer in Stuttgart, Baden-Württemberg, Germany

Alexander Hauskrecht

Verified Expert in Engineering

Data Quality Developer

Stuttgart, Baden-Württemberg, Germany
Toptal Member Since
March 4, 2020

Alexander has over 20 years of experience in data warehousing across different environments and roles (maintenance, development, administration, testing, SWOT analysis), using a range of tools (from hand-coded to enhanced ETL tools) and paradigms. He's experienced with historization in data warehouses and has professional experience in banking and insurance.






Preferred Environment

JasperReports, Pentaho Data Integration (Kettle), Ab Initio, SAS Data Integration (DI) Studio, SAS

The most amazing...

...project was giving business users a self-service, regulatory-compliant, reliable process for adjusting data in several data marts, relieving IT resources.

Work Experience

ETL Developer

2022 - PRESENT
Bausparkasse Schwäbisch Hall
  • Developed and maintained risk data marts and added new data sources.
  • Managed and monitored the load process of test systems.
  • Replaced manual data sources/processes with ETL processes.
Technologies: Ab Initio, Ab Initio Graphical Development Environment (GDE), Sybase PowerDesigner, Oracle, Automic (UC4)


2021 - PRESENT
Finanzinformatik, Münster
  • Consulted on BCBS 239 and data quality. Developed a data dictionary and data lineage tool for a legacy SAS data warehouse.
  • Conducted a preliminary study on choosing a data quality tool.
  • Consulted on a legacy data warehouse addressing BCBS 239.
Technologies: IBM z/OS, SAS, JointJS, Elasticsearch

Developer and Consultant

2013 - PRESENT
Provinzial NordWest, D-Münster
  • Selected and de-identified test data for all non-production environments. Built a framework for metadata-driven selection and de-identification of production data covering more than 1,000 sources from different mainframe storage systems.
  • Stabilized processes, integrated source data, migrated data and batch programs to SAS 9.3, and optimized run processing to support the Business Intelligence Center.
  • Consulted on the architecture for an SII data mart and implementation of ETL programs for the Solvency II project.
  • Implemented generic auditing and acceptance testing for the application for adjusting data in data marts, using Fast Edit as the data input tool.
  • Designed and implemented an application for calculating bonuses for field service. Used SAS EG for calculation and JasperReports for reporting.
Technologies: SAS Business Intelligence (BI), Data Modeling, ETL, SQL, Data Warehouse Design, Data Warehousing, Erwin, JasperReports, IBM z/OS, IBM Db2, SAS Macros, SAS Enterprise Guide, SAS Data Integration (DI) Studio

Developer and Consultant

2020 - 2021
Bayern Card Services, München
  • Developed form letters using JasperReports to be called from a given application.
  • Reviewed and checked options and best practices for using Jasper (e.g. externalization of standard literals to lower maintenance efforts).
  • Trained internal staff to use Jaspersoft Studio to maintain existing form letters and build new ones.
Technologies: JasperReports


2020 - 2020
Informationsfabrik GmbH, D-Münster
  • Established a test process in a data warehouse project for a bank.
  • Built a framework for automated tests of data with the option to enhance this as a data quality framework.
  • Consulted with developers to write automated and manual tests.
  • Integrated automated tests into the loading process using Azure Data Factory pipelines.
Technologies: SQL, Data Warehousing, Data Warehouse Design, Microsoft Power BI, Azure DevOps, Test Automation, Azure Data Factory, Pipelines, T-SQL (Transact-SQL), Microsoft SQL Server, Azure

Business Analyst and ETL Developer

2018 - 2019
Bausparkasse Schwäbisch Hall (Building and Loan Facility), D-Schwäbisch Hall
  • Developed and implemented data quality rules with the data owners.
  • Improved communication between IT and business concerning the DQ framework.
  • Helped to stabilize the DQ framework and made it more flexible.
Technologies: Data Modeling, Oracle RDBMS, Data Quality, ETL, SQL, Data Warehouse Design, Data Warehousing, Sybase PowerDesigner, Oracle Database, Ab Initio

Software Architect

2017 - 2018
W&W (Building and Loan Facility and Insurance), D-Ludwigsburg
  • Reviewed and refined the system architecture for a marketing data mart.
  • Created performance tests of DB2 BLU specifics and a PoC for the performance of a data vault data model.
  • Implemented a performant component for historization.
Technologies: SAS Business Intelligence (BI), Data Modeling, ETL, SQL, Data Warehouse Design, Data Warehousing, Apache Hive, IBM Db2, SAS


2016 - 2017
BLB (Federal State Bank), D-München
  • Migrated a legacy program from SAS Base to SAS DI. Reengineered and documented the logic. Executed parallel tests.
  • Reengineered a given generic ruleset and adapted it for use in new subject areas.
  • Implemented the enhanced generic ruleset.
Technologies: SAS Business Intelligence (BI), ETL, SQL, Data Warehouse Design, Data Warehousing, Linux, IBM Db2, SAS Macros, SAS Data Integration (DI) Studio, SAS


2012 - 2013
Vattenfall (Power Supplier), D-Hamburg
  • Provided consulting and support during the implementation of data marts for risk management concerning design, tool usage, administration, and run.
  • Led historization using the SCD transformation and a customer-specific historization transformation.
  • Supported business units concerning the test of the data mart.
Technologies: ETL, SQL, Data Warehouse Design, Data Warehousing, AIX, Oracle, SAS Enterprise Guide, SAS Business Intelligence (BI)


2012 - 2012
Gmünder Ersatzkasse (Health Insurance), D-Schwäbisch Gmünd
  • Built a data mart for marketing data, with a focus on ETL development.
  • Completed data modeling (Star Model) of the dimensional data mart.
  • Optimized the performance of Pentaho DI jobs and transformations.
Technologies: Data Modeling, ETL, SQL, Data Warehousing, Data Warehouse Design, AIX, IBM Db2, Pentaho


2010 - 2012
Provinzial NordWest, D-Münster (Insurance)
  • Extended and standardized the existing DWH systems.
  • Helped design and set up processes to build a new DWH system (architecture, historization, adjustment, etc.).
  • Built a historization component with up to three historization dimensions, depending on the business area. Focused on performance optimization between SAS and DB2.
Technologies: SAS Business Intelligence (BI), Data Modeling, ETL, SQL, Data Warehouse Design, Data Warehousing, IBM z/OS, IBM Db2, SAS Macros, Base SAS


2010 - 2011
LBBW (Federal State Bank), D-Stuttgart
  • Consulted, designed, and implemented a data mart for centralized regulatory reporting (AWV).
  • Wrote IT concepts for the integration of source systems and the transformation of data.
Technologies: SAS Business Intelligence (BI), ETL, SQL, Data Warehouse Design, Data Warehousing, AIX, SAS Enterprise Guide, Base SAS


2010 - 2010
Versorgungswerk der Architekten, D-Stuttgart
  • Migrated an existing reporting system (COBOL, Prescribe) to JasperReports.
  • Led a feasibility study to check whether the migration could be done within the given budget.
  • Migrated existing reports to JasperReports and built a generic Java wrapper for batch reporting.
Technologies: Java, JasperReports


2008 - 2010
LBBW (Federal State Bank), D-Stuttgart
  • Designed and developed ETL processes in the credit risk data mart (GLLP, IFRS reporting).
  • Implemented a generic process for self-service data adjustment in the data mart.
  • Designed and implemented a dynamic job scheduling driven by delivery of extracts from source systems.
Technologies: SAS Enterprise Guide, SAS

Fast Control - Tool for Automatic Data Testing
Built a tool for automatic testing of data.

Test cases describe the data and the checks to run; the tool executes them. The first implementation was SAS-based (SAS EG) and used Microsoft Excel for data entry. The second implementation used Java, Tomcat, and SQL, with Fast Edit as the front end.
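The metadata-driven idea can be sketched in a few lines. This is a hypothetical Python illustration only (the actual implementations were SAS- and Java-based); the table, columns, and test cases are invented:

```python
import sqlite3

# Hypothetical test-case metadata: each entry names a target table,
# a check expressed as SQL, and the expected count of violations (0).
TEST_CASES = [
    ("orders", "SELECT COUNT(*) FROM orders WHERE amount < 0", 0),
    ("orders", "SELECT COUNT(*) FROM orders WHERE customer_id IS NULL", 0),
]

def run_tests(conn):
    """Run every declared data test; return (table, sql, passed) tuples."""
    results = []
    for table, check_sql, expected in TEST_CASES:
        actual = conn.execute(check_sql).fetchone()[0]
        results.append((table, check_sql, actual == expected))
    return results

# Usage with an in-memory database standing in for the warehouse:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
conn.execute("INSERT INTO orders VALUES (1, 10.0), (2, -5.0)")
results = run_tests(conn)
```

Because the checks are data rather than code, new tests can be added by business or test staff without touching the runner — the property that lets such a framework grow into a data quality framework.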

Adjustment of Data

Built a tool to adjust data in several stages of a data warehouse system (derived area and data marts). It's used as a self-service tool by business users, without IT involvement, while still adhering to regulatory requirements.

It uses Apparo Fast Edit for data entry and a generic SAS macro for the implementation.
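As a rough illustration of the pattern (not the actual SAS macro — the function, column names, and audit fields here are invented), a generic adjustment routine overlays user-entered corrections on base rows and records an audit trail for regulatory purposes:

```python
from datetime import datetime, timezone

def apply_adjustments(rows, adjustments, key="id"):
    """Overlay adjustment records on base rows and keep an audit trail."""
    adj_by_key = {a[key]: a for a in adjustments}
    audit, out = [], []
    for row in rows:
        adj = adj_by_key.get(row[key])
        if adj is None:
            out.append(row)
            continue
        # Apply every adjusted column except the key and audit metadata.
        new_row = {**row, **{k: v for k, v in adj.items() if k not in (key, "user")}}
        audit.append({
            "key": row[key],
            "before": row,
            "after": new_row,
            "user": adj.get("user", "unknown"),
            "ts": datetime.now(timezone.utc).isoformat(),
        })
        out.append(new_row)
    return out, audit

# Usage: a business user corrects one record via the front end.
base = [{"id": 1, "amount": 100.0}, {"id": 2, "amount": 200.0}]
corrections = [{"id": 2, "amount": 250.0, "user": "analyst1"}]
rows, audit = apply_adjustments(base, corrections)
```

Keeping before/after images plus user and timestamp in the audit trail is what makes the self-service adjustment defensible to auditors.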

Historization, Write, and Extract Module for the Staging Layer of a DWH

The Task:
Find a way to store daily delta data in the staging layer of a data warehouse and historize it.

- Historization has to be implemented in several variants, depending on the source system (no historization, or one to three historization dimensions).
- Each source system consists of between 1 and 20 tables/entities.
- DB2 is used as the storage layer; ETL is done with SAS Foundation.

The Solution:
- Build a metadata-driven generic historization and storage macro in SAS that handles:
  - Historization (or no historization) in a central historization table per source system.
  - Optimized historization and reads/writes between SAS and DB2 using push-down techniques (DB2 temporary tables, push-down SQL).
  - Writing delta rows into DB2 using the DB2 load facility.
- Build a metadata-driven generic extraction macro in SAS that handles:
  - Extracting all tables of a source system from the storage layer for a given time interval.
  - Adding historization information.
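The core of such a historization macro is the classic close-old-version/open-new-version (SCD2-like) pattern. A simplified Python sketch of the idea, with invented column names and a single historization dimension (the real implementation is a SAS macro that pushes this work down to DB2):

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # "open" end date marking the current version

def historize(history, delta, key="id", load_date=None):
    """Close changed current versions and open new ones (SCD2-style).

    Simplified: assumes at most one open version per business key.
    """
    load_date = load_date or date.today()
    delta_by_key = {d[key]: d for d in delta}
    out = []
    for row in history:
        new = delta_by_key.get(row[key])
        if row["valid_to"] == HIGH_DATE and new is not None:
            changed = any(row[k] != v for k, v in new.items() if k != key)
            if changed:
                out.append({**row, "valid_to": load_date})  # close old version
                out.append({**new, "valid_from": load_date,
                            "valid_to": HIGH_DATE})         # open new version
            else:
                out.append(row)                             # unchanged: stays open
            del delta_by_key[row[key]]
        else:
            out.append(row)                                 # historical rows pass through
    for new in delta_by_key.values():                       # brand-new keys
        out.append({**new, "valid_from": load_date, "valid_to": HIGH_DATE})
    return out

# Usage: one existing key changes, one new key arrives in the daily delta.
history = [{"id": 1, "value": "A", "valid_from": date(2024, 1, 1), "valid_to": HIGH_DATE}]
delta = [{"id": 1, "value": "B"}, {"id": 2, "value": "C"}]
out = historize(history, delta, load_date=date(2024, 6, 1))
```

In the actual setup, doing the version comparison and close/open writes inside DB2 (temporary tables, push-down SQL, the DB2 load facility) avoids shipping full tables between SAS and the database, which is where the performance optimization comes from.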
1988 - 1992

Master's Degree in Business Administration

Fachhochschule für Wirtschaft - Pforzheim, Germany


Professional Scrum Master I


SAS Certified Data Integration Developer for SAS9



Prince2 Foundation

APM Group


ISTQB Certified Tester, Foundation Level

Cert Institute


Libraries/APIs

JasperReports, JointJS


Tools

Pentaho Data Integration (Kettle), Microsoft Power BI, SAS Data Integration (DI) Studio, Sybase PowerDesigner, SAS Business Intelligence (BI), SAS Enterprise Guide, Erwin, Ab Initio, Automic (UC4)


Paradigms

ETL, Test Automation, Azure DevOps


Languages

SAS, SQL, Java, T-SQL (Transact-SQL)


Storage

IBM Db2, Oracle RDBMS, Apache Hive, Microsoft SQL Server, Elasticsearch


Platforms

Oracle Database, Linux, Oracle, AIX, Pentaho, IBM z/OS, Azure


Other

Data Warehouse Design, Data Warehousing, Data Modeling, COBOL Batch, Data Quality, SAS Macros, Base SAS, Pipelines, Azure Data Factory, Ab Initio Graphical Development Environment (GDE)
