Data quality

A high level of data quality is one of the basic prerequisites for sound business decisions.

Data quality is a multidimensional construct determined by several measurable variables. The individual processes, use cases, users and systems in an organization determine which dimensions are relevant for the data quality of a given data set.

In general, a high level of data quality is the basis for true data intelligence and thus a fundamental success factor for all data-driven business processes and models. Increasing your data quality creates the optimal conditions for smart decision-making processes and top performance in the digital age.


Check out our tool for the optimization of your data quality:



Data quality with DataRocket

The MDM software DataRocket is characterized by its focus on data quality. We have identified data quality as a decisive competitive advantage for every company. That’s why DataRocket offers the following functions to increase your data quality:

  1. Data analysis: framework-based master data quality check
  2. Data cleansing: data correction in the source and target system
  3. MDM heatmap: continuous monitoring of the long-term data quality


Data analysis

The prelude to a successful data management project

Performing a data analysis is a suitable measure to start a master data management project. It gives you an initial overview of the quality level of your data in its current state and allows you to plan further steps based on this knowledge.

The innoscale approach to data analysis:

We use an attribute-based approach: data quality is determined on the basis of pipelines in which you can define individual quality criteria and calculation paths. We carry out individual quality audits for each of our customers. To build up customer-specific sets of data quality criteria faster and more efficiently, we use templates developed in-house. We offer data quality rule templates for creditor data, debtor data and SAP material data.
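To make the attribute-based, pipeline-style approach more concrete, here is a minimal sketch in Python. The rule functions, attribute names and sample records are hypothetical illustrations; DataRocket's actual pipelines are configured via its web interface, not in code.

```python
# Minimal sketch of an attribute-based quality pipeline
# (hypothetical rules and data, not DataRocket's actual API).

def not_empty(value):
    """Completeness rule: the attribute must be filled."""
    return value is not None and str(value).strip() != ""

def max_length(limit):
    """Validity rule: the attribute must not exceed a length limit."""
    return lambda value: value is not None and len(str(value)) <= limit

# A "pipeline" here is simply an ordered set of rules per attribute.
pipeline = {
    "supplier_name": [not_empty],
    "postal_code":   [not_empty, max_length(5)],
}

def quality_score(records, pipeline):
    """Share of rule checks that pass across the whole data set."""
    checks = passed = 0
    for record in records:
        for attribute, rules in pipeline.items():
            for rule in rules:
                checks += 1
                if rule(record.get(attribute)):
                    passed += 1
    return passed / checks if checks else 1.0

records = [
    {"supplier_name": "ACME GmbH", "postal_code": "10115"},
    {"supplier_name": "",          "postal_code": "101159"},
]
print(quality_score(records, pipeline))  # 4 of 6 checks pass -> ~0.67
```

Each attribute carries its own rule chain, so new quality criteria can be added without touching the scoring logic, which mirrors the template idea described above.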

We are convinced that data quality must be anchored in the departments involved. In our projects, data quality measurement and control are therefore built up within the departments: the requirements are defined by the departments and ideally also implemented there. Our data quality software DataRocket supports all data managers in their work by offering data analysis and quality control via a web interface – programming skills are therefore not necessary.


Benefits and results of measuring data quality



The data quality software DataRocket identifies duplicates and incorrect data records.



Types and frequencies of errors and plausibility violations are made transparent and their sources are tracked down.



The results are used to determine initial optimization measures. In addition, solution approaches, such as recommendations for adjustments, are identified.


Data Cleansing

Correct your data to optimize data quality

Data cleansing is usually the first step in restoring a correct database as the foundation for improving data quality. The detection and elimination of duplicates plays a decisive role, as does the establishment of validation rules for measuring data quality and monitoring the success of the cleansing.

Duplicate detection with our master data management software DataRocket checks the entire data set and finds entries that concern the same business object but contain different information. In a process called data harmonization, these entries are merged into one comprehensive, meaningful data record – the Golden Record.

Golden Record

DataRocket acts as a hub in a company’s data landscape and as such accesses heterogeneous data sources. The data records from these sources are extracted and consolidated and then merged into Golden Records. This Golden Record or single point of truth is a master data record that combines the relevant attributes from all data sources.
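A simplified sketch of how duplicate entries for one business object could be consolidated into a Golden Record is shown below. The merge strategy used here – prefer the most recently updated non-empty value per attribute – is an assumption for illustration only, not DataRocket's actual consolidation algorithm, and the field names are invented.

```python
# Sketch: merging duplicate records into a "Golden Record".
# Assumed strategy: for each attribute, take the non-empty value
# from the most recently updated source record.

def golden_record(duplicates):
    # Newest records first, so their values win where filled.
    ordered = sorted(duplicates, key=lambda r: r["updated"], reverse=True)
    attributes = {k for r in ordered for k in r if k != "updated"}
    merged = {}
    for attr in attributes:
        for record in ordered:
            value = record.get(attr)
            if value:  # skip empty / missing values
                merged[attr] = value
                break
    return merged

duplicates = [
    {"name": "ACME GmbH", "phone": "",
     "city": "Berlin", "updated": "2023-01-10"},
    {"name": "ACME", "phone": "+49 30 1234",
     "city": "", "updated": "2023-05-02"},
]
print(golden_record(duplicates))
# name and phone come from the newer entry, city from the older one
```

The result combines the relevant attributes from all sources into a single point of truth, as described above.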

Not only the elimination of duplicates, but also other corrections improve the data quality:

  • Plausibility violations (e.g. the net weight of an article must always be less than its gross weight)
  • Fill levels and limit values such as minimum and maximum values (e.g. postal codes with a fixed number of digits)
  • Missing standards for date formats, addresses or phone numbers
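The three kinds of correction listed above can be sketched as simple validation checks. The field names, the 5-digit postal code rule and the ISO 8601 date format are assumptions for illustration; real rule sets are customer-specific.

```python
import re

# Sketch of the validation checks listed above (hypothetical field
# names; 5-digit postal codes and ISO 8601 dates assumed).

def check_article(article):
    """Return a list of rule violations for one article record."""
    violations = []
    # Plausibility: net weight must not exceed gross weight.
    if article["net_weight"] > article["gross_weight"]:
        violations.append("net weight exceeds gross weight")
    # Limit value: postal code must have exactly 5 digits.
    if not re.fullmatch(r"\d{5}", article["postal_code"]):
        violations.append("postal code is not 5 digits")
    # Standard format: date must be ISO 8601 (YYYY-MM-DD).
    if not re.fullmatch(r"\d{4}-\d{2}-\d{2}", article["created"]):
        violations.append("date is not in ISO 8601 format")
    return violations

article = {"net_weight": 12.5, "gross_weight": 11.0,
           "postal_code": "1011", "created": "02.05.2023"}
print(check_article(article))  # all three rules are violated
```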


Data cleansing with DataRocket

Our master data management software DataRocket offers three different options for data cleansing.

1. Automated data cleansing:

The application of one or more previously defined rules results in updated data (bulk update).
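The idea of a rule-driven bulk update can be sketched as follows. The condition/correction pairs and the country normalization shown are invented examples of "previously defined rules", not DataRocket's actual rule engine.

```python
# Sketch of a rule-driven bulk update (assumed behavior): every
# record matching a rule's condition gets that rule's correction.

rules = [
    # Hypothetical rule: normalize a country name to an ISO code.
    (lambda r: r["country"] == "Germany",
     lambda r: {**r, "country": "DE"}),
    # Hypothetical rule: trim surrounding whitespace from names.
    (lambda r: r["name"] != r["name"].strip(),
     lambda r: {**r, "name": r["name"].strip()}),
]

def bulk_update(records, rules):
    """Apply all matching rules to every record in one pass."""
    updated = []
    for record in records:
        for condition, correction in rules:
            if condition(record):
                record = correction(record)
        updated.append(record)
    return updated

records = [{"name": " ACME ", "country": "Germany"}]
print(bulk_update(records, rules))
# [{'name': 'ACME', 'country': 'DE'}]
```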


2. Data cleansing workflow:

A workflow is run through in the software and manual corrections are made to the data based on the results.

3. Mass update (bulk upload):

A new file with clean data is uploaded to update the data set.


MDM heatmap

Long-term data quality improvement

innoscale AG offers real-time measurements for the continuous monitoring of the data quality in your company. Using the innoscale MDM software DataRocket, the current data quality is measured and permanently monitored directly in your systems. The measurement provides results for the following quality criteria, among others:

  • Timeliness and age of the data
  • Consistency, validity, accuracy, completeness and uniqueness
  • Frequency of change
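Two of the criteria above, timeliness and completeness, can be sketched as simple metrics. The field names and sample data are assumptions for illustration; the actual monitored criteria are configured per customer.

```python
from datetime import date

# Sketch of two monitoring criteria: timeliness (age of records)
# and completeness (share of filled required attributes).
# Field names and sample data are hypothetical.

def average_age_days(records, today):
    """Timeliness: mean age of the records in days."""
    ages = [(today - r["last_updated"]).days for r in records]
    return sum(ages) / len(ages)

def completeness(records, required):
    """Completeness: share of required attributes that are filled."""
    total = filled = 0
    for r in records:
        for attr in required:
            total += 1
            if r.get(attr):
                filled += 1
    return filled / total

records = [
    {"name": "ACME", "city": "",       "last_updated": date(2023, 1, 1)},
    {"name": "Beta", "city": "Munich", "last_updated": date(2023, 3, 1)},
]
print(average_age_days(records, date(2023, 3, 31)))  # (89 + 30) / 2 = 59.5
print(completeness(records, ["name", "city"]))       # 3 of 4 -> 0.75
```

Tracked over time, such metrics yield exactly the kind of trend view that a heatmap visualizes per data domain.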



We also offer real-time testing of data quality as a micro-service architecture. You benefit from a centralized view for evaluating your data quality. Freely configurable dynamic web reports are available in the DataRocket software dashboard. In addition, you will receive targeted optimization proposals based on the data quality measurements.