Solutions

Master Data Management

Control your data - control your business!

Master Data Management (MDM) combines all organizational and technology-based activities aimed at the sustained improvement of enterprise-wide master data. These activities take place in cooperation between the various business units and the IT department. Ideally, data is improved at the moment it is entered.

The major goal of Master Data Management is to identify Golden Records (verified data sets) and to gain a 360° view of your data.

 

‘Master data’ refers to the core business objects of your company. Master data records are unique and stable, are changed infrequently, are used in multiple departments, and have high value.

Biggest challenges for companies in master data management:

Master Data Management provides the following advantages for your business:

Identification of “Golden Records”
Creating a 360-degree view of the data

The integration of data from various IT systems (CRM, accounting, or Excel) allows a consistent view of the data. Incomplete, duplicate and inconsistent data sets are unified and displayed centrally. Find out if Peter Müller and Peter Mueller are the same person.
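The “Peter Müller vs. Peter Mueller” question can be sketched with simple fuzzy matching. DataRocket's actual duplicate check is not documented here, so the following is only an illustration using Python's standard library, with an invented similarity threshold:

```python
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    # Fold German umlauts into their common ASCII transliterations
    name = name.lower()
    for src, dst in (("ä", "ae"), ("ö", "oe"), ("ü", "ue"), ("ß", "ss")):
        name = name.replace(src, dst)
    return name

def likely_duplicates(a: str, b: str, threshold: float = 0.9) -> bool:
    # Treat two names as duplicate candidates above a similarity threshold
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio() >= threshold

print(likely_duplicates("Peter Müller", "Peter Mueller"))  # True
```

After normalization both spellings become "peter mueller", so the two records are flagged as candidates for the same golden record.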

Illustration of the relationships between data from different sources

Data from various IT systems are interdependent. Mapping rules make these dependencies explicit. This way you can find out how much revenue your company has made with the “right” Peter Müller.
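The revenue question can be illustrated with a toy join: records from a CRM and an accounting system are linked through a mapping to a shared golden record. All field names and IDs here are invented for illustration:

```python
# Hypothetical extracts from two systems, linked by a customer ID.
crm = [
    {"customer_id": 1, "name": "Peter Müller"},
    {"customer_id": 2, "name": "Peter Mueller"},  # same person, duplicate record
]
invoices = [
    {"customer_id": 1, "amount": 1200.0},
    {"customer_id": 2, "amount": 800.0},
]

# Mapping rule: both CRM records refer to the same golden record.
golden_id = {1: "GR-1", 2: "GR-1"}

# Aggregate revenue per golden record instead of per raw customer ID.
revenue: dict[str, float] = {}
for inv in invoices:
    key = golden_id[inv["customer_id"]]
    revenue[key] = revenue.get(key, 0.0) + inv["amount"]

print(revenue)  # {'GR-1': 2000.0}
```

Without the mapping, the revenue would be split across two seemingly unrelated customers; with it, the full 2000.0 is attributed to the one “right” Peter Müller.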

 

Control over business processes

Master data are used and processed by different people and different departments. Keep track of changes to customer data, associated products or payment functions, and manage which individuals are allowed to make data changes. You can configure, for example, that only Mr. Müller's account manager is allowed to change his master data.
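Such an ownership rule can be sketched as a simple permission check. The assignment table and names below are invented; a real system would draw them from its user and role management:

```python
# Hypothetical assignment: only the responsible account manager may edit a customer.
owners = {"mueller": "account_manager_schmidt"}

def may_edit(user: str, customer: str) -> bool:
    # A user may change master data only for customers assigned to them.
    return owners.get(customer) == user

print(may_edit("account_manager_schmidt", "mueller"))  # True
print(may_edit("intern_meyer", "mueller"))             # False
```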

 

Data Quality

Data is essential - integrated, comprehensible and high-quality data is priceless

Data quality describes how well data serves a given purpose in a particular context. For companies, sufficient data quality is essential for operational and transactional processes. Keep in mind that the reliability of analyses and reports rests on the underlying data.

Data quality is affected by the way data is entered, saved and managed. The process of verifying the reliability and effectiveness of data is referred to as data quality management.
Preserving data quality means checking and cleaning databases regularly.

Common data quality tasks include updating, standardizing and validating records, running plausibility checks, and removing duplicates.

Data quality can be measured using specific criteria. Some of the following criteria may be relevant for improving your data quality:

  • Up-to-dateness: The data is up to date when the features of each item described appear as they are at that moment.
  • Creation of value: The data adds value when using it leads to a quantifiable increase in a monetary function.
  • Completeness: The information is complete when nothing is missing and it is available for the uses defined at each stage of the process.
  • Appropriate size: The data is of an appropriate size when the amount of information available is sufficient for the needs that have been defined.
  • Relevance: The data is relevant when it provides the user with the information needed.
  • Comprehensibility: The data is comprehensible when the user can understand it immediately and use it to fulfil their needs.
  • Clarity: The data is clear when the exact information needed is available in the right format, in an easily comprehensible manner.
  • Consistent presentation: The information is presented consistently when it appears in the same manner throughout.
  • Clear explanation: Information is clearly explained when it appears in a manner that is correct in a technical sense.
  • Credibility: The data is credible when certificates attest a high standard of quality or significant efforts are made in information gathering and dissemination.
  • Objectivity: The data is objective when it is strictly factual and free of value judgment.
  • Error-free: The data is error-free when it corresponds to the situation as it is in reality.
  • High regard: The data has the user's trust when the source of information, method of transmission and processing system are well-regarded for their credibility and competence.
  • Malleability: The data is malleable when it is easy to change and use for different purposes.
  • Accessibility: The data is accessible when the user can access it through a simple process and direct method.
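Two of these criteria, completeness and up-to-dateness, lend themselves to simple ratio metrics. The following sketch, with invented field names and sample records, shows one way to quantify them:

```python
from datetime import date

# Hypothetical customer records with a last-updated timestamp.
records = [
    {"name": "Peter Müller", "zip": "80331", "updated": date(2024, 5, 1)},
    {"name": "Anna Weber", "zip": None, "updated": date(2019, 1, 15)},
]

def completeness(rows, field):
    # Share of records where the given field is actually filled.
    filled = sum(1 for r in rows if r.get(field) not in (None, ""))
    return filled / len(rows)

def up_to_dateness(rows, cutoff):
    # Share of records that were updated on or after the cutoff date.
    fresh = sum(1 for r in rows if r["updated"] >= cutoff)
    return fresh / len(rows)

print(completeness(records, "zip"))               # 0.5
print(up_to_dateness(records, date(2023, 1, 1)))  # 0.5
```

Tracked over time, such ratios make improvements in data quality measurable rather than anecdotal.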

The value of data quality for enterprises has to be individually defined early on in MDM projects (data screening). DataRocket offers preconfigured rules in addition to an editor (DataPipelines) in which you can define individual quality criteria and calculation paths to get the most out of your data. DataRocket standardizes the data structure, enables automated detection of data problems and, all in all, delivers a measurable improvement of your data quality.

 

The benefits that high-quality data can bring to your business depend heavily on the available data and the relevant business processes, and are therefore always individual. Hence we recommend an initial screening of the data landscape at the beginning of a master data project. Typical advantages achieved through high data quality are the following:

  • Optimized, accurate and fast processes
  • Increased customer and personnel satisfaction
  • Better evaluations and reports
  • Optimized Warehouse and Order Management
  • Simplified implementation of new systems, new standards or organizational techniques (e.g. ITIL)
  • Reliability and effectiveness of data
  • Transmission of high-quality data as a service to partners
  • Simplified and correct (cost) controlling
  • Cost savings and reduction in capital intensity
  • Simplified and accurate resource planning
  • Establishment of knowledge management (control and test criteria for data management)

Data Rocket standardizes the data structure, enables automated detection of data problems and provides measurable improvement in data quality through the following features:

  • Workflow-based cleanup assistant
  • Duplicate check
  • Formal data validation
  • Content data validation
  • Plausibility check
  • Freely configurable quality criteria (DataPipelines)
  • Data enrichment with external information
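Formal data validation and plausibility checks, two of the features above, can be illustrated as follows. The concrete rules (five-digit ZIP codes, a sane birth-year range) are examples chosen for this sketch, not DataRocket's built-in rule set:

```python
import re

def formal_valid_zip(zip_code: str) -> bool:
    # Formal validation: a German ZIP code consists of exactly five digits.
    return bool(re.fullmatch(r"\d{5}", zip_code))

def plausible_birth_year(year: int) -> bool:
    # Plausibility check: a customer's birth year must lie in a sane range.
    return 1900 <= year <= 2025

print(formal_valid_zip("80331"))    # True
print(formal_valid_zip("8033"))     # False
print(plausible_birth_year(1875))   # False
```

Formal checks test the shape of a value; plausibility checks test whether a formally valid value makes sense in context.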

DataCanvas

At which level is your master data management? With the help of the DataCanvas you will easily find out and identify potential to optimize your master data management.

The DataCanvas is a strategic management and entrepreneurial tool. It allows enterprises to understand, structure, categorize, evaluate, control and optimize their Master Data Management.
The DataCanvas is composed of four major elements: (1) Objectives; (2) Building Blocks; (3) Actions; (4) Maturity Model (MDM3).
The whole DataCanvas is worked through iteratively and discussed in the group. Using the DataCanvas you can systematically structure your ongoing master data management and position your current MDM activities in the Maturity Model (MDM3).
We will be happy to present our DataCanvas to you free of charge and without any obligations, or work through it with your team in a workshop at your office or site.
  • Overview of current activities
  • Better corporate wide understanding of MDM activities
  • Placement of current MDM activities based on maturity model (MDM3)
  • Cross-departmental discussion of activities 
  • Learnings from other departments and external input from experts
  • Identification of new challenges and actions 
  • Structured method for further MDM projects 
  • Helps to develop a corporate wide MDM strategy 
  • Continuous usage supports controlling and reporting
  • Iterative improvement of MDM activities based on predefined stages

 

  1. Big Picture for a better overview
  2. Flexible and agile to adapt to your style
  3. There is no wrong or right, every thought is welcome
  4. Encourage creativity and aim for quantity
  5. A picture is worth a thousand words
  6. Structured, efficient & fast
  7. A dynamic environment requires continuous repetition
  • 2 days / 4-5 hours per day
  • 4-6 participants
  • 1st day: identifying the current situation
  • 2nd day: developing the strategy
  • Held by 2 MDM-experts
  • Record of results with recommendations for action and positioning in maturity model MDM3

 

Data Migration and Consolidation

Data migrations and consolidations are tedious - and we're there to help you!

Modernization projects, acquisitions, mergers or consolidation projects have a direct impact on your system landscape. For example, the application portfolio changes and data must be moved from system A to system B – in short: you are facing a typical migration project. It is irrelevant whether your data lives in the cloud or on your own servers, and whether it is a customer system or a mainframe system.

A data migration is the process of transferring data between storage systems, data formats, or computer systems.

Poor data quality is one of the most common problems in data migrations: it leads to delayed migration projects and budget overruns.

Data Rocket supports the three key steps of a data migration (extraction, transformation and loading). During the migration of data from one system to another, the data quality can be improved – the big advantage being that only “clean” data is migrated. The data is transferred via batch import or a bidirectional interface to the source data systems.
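The three steps can be sketched in a few lines. This is a generic extract–transform–load outline with invented source and target structures, not DataRocket's actual migration pipeline:

```python
# Minimal ETL sketch: clean the data while it moves, so only "clean" data lands.

def extract(source_rows):
    # Extraction: read raw records from system A.
    return list(source_rows)

def transform(rows):
    # Transformation: normalize and filter records during the migration.
    cleaned = []
    for row in rows:
        name = row["name"].strip().title()
        if name:  # drop records without a name
            cleaned.append({"name": name, "zip": row["zip"].strip()})
    return cleaned

def load(rows, target):
    # Loading: write the cleaned records into system B.
    target.extend(rows)

target_system: list[dict] = []
load(transform(extract([{"name": "  peter müller ", "zip": " 80331 "}])), target_system)
print(target_system)  # [{'name': 'Peter Müller', 'zip': '80331'}]
```

Because the cleanup happens inside the transformation step, the target system never sees the raw, inconsistent values.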

  • When migrating data, high quality data simplifies the project and increases the probability of success.
  • A direct improvement of data quality during migration reduces expenses and creates synergies.
  • Expense and cost reduction in data migration projects.
  • Consistent data in all staging systems and detection of deviations.
  • Consistent data storage in productive and test systems
  • Use of existing interfaces
  • Simple mapping using drag-and-drop
  • Unified data model as a basis

Total Quality Management

MDM and DQ-Projects are about team work!

Total Quality Management (TQM) is a management approach to optimizing the quality of a company's products and services in all functional areas and at all levels through the participation of all employees. DataRocket transfers this proven approach to improving data quality and Master Data Management.

Classical mindset in Master Data Management:

  • Humans make mistakes
  • Individual employees are responsible for the
    data quality
  • Zero errors is not feasible
  • Central control

 

Mindset in Master Data Management following the Total Quality Management approach:

  • Processes provoke errors
  • All employees are responsible for high data quality
  • Zero errors is the goal
  • Mutual control inside the team

 

  • The effort to improve data quality is distributed over many shoulders
  • Those who are professionally able to correct the data are assigned the tasks
  • The result is a “we” feeling and a general appreciation of data and its quality
  • The quality of the data is better protected by the wisdom of the crowd and mutual control
  • Integration of Master Data Management into existing workflows and work environments (e.g. MS SharePoint)
  • Task management
  • Clean-up Wizard and release of error lists to certain individuals or groups
  • Change history (Audit-Trail)
  • Ability to share data changes and quality criteria

Open Data Enrichment

Extend your database with existing reference data - used to validate your data

To optimize plausibility controls and data validation, you need additional data.
External data sources, e.g. professional data services or publicly available data sources, are used to carry out data validations or plausibility checks (checks for logic and determination of plausibility).

External data can be used to…

  • …complete missing customer or address information
    (e.g. add a missing ZIP code)
  • …validate existing data
    (e.g. does the ZIP code match the city?)
  • …update existing data
    (e.g. does the phone number still exist?)
  • …gain new information about customers (Data Mining)
    (e.g. at what point is the willingness to buy particularly high?)
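The first two uses, completing and validating against reference data, can be sketched with a tiny lookup table. A real setup would query a postal data service instead of the hard-coded mapping used here:

```python
# Hypothetical reference data: ZIP code -> city.
zip_to_city = {"80331": "München", "10115": "Berlin"}

def zip_matches_city(zip_code: str, city: str) -> bool:
    # Validation: does the ZIP code fit the city on record?
    return zip_to_city.get(zip_code) == city

def complete_city(record: dict) -> dict:
    # Enrichment: fill a missing city from the reference data.
    if not record.get("city") and record.get("zip") in zip_to_city:
        record["city"] = zip_to_city[record["zip"]]
    return record

print(zip_matches_city("80331", "München"))         # True
print(complete_city({"zip": "10115", "city": ""}))  # {'zip': '10115', 'city': 'Berlin'}
```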

Through the enrichment of your data you can significantly improve your data quality. You can quickly identify new customers, contact them more specifically and thereby increase your conversion rate.


 

DataRocket offers two options for enriching your data

 1. Use of Open Data

Open Data is freely available digital web data that may be used free of charge and without legal restrictions. Examples include information from OpenStreetMap or Wikipedia, but also data from social media such as XING, Facebook or G+.

Data Rocket has interfaces to web services and can search for specific data, which are then used to improve your data quality or to optimize your sales process.

 2. Use of professional data services

Established data providers (e.g. Deutsche Post Direkt, Universal Postal Service, Creditreform) are used. The data is acquired from the service providers and transferred to Data Rocket for cleansing and integration into the database.

 
