Master Data Management
Control your data - control your business!
Master Data Management (MDM) combines all organizational and technology-based activities for the sustained improvement of enterprise-wide master data. These activities take place in cooperation between various business units and the IT department. Ideally, data is improved at the point of entry.
The major goal of Master Data Management is to identify Golden Records (verified data sets) and to gain a 360° view of your data.
Master data is unique and refers to the core business objects of your company. It is stable, changes infrequently, is used in multiple departments and has high value.
Master data management confronts companies with significant challenges.
Master Data Management provides the following advantages for your business:
Identification of “Golden Records”
Creating a 360-degree view of the data
The integration of data from various IT systems (CRM, accounting, or Excel) allows a consistent view of the data. Incomplete, duplicate and inconsistent data sets are unified and displayed centrally. Find out whether Peter Müller and Peter Mueller are the same person.
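A minimal sketch of how such a duplicate check could work (the normalization rules here are illustrative, not DataRocket's actual matching logic): German umlauts are transliterated before comparison, so "Müller" and "Mueller" match.

```python
# Hypothetical sketch: detect that "Peter Müller" and "Peter Mueller"
# are likely the same person by normalizing umlauts before comparing.
UMLAUT_MAP = str.maketrans({"ä": "ae", "ö": "oe", "ü": "ue", "ß": "ss"})

def normalize(name: str) -> str:
    """Lowercase, transliterate German umlauts, and collapse whitespace."""
    name = name.lower().translate(UMLAUT_MAP)
    return " ".join(name.split())

def is_duplicate(a: str, b: str) -> bool:
    return normalize(a) == normalize(b)

print(is_duplicate("Peter Müller", "Peter Mueller"))  # True
```

A real MDM tool would combine such normalization with fuzzy matching and further attributes (address, date of birth) before merging records.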
Illustration of the relationships between data from different sources
Data from various IT systems is interdependent. Rules (mappings) make these dependencies explicit, so you can find out how much revenue your company has made with the “right” Peter Müller.
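To illustrate the idea (all system names, IDs and figures below are invented): once a mapping rule has linked the records from different systems to one master record, revenue can be aggregated across all sources for that one customer.

```python
# Invented sample data: the same customer under different IDs in two systems.
accounting = [
    {"customer_id": "A-09", "revenue": 1200.0},
    {"customer_id": "A-09", "revenue": 800.0},
    {"customer_id": "A-55", "revenue": 300.0},
]

# Mapping rule: which source-system IDs belong to the same master record.
mapping = {"master-1": {"crm": "C-17", "accounting": "A-09"}}

def revenue_for(master_id: str) -> float:
    """Sum the revenue of all accounting records linked to one master record."""
    acc_id = mapping[master_id]["accounting"]
    return sum(r["revenue"] for r in accounting if r["customer_id"] == acc_id)

print(revenue_for("master-1"))  # 2000.0
```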
Control over business processes
Master data is used and processed by different people and different departments. Keep track of changes to customer data, associated products or payment details, and manage which individuals are allowed to make data changes. For example, you can configure that only Mr. Müller's account manager is allowed to change his master data.
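Such a permission rule can be sketched as follows (the user names and the lookup structure are assumptions for illustration, not a real configuration format):

```python
# Hypothetical assignment: which user is the account manager of which customer.
account_managers = {"Peter Müller": "anna.schmidt"}

def may_edit(user: str, customer: str) -> bool:
    """Only the assigned account manager may change a customer's master data."""
    return account_managers.get(customer) == user

print(may_edit("anna.schmidt", "Peter Müller"))  # True
print(may_edit("john.doe", "Peter Müller"))      # False
```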
Data is essential - integrated, comprehensible and high-quality data is priceless
Data quality describes how well data is suited to serve its purpose in a particular context. In companies, sufficient data quality is essential for operational and transactional processes; keep in mind that the reliability of analyses and reports rests on the underlying data.
Data quality is affected by the way data is entered, stored and managed. The process of verifying the reliability and effectiveness of data is referred to as data quality management.
Preserving data quality requires checking and cleaning databases regularly.
Common data quality tasks include updating, standardizing and validating records, running plausibility checks, and removing duplicates.
Data quality can be measured using specific criteria; which criteria apply depends on your data and the purpose it serves.
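One commonly used criterion is completeness: the share of records in which all required fields are filled. A minimal sketch (the field names and sample records are invented for illustration):

```python
# Required fields for a "complete" customer record (example choice).
REQUIRED = ("name", "zip", "city")

records = [
    {"name": "Peter Müller", "zip": "50667", "city": "Köln"},
    {"name": "Anna Weber", "zip": "", "city": "Hamburg"},  # missing zip
]

def completeness(rows) -> float:
    """Fraction of records in which every required field is non-empty."""
    complete = sum(all(r.get(f) for f in REQUIRED) for r in rows)
    return complete / len(rows)

print(completeness(records))  # 0.5
```

Other criteria such as consistency or timeliness can be measured in the same spirit, each as a number that can be tracked over time.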
Creation of value
The value of data quality has to be defined individually for each enterprise early on in MDM projects (data screening). DataRocket offers preconfigured rules in addition to an editor (data pipelines) in which you can define individual quality criteria and calculation paths to get the most out of your data.
The benefits that high-quality data can bring to your business depend heavily on the available data and the relevant business processes, and are therefore always individual. Hence we recommend an initial screening of the data landscape at the beginning of a master data project. Typical advantages achieved through high data quality include:
- Optimized, accurate and fast processes
- Increased customer and personnel satisfaction
- Better evaluations and reports
- Optimized Warehouse and Order Management
- Simplified implementation of new systems, new standards or organizational techniques (e.g. ITIL)
- Reliability and effectiveness of data
- Transmission of high-quality data as a service to partners
- Simplified and correct (cost) controlling
- Cost savings and reduction in capital intensity
- Simplified and accurate resource planning
- Establishment of knowledge management (control and test criteria for data management)
DataRocket standardizes the data structure, enables automated detection of data problems and provides measurable improvement in data quality through the following features:
- Workflow-based cleanup assistant
- Duplicate check
- Formal data validation
- Content data validation
- Plausibility check
- Freely configurable quality criteria (DataPipelines)
- Data enrichment with external information
At which level is your master data management? With the help of DataCanvas you can easily find out, and identify potential to optimize your master data management.
- Overview of current activities
- Better corporate wide understanding of MDM activities
- Placement of current MDM activities based on maturity model (MDM3)
- Cross-departmental discussion of activities
- Learnings from other departments and external input from experts
- Identification of new challenges and actions
- Structured method for further MDM projects
- Helps to develop a corporate wide MDM strategy
- Continuous usage supports controlling and reporting
- Iterative improvement of MDM activities based on predefined stages
- Big Picture for a better overview
- Flexible and agile to adapt to your style
- There is no wrong or right, every thought is welcome
- Encourage creativity and aim for quantity
- A picture is worth a thousand words
- Structured, efficient & fast
- The dynamic nature of MDM requires continuous repetition
- 2 days / 4-5 hours per day
- 4-6 participants
- 1st day: identifying the current situation
- 2nd day: developing a strategy
- Held by 2 MDM-experts
- Record of results with recommendations for action and positioning in maturity model MDM3
Data Migration and Consolidation
Data migrations and consolidations are tedious - and we're here to help you!
Modernization projects, acquisitions, mergers or consolidation projects have a direct impact on your system landscape. For example, the application portfolio changes and data must be moved from system A to system B – in short: you are facing a typical migration project. It is irrelevant whether your data is in the cloud or on your own servers, and whether it is a customer or mainframe system.
A data migration is the process of transferring data between storage systems, data formats, or computer systems.
Poor data quality is one of the most common problems in data migrations; it leads to delayed migration projects and budget overruns.
DataRocket supports the three key steps of a data migration: extraction, transformation and loading. Data quality can be improved while data is migrated from one system to another – the big advantage is that only “clean” data is migrated. The data is transferred via batch import or a bidirectional interface to the source systems.
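The three steps can be sketched generically as follows (this is plain Python for illustration, not DataRocket's actual interfaces; the cleaning rules are assumptions): data is extracted from the source, cleaned during the transformation step, and only then loaded into the target.

```python
def extract(source):
    """Read raw records from the legacy system (here: a simple list)."""
    return list(source)

def transform(records):
    """Improve quality during migration so only 'clean' data is moved:
    trim whitespace and drop records without a name."""
    cleaned = []
    for r in records:
        r = {k: v.strip() if isinstance(v, str) else v for k, v in r.items()}
        if r.get("name"):
            cleaned.append(r)
    return cleaned

def load(records, target):
    """Write the cleaned records into the target system (here: a list)."""
    target.extend(records)

source = [{"name": "  Peter Müller "}, {"name": ""}]
target = []
load(transform(extract(source)), target)
print(target)  # [{'name': 'Peter Müller'}]
```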
- When migrating data, high quality data simplifies the project and increases the probability of success.
- A direct improvement of data quality during migration reduces expenses and creates synergies.
- Reduced effort and cost in data migration projects.
- Consistent data in all staging systems and detection of deviations.
- Consistent data storage in productive and test systems
- Use of existing interfaces
- Simple mapping using drag-and-drop
- Unified data model as a basis
Total Quality Management
MDM and DQ projects are about teamwork!
Total Quality Management (TQM) is a management approach to optimize the quality of a company's products and services in all functional areas and at all levels through the participation of all employees. DataRocket transfers this proven approach to improving data quality and Master Data Management.
Traditional view:
- Humans make mistakes
- Individual employees are responsible for data quality
- Zero errors is not feasible
- Central control
TQM view:
- Processes provoke errors
- All employees are responsible for high data quality
- Zero errors is the goal
- Mutual control inside the team
- The effort of improving data quality is distributed across many shoulders
- Those who are professionally able to correct the data are assigned the tasks
- The result is a “we” feeling and a general appreciation of data and its quality
- Data quality is better safeguarded through the wisdom of the crowd and mutual control
- Integration of Master Data Management into existing workflows and work environments (e.g. MS SharePoint)
- Task management
- Clean-up Wizard and release of error lists to certain individuals or groups
- Change history (Audit-Trail)
- Ability to share data changes and quality criteria
Open Data Enrichment
Extend your database with existing reference data - and use it to validate your data
External data sources, e.g. professional data services or publicly available sources, are used to carry out data validations or plausibility checks (checks for logic and plausibility).
External data can be used to…
- …complete missing customer or address information
(e.g. add a missing zip code)
- …validate existing data
(e.g. does the zip code match the city?)
- …update existing data
(e.g. is the phone number still valid?)
- …gain new information about customers (data mining)
(e.g. at what point is the willingness to buy particularly high?)
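A plausibility check against reference data can be sketched like this (the reference table below is a hard-coded stand-in for a real external data service, and the zip codes are examples):

```python
# Stand-in for an external reference source mapping zip codes to cities.
ZIP_REFERENCE = {"50667": "Köln", "20095": "Hamburg"}

def zip_matches_city(zip_code: str, city: str) -> bool:
    """Does the zip code fit the city according to the reference data?"""
    return ZIP_REFERENCE.get(zip_code) == city

print(zip_matches_city("50667", "Köln"))     # True
print(zip_matches_city("50667", "Hamburg"))  # False
```

In practice the lookup would query a professional data service or an open data set instead of a local table.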
By enriching your data you can significantly improve its quality. You can quickly identify new customers, contact them in a more targeted way, and thereby increase your conversion rate.
DataRocket offers two options for enriching your data