DataRocket

What is DataRocket?

DataRocket is a new generation of master data management

DataRocket is multi-domain master data management software. It is unique in its range of functions – including analysis, data cleansing and reporting – and in how simple it is to operate. You define the criteria by which your data quality is measured. DataRocket will become the centerpiece of data quality management in your company.


 

See in 2 minutes and 29 seconds how simple data management can be with DataRocket – clearly explained!

Functions

DataRocket comprises four modules: the Cockpit, Designer, Analyzer and Browser.

From the Cockpit, you get an overview of what is happening. You can see how data quality has changed over the past few days and follow recent changes.

  • Overview of workflow coordination for data quality improvement measures (e.g., approval processes and quality gates)
  • Individual reports on data quality across all data sources and quality criteria
  • Long-term measurement of success through a system of definable indicators
  • Activity and change history (audit trail) for all activities
  • Access to the user management system (management of roles and rights)

 

With the Designer, you can choose your data model, create links between data sources and define your policy for measuring data quality.

  • Easy creation of data models, either generated automatically or chosen from a pool of reference models
  • Connect data (import/export) to hub interfaces
  • Define relationships between data sources
  • Assign datasets from different data sources (e.g., choosing leading attributes, data pre-processing)
  • Define the overall policy for securing data quality
  • Apply individual quality criteria in the editor by combining components using the Lego Principle (a minimal sketch of this idea follows the list)
  • Freely configurable computation paths for complex quality criteria (data pipelines)
  • Select and integrate external reference data sources for data validation/enrichment (e.g., addresses, phone numbers, open data)
  • Apply industry standards (e.g., eClass or ISO 8000) as templates
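
To illustrate the component idea, here is a minimal sketch of how a quality criterion could be composed from small, reusable checks. The names and structure (not_empty, matches_length, QualityCriterion) are hypothetical and do not represent DataRocket's actual API:

```python
# Hypothetical sketch of the "Lego Principle": a quality criterion is
# composed from small, reusable check components. Names and structure
# are illustrative only, not DataRocket's actual API.
from dataclasses import dataclass
from typing import Callable, Dict, List

Record = Dict[str, str]
Check = Callable[[Record], bool]

def not_empty(field: str) -> Check:
    """Component: the given field must be filled."""
    return lambda record: bool(record.get(field, "").strip())

def matches_length(field: str, length: int) -> Check:
    """Component: the given field must have an exact length (e.g., a country code)."""
    return lambda record: len(record.get(field, "")) == length

@dataclass
class QualityCriterion:
    """A criterion is simply a named combination of components."""
    name: str
    checks: List[Check]

    def passes(self, record: Record) -> bool:
        return all(check(record) for check in self.checks)

# Compose a criterion for address completeness out of individual components.
address_complete = QualityCriterion(
    name="Address completeness",
    checks=[not_empty("street"), not_empty("city"), matches_length("country_code", 2)],
)

record = {"street": "Main St. 1", "city": "Berlin", "country_code": "DE"}
print(address_complete.passes(record))  # True
```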

In the Analyzer, you can track how your data pipelines (quality criteria) evaluate over time. You define your Golden Records (defined datasets) and set target values for data quality using measurable key figures (a sketch of such a key figure follows the list below).

  • Analyze the databases using the defined quality criteria
  • Identify duplicates and consolidate them
  • Identify and cleanse inconsistent datasets
  • Update databases
  • Data validation/plausibility checks
  • Export datasets and quality reporting
  • Improve data in real time or in batches
  • Assess data quality over time
  • Display the monetization potential (e.g., by integrating activity-based costing)
  • Workflow-supported cleansing tool
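
As an illustration of what a measurable key figure can look like, here is a minimal sketch that computes the share of complete records and compares it with a target value. The field names and the 95% target are assumptions, not part of DataRocket:

```python
# Hypothetical sketch of a measurable key figure: the share of records
# that pass a completeness check, compared against a target value.
# Illustrative only; field names and the 95% target are assumptions.
from typing import Dict, List

def completeness_ratio(records: List[Dict[str, str]], required_fields: List[str]) -> float:
    """Key figure: fraction of records in which every required field is filled."""
    complete = sum(
        1 for r in records if all(r.get(f, "").strip() for f in required_fields)
    )
    return complete / len(records) if records else 1.0

records = [
    {"name": "Screw M4", "unit": "pcs", "price": "0.04"},
    {"name": "Washer", "unit": "", "price": "0.02"},      # incomplete record
    {"name": "Nut M4", "unit": "pcs", "price": "0.03"},
]

ratio = completeness_ratio(records, ["name", "unit", "price"])
print(f"Completeness: {ratio:.0%} (target: 95%)")  # Completeness: 67% (target: 95%)
```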

In the Browser, you have a 360° overview of your data. Search and filter your data in real time.

  • 360° view of your data and data quality
  • Search and edit across your databases from a centralized location
  • Filter and visualize your databases
  • Transfer reports to collaborative environments (e.g., SharePoint)
  • Dynamic visualizations of data (e.g., heat maps, visualization on a map)
  • Excel-like views and input masks for processing data

DataPipeline

Analyze your data quality with predefined components – step by step in our DataPipeline!

The DataPipeline is a process- and component-based function that allows you to define, measure and control data quality. Using a unique method, it improves your data quality step by step. You can go directly to the errors and, over time, measure and observe the effect of your data quality improvement activities. Component by component, you build up your own quality pipeline.


We show how the DataPipeline works with a simple example:

Step 1: Choice of database: here, we have a database of 10 product data sets named “Components”.
Step 2: Filter by category: out of the 10 entries, 2 are excluded. The result: 8 filtered product data sets.
Step 3: Recognition of duplicates: a duplicate is identified and two data sets are consolidated. Data volume: 7 product data sets.
Step 4: Definition of quality attributes: here, we check for completeness. 2 data sets do not meet the criteria – 5 entries are complete.
Step 5: Calculation of key figures, e.g., order values based on deficient data.
Step 6: Display of the results using the diagram component.
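
The following minimal sketch reproduces these six steps in code. The component names, fields and sample data are purely illustrative and not DataRocket's actual API; the counts mirror the example above (10 → 8 → 7 → 5 complete):

```python
# Hypothetical end-to-end sketch of the six pipeline steps above.
# Component names, fields and data are illustrative, not DataRocket's API.
from typing import Dict, List

Record = Dict[str, str]

# Step 1: choice of database - 10 product data sets named "Components".
components: List[Record] = [
    {"id": "P01", "category": "screw",  "name": "Screw M4", "price": "0.04"},
    {"id": "P02", "category": "screw",  "name": "Screw M5", "price": "0.05"},
    {"id": "P03", "category": "tool",   "name": "Wrench",   "price": "9.90"},
    {"id": "P04", "category": "screw",  "name": "Screw M6", "price": ""},
    {"id": "P05", "category": "nut",    "name": "Nut M4",   "price": "0.03"},
    {"id": "P06", "category": "nut",    "name": "Nut M4",   "price": "0.03"},  # duplicate of P05
    {"id": "P07", "category": "nut",    "name": "Nut M5",   "price": ""},
    {"id": "P08", "category": "tool",   "name": "Hammer",   "price": "12.50"},
    {"id": "P09", "category": "screw",  "name": "Screw M8", "price": "0.07"},
    {"id": "P10", "category": "washer", "name": "Washer",   "price": "0.02"},
]

# Step 2: filter by category - the 2 "tool" entries are excluded, 8 remain.
filtered = [r for r in components if r["category"] != "tool"]

# Step 3: recognition of duplicates - records with the same name are consolidated, 7 remain.
deduplicated: Dict[str, Record] = {}
for r in filtered:
    deduplicated.setdefault(r["name"], r)
consolidated = list(deduplicated.values())

# Step 4: quality attribute "completeness" - 2 records lack a price, 5 are complete.
complete = [r for r in consolidated if r["price"].strip()]

# Step 5: calculation of a key figure, e.g., the share of deficient data.
deficiency_rate = 1 - len(complete) / len(consolidated)

# Step 6: hand the results to a diagram component (here: simple console output).
print(len(components), len(filtered), len(consolidated), len(complete))  # 10 8 7 5
print(f"Deficiency rate: {deficiency_rate:.0%}")                         # Deficiency rate: 29%
```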

 

Values of DataRocket

Good data quality helps your business succeed! DataRocket makes sure you have high quality data!

Improving data quality is directly linked to how successful and efficient your business is. Your data is the central source of information in your company, and its quality has a direct influence on your success. DataRocket is the simplest software solution for improving data quality professionally in your company.

Data sets can easily be merged, analyzed and cleansed, no matter where the data originates.

Thanks to its simple operation and innovative concept, DataRocket delivers the best results for data quality across the whole company.

Values:

  • Increased productivity through an optimized, error-free, integrated database
  • Security of investment and reduction in capital intensity
  • Improved customer satisfaction
  • Improved work environment and staff satisfaction
  • Better-substantiated decisions that produce stronger results
  • Improved collaboration within and among different departments and locations

Advantages:

  • Simple data migrations
  • Continuous duplicate cleansing
  • Easy analyses of your databases
  • Better evaluations and reports
  • Fast implementation of new systems and standards, or organizational technology (e.g., ITIL)
  • Simple and accurate (cost) controlling
  • Development of knowledge management (policy and testing criteria for data management)

Interfaces

Choice of existing interfaces and Web Service connectivity for DataRocket

IT Applications

  • SAP (ERP+BW)
  • Update CRM
  • SugarCRM
  • JIRA
  • IBM SM7

Databases

  • MySQL
  • SQLite
  • Oracle
  • Microsoft SQL Server
  • MongoDB

Office Applications

  • Excel
  • Access
  • CSV files
  • XML

Social Media + Open Data 

  • Facebook
  • XING
  • LinkedIn
  • Wikipedia
  • Google+
  • OpenStreetMap

 

We offer further interfaces through our integration partners. We are also happy to create solutions for client-specific systems or for use with our clients' own applications.

Prices

Our licensing model is simple! You can rent DataRocket by the month or make a one-off purchase. With both options, you receive the following services:

Access to all modules:

  • Cockpit
  • Designer
  • Analyzer
  • Browser

Unlimited number of:

  • Users
  • DataSources
  • DataPipelines
  • Analyses
  • Reports
  • Updates

As an option, you can choose the Enrichment Service and get access to the reference data models. The monthly rent includes a service fee for software care and maintenance.

You can enquire about pricing for DataRocket by speaking to Mr. Dieter Zimmerman.

Security

Safety First: The security of your data is our top priority.

Whether you use DataRocket in the cloud or on premises, the security of your sensitive data always comes first. We process data under the latest enterprise layered security, and it is transferred only in encrypted form. Cloud installations of DataRocket run on our own dedicated servers in a high-security data center in Germany. We do not store our clients' data on any servers outside the country.


 

Live Demo

Get an idea of how DataRocket works in a free, 30-minute live demo.
