Master Data Management Solution Offering

From MIKE2.0 Methodology

This Solution Offering currently receives Major Coverage in the MIKE2.0 Methodology. Most required activities are provided through the Overall Implementation Guide and SAFE Architecture, but some Activities are still missing and there are only a few Supporting Assets. In summary, aspects of the Solution Offering can be used but it cannot be used as a whole.
A Creation Guide exists that can be used to help complete this article. Contributors should reference this guide to help complete the article.


Introduction


Note: This page covers Master Data Management (MDM) and Customer Data Integration (CDI), which is considered a subset of MDM.

Master Data Management (MDM) is a framework of processes and technologies aimed at creating and maintaining an authoritative, reliable, sustainable, accurate, and secure data environment that represents a “single version of truth,” an accepted system of record used both intra- and inter-enterprise across a diverse set of application systems, lines of business, and user communities. (Alex Berson and Larry Dubov, Master Data Management and Customer Data Integration for a Global Enterprise, McGraw-Hill, 2007)

The Reference/Master Data Management Solution Offering (sometimes called Reference/Master/Indicative Data Management) provides an approach for managing master data and its complementary reference data. Generally speaking, each industry and enterprise decides which entities are to be treated as enterprise master entities and handled as such. Typically, entities such as customer, product, employee, locality and partner data, together with complementary reference data such as product codes, country codes and foreign exchange rates, are considered master data.

The approach in this Solution Offering can be applied to any environment where this data must be shared or integrated across many systems; it can be implemented using a centralised model or the highly federated architecture that is more typical of enterprise implementations. Key enabling techniques include real-time integration, customer and address matching, hierarchy management, integrated Operational Data Store design, integrated Data Hub products and solutions, and external data enrichment vendor solutions for customer, product and other data. The MDM approach in MIKE2.0 combines a broad and holistic MDM vision with the depth required to address solution specifics such as MDM variants and Data Hub architecture styles.

Customer Data Integration (CDI) is the most important variant of MDM. Many MDM initiatives begin with a customer (CDI) focus and then expand to include other corporate entities. Customer and address matching is at the heart of CDI. The term "Customer" is sometimes replaced with the more generic term "Party", to which CDI technologies apply equally. Indeed, CDI techniques apply to any entities representing individuals (e.g. Customer, Contact, Patient, Citizen, Medical Professional) or organisations (e.g. Companies, Funds, Trusts, Institutions).

Executive Summary

Despite being a term that has only reached popularity in the last few years, Master Data Management (MDM) is really a new take on an old problem.

Managing master data such as customer, product, employee, locality and partner data has always been a major challenge – ever since organisations first tried to share or integrate data across systems. It has now reached the point, however, where the problem is front of mind for many senior stakeholders. The master data management marketplace has grown significantly since 2002 and is predicted to continue to grow aggressively over the next five years. Organisations are spending very large amounts of money on their Master Data Management programmes and want to ensure their investment is sound.

Much of the Data you hold is Master Data

Although data is often looked at on a transactional basis, master data typically makes up a large percentage of the data elements in any given transaction. Examples of master data include:

  • Customer data (name, contact details, DOB, customer classification)
  • Locality data (physical address, postal address, geographical data)
  • Product data (item number, bill of materials, product codes)
  • Employee data (employee number, role, placement in organisational structure)
  • Partner data (partner name, classification)

It is not unusual for this same data to be held in dozens or even hundreds of applications across a large organisation. Much of it has been held in legacy systems for years and may be poorly integrated and of low quality. Many organisations also have poorly implemented Data Governance processes to handle changes to this data over time.

Why MDM is now a Mainstream Issue

We see the following primary reasons why MDM is now a mainstream issue:

  • The first is the business impact of MDM issues. What is a business without its customers, its products and its employees? Master data is some of the most important data an organisation holds, and there is no choice but to fix the issues of the past; even minor issues with master data cause viral problems when propagated across a federated environment. Recognition that enterprise MDM defines competitive advantage has grown significantly in the last decade.
  • Another reason is ever-increasing complexity and globalisation. Master Data Management goes right to the heart of the drivers for an Information Development approach. Organisations are becoming increasingly federated, with more information and integration globally than ever before, and reducing this complexity is essential to a successful approach. Globalisation has led to a variety of additional data management problems and complications, including multi-lingual and multi-character-set issues and the 24x7 data availability needs driven by global operations. The number of channels through which enterprises receive and provide information has also grown significantly with the recent evolution of the Internet and voice recognition technologies.
  • Finally, all sides see a major opportunity. MDM is a big, complex problem and is therefore an opportunity for product vendors and systems integrators. New MDM technologies referred to as MDM data hubs have been developed; even though these hubs may resemble their predecessors, Operational Data Stores (ODS), modern data hub technologies are SOA-enabled and leverage a number of other modern technologies not commonly used by the traditional ODS. As the problem is an information management problem, every information management vendor has a “solution”. Application-centric vendors (which started the MDM trend) also see this as a major opportunity to expand their integration and application scope. Organisations with MDM issues are taking a variant of the same approach: they face a variety of challenges in the information management space, and MDM provides them with a collective way to frame the problem. This situation is similar to that which arose with compliance initiatives a few years ago.
  • Compliance initiatives driven by the War on Terror and corporate scandals in the US have put additional pressure on enterprises. Without a sound MDM solution, enterprises face increasingly difficult problems in supporting evolving regulatory requirements.

The MDM problem is therefore real, the marketplace momentum is significant and all sides are on board, yet only a small percentage of enterprises globally have implemented mature MDM solutions. So what needs to be done?

Solving the MDM Problem is Not Easy

MDM is inherently challenging. Technology alone will not solve the problem – most of the root-cause issues are process- and competency-oriented:

  • Organisations typically have complex data quality issues with master data, especially with customer and address data from legacy systems
  • There is often a high degree of overlap in master data, e.g. large organisations storing customer data across many systems in the enterprise
  • Organisations typically lack a Data Mastering Model defining primary masters, secondary masters and slaves of master data, which makes integration of master data complex
  • It is often difficult to come to a common agreement on domain values that are stored across a number of systems, especially product data
  • Poor information governance (stewardship, ownership, policies) around master data leads to complexity across the organisation
[Figure: MDM Ecosystem]

For each of these areas, organisations often require a major shift from the approach they have taken in the past. Without a well-defined approach, an organisation can expect the same issues it has faced on other large, complex programmes that involve significant integration and information management.

The figure above illustrates that enterprise MDM-CDI solutions are multidisciplinary and shows the disciplines and areas that must be addressed to ensure a holistic approach to enterprise MDM-CDI.

A holistic approach is required to successfully implement these initiatives.

Solution Offering Purpose

This is a Core Solution Offering. Core Solution Offerings bring together all assets in MIKE2.0 relevant to solving a specific business and technology problem. Many of these assets may already exist and as the suite is built out over time, assets can be progressively added to an Offering.

A Core Solution Offering contains all the elements required to define and deliver a go-to-market offering. It can use a combination of open, shared and private assets.

Solution Offering Relationship Overview

The Master Data Management Solution Offering is part of the EDM Solution Group.

The MIKE2.0 Solution Offering for Master Data Management describes how the Activities and Supporting Assets of the MIKE2.0 Methodology can be used to deliver successful solutions for managing common master data across a number of systems in the enterprise.

MIKE2.0 Solutions provide a detailed and holistic way of addressing specific problems. MIKE2.0 Solutions can be mapped directly to the Phases and Activities of the MIKE2.0 Overall Implementation Guide, providing additional content to help understand the overall approach.

The MIKE2.0 Overall Implementation Guide explains the relationships between the Phases, Activities and Tasks of the overall methodology as well as how the Supporting Assets tie to the overall Methodology and MIKE2.0 Solutions.

Users of the MIKE2.0 Methodology should always start with the Overall Implementation Guide and the MIKE2.0 Usage Model as a starting point for projects.

Solution Offering Definition

In most large organisations, managing master data is a very complex problem. The MIKE2.0 Information Development approach can help implement the necessary competencies to address these issues in a systematic fashion. An end-to-end approach is followed that goes from business strategy to ongoing operational improvement, and it involves major changes to people, process, organisation and technology across the comprehensive programme.

The MIKE2.0 Solution for Master Data Management provides techniques that can be used to help solve a very complex problem around integrating federated information. It defines the strategic architectural capabilities as well as the high-level solution architecture options for solving different MDM challenges. It then moves into the set of required Foundation Activities, Incremental Design, and Delivery steps.

Overview of the MIKE2.0 Solution for Master Data Management

MIKE2.0 contains the comprehensive set of activities and tasks that are required to see an MDM programme through from strategy to implementation. Key aspects of MIKE2.0 that are applied include:

  • Definition of the business and technology blueprint, which defines strategic requirements and plans the implementation programme over multiple increments
  • Definition of strategic and tactical architecture frameworks, using the SAFE Architecture as a starting point
  • Documentation of strategic technology requirements for product selection
  • An organisational model to focus specifically on Information Development
  • An approach to improve Data Governance across the environment

This comprehensive approach goes across people, process, organisation and technology and is driven by an overall strategy.

Key Aspects of the MIKE2.0 Solution for Master Data Management

There are different architectural models and techniques used for MDM solutions. The MIKE2.0 Overall Implementation Guide presents the set of Activities that would generally apply across different architectures. Logical Architecture, Design and Development Best Practices and complementary Physical Techniques will be built out over time to support different approaches that can be used.

There are some features to the MIKE2.0 MDM Solution that may differentiate it from other approaches for MDM. These include:

Use of a Common Approach for all Master Data

It is often suggested that a separate Master Data solution is required for each type of data. The MIKE2.0 approach instead recommends trying to provide a single, holistic solution to this issue.

This is not to discount specialty solutions related to Customer or Product Data Integration, but to state that integration and information management should be the key enablers, even with more focused, application-centric MDM solutions. A holistic MDM approach encompasses all business entities and information classes comprising MDM, while different tools, technologies and data hub architecture styles can be used across the enterprise to meet business requirements.

Take the Opportunity to Enhance Business Intelligence, Improve Business Processes and Applications

MDM solutions are often perceived by business and executive management as significant and costly, purely infrastructural improvement efforts lacking well-defined, tangible business benefits. To avoid this, organisations should align this work with other initiatives that improve business processes, business intelligence, reporting and analytics, help reduce the administrative overhead caused by redundant data entry, and provide other demonstrable benefits.

An integrated Operational Data Store or a Data Hub, for example, can be used as a component of the staging area for the analytical warehouse as well as a hub between co-existent applications. This allows new business functionality to be delivered on the analytical side in parallel with new operational functionality.

Data Quality Improvement requires more than Technology

Even a very sophisticated MDM technology solution cannot resolve data quality issues if proper standards and governance procedures are not in place. The MIKE2.0 Solution for Master Data Management works in conjunction with the solutions for Data Investigation and Re-Engineering and Data Governance to address historical issues, prevent new data quality issues from occurring where possible, and provide an enterprise exception-processing framework for efficient data processing management.

Overview of MDM Techniques

Migration of Historical Data

An MDM programme will typically involve a migration of historical data across systems, into or through a centralised hub. This is where many of the data quality issues are resolved in a progressive fashion before operationalising some of these rule-sets for the ongoing implementation.

Migration of historical data for MDM

The MIKE2.0 Solution for Data Migration – Medium Migration provides a summary of the key activities that should be followed when migrating historical data. It is important to align this work with any ongoing integration of master data; therefore, interfaces, data re-engineering and the approach to metadata management should closely align across both.

Application Co-Existence

Master Data Management typically involves providing a solution for application co-existence that allows multiple systems to run in parallel and share common master data. The integration framework is formulated so that the current state and future state can work together.

Application Co-Existence for MDM

MDM requires a comprehensive strategy that develops a vision for people, process, organisation and technology. Implementation may be conducted over a multi-year programme and involve a large number of stakeholders and significant technology changes.

The solution architecture for MDM is sophisticated, as it typically involves several systems continuing to provide data movement on an ongoing basis in a co-existent application scenario. Data Quality improvement covers both an initial batch process and an ongoing capability; significant data mapping and rationalisation is needed across the multiple systems in the architecture.

Common MDM Architecture Styles

One of the key design decisions for an MDM implementation is which data hub architecture style best fits a particular initiative. The most common architecture styles for data hubs are:

  • External Reference or Consolidation

The Data Hub maintains references or pointers to all master data (customer, product, etc.) records residing in the external systems. The Hub does not contain the master data itself.

  • Registry

In addition to references to external data sources, the Data Hub contains a minimal set of attributes of the actual data. Even for these attributes the Hub is not the primary master of record, but it provides search and inquiry services.

  • Reconciliation Engine or Coexistence

Whereas the first two architecture styles above merely link the master data records residing in multiple source systems, this style hosts some of the master entities and attributes as the primary master. It supports active synchronisation between itself and the legacy systems for these attributes, and serves as the 'best version of the truth'.

  • Transaction Hub

This architecture style hosts all master data, or a significant portion of it, and is the primary master for this data. It holds up-to-date transaction data in real time and presents it back to the business applications as the system of record in an SOA style.

The level of solution sophistication, the scale of the data migration effort, and the overall project risk and complexity grow from the "slim" architecture styles (External Reference and Registry) to the full-scale, most invasive Transaction Hub implementation, where the system of record is migrated from the source systems to the Hub. Organisations sometimes begin with a "slim" version of the Hub that later evolves into a Reconciliation Engine and ultimately into the Transaction Hub design.
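
As an illustration of how these styles differ in what the hub itself stores, the following minimal Python sketch contrasts a Registry-style entry with a Transaction Hub record. The class and field names are assumptions made for this example, not part of the MIKE2.0 assets.

  # Illustrative sketch only: hypothetical classes contrasting what a
  # Registry-style hub stores with what a Transaction Hub stores.
  from dataclasses import dataclass, field
  from typing import Dict, List

  @dataclass
  class SourceReference:
      system: str      # e.g. "CRM" or "ERP"
      local_key: str   # the record's key in that source system

  @dataclass
  class RegistryEntry:
      # Registry style: cross-references plus a minimal set of search attributes;
      # the hub is not the primary master of record for these attributes.
      master_id: str
      search_attributes: Dict[str, str]
      references: List[SourceReference] = field(default_factory=list)

  @dataclass
  class TransactionHubRecord:
      # Transaction Hub style: the hub holds the complete, up-to-date record
      # and is the primary master (system of record) for it.
      master_id: str
      attributes: Dict[str, str]
      references: List[SourceReference] = field(default_factory=list)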

Relationship to Solution Capabilities

Relationship to Enterprise Views

Master Data Management will typically relate to the Enterprise Views in the following fashion:

  • Application Development of any new master system or changes to existing masters. It is possible that very few changes will be required from an application perspective and that the primary focus is better integration of existing systems. It is also possible that the project may involve significant application implementation such as the implementation of a customer or product master system.
  • Information Development for modelling, data investigation, data re-engineering and metadata management. Master Data Management also requires advanced Information Development capabilities.

Other than the initial definition of an Application Portfolio and high-level business processes for scoping, the focus of MIKE2.0 is across the Technology Backplane of Information Development and Infrastructure Development. Methods external to MIKE2.0 should be used for Application Development of any application store that holds master data and, to an extent, for Infrastructure Development.

Mapping to the Information Governance Framework

Practically all concepts of the Information Governance Solution are used as a model for Master Data Governance.

During the Strategy Blueprint (Phase 1), the baseline is established with the organisational QuickScan and the appropriate organisational model is selected based on the current and target maturity levels.

In the following phases the governance model is developed and implemented.

Mapping to the SAFE Architecture Framework

Sophisticated capabilities are typically needed to fulfill MDM application co-existence requirements. The first step is to get Foundation Capabilities in place across the architecture before progressively moving to more sophisticated capabilities such as a Services Oriented Architecture. Enabling Technologies are important to smooth the transition to these more advanced techniques. A number of artefacts from the SAFE Architecture can help define the required component capabilities as part of a comprehensive approach. Business-oriented functionality such as Rules Engines and Business Process Management may also be used.

Mapping to the Overall Implementation Guide

This section describes the five phases of the MIKE2.0 methodology as they apply to an MDM implementation. BearingPoint's MDM Xpress, a toolkit for practitioners aligned to the five phases, is described first.

Following the overview of MDM Xpress, each phase of the MIKE2.0 methodology is mapped to the specific activities pertaining to MDM implementation.

Some content in this section is available for internal users of BearingPoint only.

Overview of MDM Xpress

BearingPoint has developed an internal toolkit called MDM Xpress, designed to assist in the sale and delivery of Master Data Management (MDM) solutions to clients worldwide.

The purpose of the toolkit is to accelerate sales and delivery cycles by:

  • Providing a best-of-breed toolkit consisting of templates and a solution guide for practitioners
  • Supporting the go-to-market strategy for the MDM solution
  • Helping accelerate the MDM implementation lifecycle

The toolkit supports the activities associated with the following processes by providing collateral and content, thereby assisting practitioners and business development personnel:

  • Sales Activities - Pre-sales discussions, RFP/RFI work and SOW preparation
  • Strategy assessment - Content around MDM strategy, templates & decision points that can be used in a strategy engagement
  • Delivery - Several design components for business and technology architecture, and use cases to be used in MDM implementation and delivery


The toolkit is well aligned to the five phases of the MIKE2.0 methodology.

Business Assessment and Strategy Definition Blueprint (Phase 1)

The Master Data Management Strategy consists of delivering a future-state roadmap for clients requiring an MDM solution.

The scope includes:

  • Engagement Management – project management, change management, quality management and risk management
  • Current State Assessment - using QuickScan approach
  • Business Requirement Analysis - workshops with key client stakeholders to identify business issues with master data
  • Technical Requirement Analysis - workshops with key client stakeholders to identify technology issues that hamper the delivery of accurate and reliable master data to consumers
  • Gap Analysis - based on business and technical findings and requirements
  • Future State Recommendations - consisting of business, technology and data architecture
  • Roadmap Definition - for attaining the future state

MDM Xpress Approach


Specific activities include:

  • Conducting Current State Assessment
  • Identifying Future State Requirements
  • Performing Gap Analysis
  • Recommending Future State Architecture
  • Delivering Future State Roadmap (MDM Program)


Our MDM assessment revolves around five key Master Data Dimensions, which are foundational components of an MDM implementation. They are:

  • Governance - Ownership and accountability for master data processes and data elements
  • Data Management - Business process and master data procedures to create and manage master data
  • Quality Assurance - Data quality initiatives, establishment of data quality metrics, conducting quality audits
  • Architecture and Standards - Technology and data architecture issues, establishing data quality standards
  • Technology - Tools for the MDM architecture landscape


Benefits of this approach

  • Shorten the cycle time for current state analysis and future state development by an up-front assessment
  • Demonstrate that we cover all areas of the project’s scope, but establish priorities early in the process:
    • Assess the current process maturity levels along key MDM dimensions to determine organizational capability and to assess capacity for change
    • Determine the desired future state requirements
    • Define scope, resources and timelines for the MDM program
  • Reach a focused set of “Summary of Observations” or “Opportunities for Improvement” for the client to explore in greater detail as part of the MDM program implementation
  • Mitigate two major program/project risks:
    • Spending an extended amount of time assessing the current state
    • Missing business- or IT-specific areas of coverage

Technology Assessment and Selection Blueprint (Phase 2)

All activities from the Technology Blueprint phase of MIKE2.0 will be required for MDM, which may involve implementation of a number of new technologies.

In Phase 2 of MIKE2.0, a diligent approach is applied to establish the technology requirements at the level of detail needed to make strategic product decisions. Once these are in line with the overall business case, technology selection can take place during this phase. Before the selected technologies are implemented, standards related to the SDLC process are put in place and the initial baseline infrastructure is established.

Also in Phase 2, the Data Governance activities move from establishing the initial organisation to determining how it will function. The strategic set of standards, policies and procedures for the overall Information Development organisation is first established during this phase. The goal is to move to an Information Development Organisation that has established reporting lines into the other parts of the organisation from a management, architecture and delivery perspective.

The overall target of this phase is to bring the general strategy to a stage where specific implementation projects can be planned and brought to execution. In practical terms this means:

Governance: Roles and responsibilities are defined to a degree that the requirements for executive sponsorship, business involvement and IT involvement are clarified and approved.

Data Management: The processes for master data maintenance are designed and scoped to the degree that the functional requirements for architecture and technology are defined.

Architecture and Standards: A data inventory is established and scoped so that the in-scope data domains (e.g. customer, vendor, organisation data) are identified and classified for global, regional or local ownership. The relevant conceptual landscape (a producer-consumer map; a small sketch follows below) and basic data models are also defined. This is consolidated into the functional technology requirements.

Quality Management: Quality management processes are defined and quality measurements are specified to the point that KPI and reporting requirements are established for the technology and tool selection.

Tools and Systems: The existing system landscape is evaluated in terms of its relevance to the master data processes (source systems, consuming systems, etc.). This results in the integration requirements. The non-functional requirements (technology platforms, development standards, operations requirements) are also collected.
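
To make the producer-consumer map mentioned above concrete, the following is a minimal illustrative sketch in Python; the domain and system names are invented examples, not MIKE2.0 deliverables.

  # Minimal sketch (assumed structure) of a producer-consumer map for master
  # data domains: which systems create each domain and which systems consume it.
  producer_consumer_map = {
      "customer": {"producers": ["CRM"], "consumers": ["ERP", "Billing", "Data Warehouse"]},
      "vendor": {"producers": ["Procurement"], "consumers": ["ERP", "Accounts Payable"]},
      "product": {"producers": ["PLM"], "consumers": ["ERP", "eCommerce", "Data Warehouse"]},
  }

  # Example use: derive integration requirements by listing every consumer
  # that needs a feed for a given data domain.
  for domain, systems in producer_consumer_map.items():
      print(domain, "->", ", ".join(systems["consumers"]))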

All the requirements are brought together in the technology and tool selection activity in order to select the appropriate technology to support them.

The result of this phase is therefore a scope for the MDM initiative, with an updated (validated) business case based on a clear tool and technology recommendation.

Roadmap and Foundation Activities (Phase 3)

The purpose of the Roadmap and Foundation Activities is to define the scope and size of the individual work packages, or increments, for the MDM initiative and to put them onto a logical timeline. To achieve this, the results of Phase 2, mainly the high-level design deliverables from requirements gathering, are detailed further.

The Roadmap and Foundation Activities provide some of the most critical activities for reducing the risk of the MDM programme and providing an integrated conceptual design across multiple solution areas. Some of the key activities from MIKE2.0 include:

Metadata Management

Due to the complexity of the MDM effort, significant metadata artefacts are produced relating to data definition, business rules, transformation logic and data quality. This information should be stored in a metadata repository; getting this repository in place from the early stages of the project, as part of the Metadata Driven Architecture, is a key aspect of the architectural approach of MIKE2.0.

In practical terms this means that the data inventory, which during the previous phases may have existed as one or more project documents (spreadsheets, text documents, presentations), is transformed into a regular metadata repository. The scope and size of the metadata repository will depend on the maturity level of the organisation and the complexity of the overall enterprise landscape. If, for example, the overall enterprise architecture is centred around standard ERP software, the vendor's administration tools often contain metadata management models that can be utilised. In other cases, specific metadata tools or intranet solutions may be acquired or developed.
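
As a concrete illustration, the sketch below shows how a single entry of such a data inventory might be represented once it is moved out of spreadsheets into a simple repository structure; the field names are assumptions for illustration, not a MIKE2.0-prescribed schema.

  # Hypothetical sketch of one master data inventory entry as it might be
  # loaded from project spreadsheets into a simple metadata repository.
  from dataclasses import dataclass

  @dataclass
  class InventoryEntry:
      object_name: str      # e.g. "Customer"
      key_attribute: str    # e.g. "customer_number"
      source_system: str    # system-level description of where it is mastered
      definition: str       # business definition / numbering scheme
      ownership: str        # global, regional or local ownership
      steward: str          # governance role responsible for changes

  entry = InventoryEntry(
      object_name="Customer",
      key_attribute="customer_number",
      source_system="CRM",
      definition="Unique number assigned when a prospect becomes a customer",
      ownership="global",
      steward="Customer Data Steward",
  )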

The results of the Metadata Management activities for an MDM project, organised according to the five dimensions, are:


Technology and Systems

  • The master data objects are identified and described at system level
    Deliverable: Master Data Inventory

Architecture & Standards:

  • Key attributes (e.g. numbering schemes, identifiers, units) for the objects are identified and described
    Deliverable: Master Data key standards (part of master data inventory)

Governance:

  • There is a governance process defined for managing changes to metadata
    Deliverable: Process Description with Roles and Responsibilities
  • The data objects and key attributes are mapped to the defined governance roles and responsibilities (Governance), e.g. who is responsible for the definition of a sellable product?
    Deliverable: Updated Data Inventory with roles and responsibilities

Data Management:

  • Business Rules are established for the generation and transformation of key attributes throughout the data lifecycle (Processes), e.g. how is the assignment of a customer number managed in the transition from a prospective customer to a logistically and financially relevant customer?
    Deliverable: Master Data key concepts

Data Profiling

Data Profiling conducts an assessment of actual data and data structures and is focused on measuring data integrity, consistency, completeness and validity. As part of this process, data quality issues are identified at the individual attribute level, at the table level and between tables. Metadata such as business rules and mapping rules is identified as a by-product of this process. It is important to conduct Data Profiling early in the MDM programme to reduce the risk of project failure due to data quality issues. Data profiling rules are also operationalised to monitor data quality over time.

The data profiling activities are tightly intertwined with the metadata activities: without a definition of the objects in scope and their key attributes, there is no foundation for profiling. On the other hand, the profiling itself will uncover both the existing de-facto standards and the degree of adherence to them. This may seem a banality, but considering that any enterprise can be expected to have a sizable number of master data objects, with as many variations of their definitions as there are systems and business units, it becomes obvious that profiling without underlying concepts will be inefficient and will likely produce non-actionable results due to the sheer number of possible issues and findings.

As master data is always present in enterprises and systems – otherwise it would be impossible to operate any processes – profiling will uncover the more or less documented practices and standards that exist inside the enterprise. Because such standards have mostly grown organically, profiling also holds up a factual mirror to the adherence to these more or less formal standards.

A practical approach towards a combined metadata and profiling exercise follows a top-down, five-step iteration:

  1. Metadata: Conceptual definition of objects
  2. Profiling: Identify Systems, Tables, number of records, Keys, Table Structures
  3. Metadata: Conceptual definition of key attributes, Standard definition
  4. Profiling: Profiling of key attributes against standard.
  5. Metadata: review standards against practice, derive business rules for standards
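
The following minimal Python sketch illustrates step 4 above: profiling a key attribute against an assumed standard by measuring completeness and validity. The attribute name and the validation pattern are invented for the example.

  # Minimal profiling sketch, assuming records are plain dictionaries: measure
  # completeness and validity of a key attribute against a defined standard.
  import re

  def profile_attribute(records, attribute, pattern):
      total = len(records)
      populated = [r[attribute] for r in records if r.get(attribute) not in (None, "")]
      valid = [v for v in populated if re.fullmatch(pattern, str(v))]
      return {
          "records": total,
          "completeness": len(populated) / total if total else 0.0,
          "validity": len(valid) / len(populated) if populated else 0.0,
      }

  customers = [
      {"customer_number": "C-10001"},
      {"customer_number": "10002"},   # violates the assumed standard
      {"customer_number": ""},        # missing value
  ]
  print(profile_attribute(customers, "customer_number", r"C-\d{5}"))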

The main deliverables of the data profiling activities in the Roadmap and Foundation phase are:

  • Data Management: Practices for data profiling and data quality measurement
  • Architecture & Standards: Business rules for data standards
  • Technology & Systems: Queries and reports for data quality measurements
  • Quality Management: Data quality measurements and baselines

Data Re-Engineering

Data Re-Engineering is used to standardise, correct, match, de-duplicate and enrich data across systems. For most MDM efforts, some level of re-engineering is required, and it can make up a significant percentage of the work. The process for Data Re-Engineering typically follows the "80/20 rule", using an iterative software development lifecycle until the data reaches the level that provides the most business value. For historical data, this process often involves moving data into a staging area and re-engineering it in an iterative fashion before finally loading it into the production target. Some of these rules may be operationalised to run in an ongoing fashion.

The data re-engineering activities in an MDM project build directly on the results of the profiling and metadata management activities: once the target standard is identified and the gap to it is known from profiling, the re-engineering effort can be scoped. During the Roadmap and Foundation phase, data re-engineering mainly focuses on assessing and planning the re-engineering effort. Because data re-engineering can easily become a major effort, and because its impact on dependent data and downstream systems can be substantial, it is important to size and scope this work carefully. This may even include the evaluation and design of specific data cleansing and enrichment tools that need to be included in the roadmap and design phase.
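
As an illustration of the kind of rules being assessed and planned here, the sketch below shows a deliberately simplified standardisation and de-duplication pass in Python; real MDM matching would use fuzzy matching and survivorship rules, and the field names are assumptions.

  # Illustrative standardisation and de-duplication sketch; the rules are
  # simplified placeholders for the business rules defined during profiling.
  def standardise(record):
      rec = dict(record)
      rec["name"] = " ".join(rec["name"].split()).upper()   # trim whitespace, case-fold
      rec["postcode"] = rec["postcode"].replace(" ", "")
      return rec

  def deduplicate(records):
      # Naive match key: standardised name + postcode; the first occurrence survives.
      golden = {}
      for rec in map(standardise, records):
          key = (rec["name"], rec["postcode"])
          golden.setdefault(key, rec)
      return list(golden.values())

  source = [
      {"name": "Acme  Ltd", "postcode": "SW1A 1AA"},
      {"name": "ACME LTD", "postcode": "SW1A1AA"},
  ]
  print(deduplicate(source))   # one surviving golden record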

The deliverables of Data Re-engineering during the Roadmap and Foundation phase are:

Governance: Roles and responsibilities for Data cleansing, Data cleansing Guidelines

Data Management: Data Cleansing processes

Architecture & Standards: Standards and Business Rules for Data Cleansing and re-engineering

Quality Management: Definition of measurable Quality criteria for “Good data”

Technology and Systems: Requirements and evaluation of Data Cleansing and Enrichment tools

Data Modelling

The data modelling process is used to design the data stores of any new systems in the MDM environment. Sometimes an intermediary data store is built to bring data together from multiple systems in a hub fashion. This data store provides a common, integrated model in which data may undergo significant re-engineering.

Based on the selected MDM model (Transaction Hub, Registry, etc.), the data model is detailed further. An essential part of the Roadmap and Foundation activities is to consolidate the results of the metadata and profiling activities into a validated data model that serves as the baseline for scoping the design and implementation. Frequently the data model will be derived from a standard vendor model. While such an approach reduces the overall effort compared with designing a completely customised data model, the results of the metadata and profiling activities should be weighed carefully against any standard data model in order to assess the size and scope of its deviations from the real-life situation.
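
A simple way to make this assessment concrete is to compare the attributes of a candidate standard model with the attributes actually observed during metadata and profiling work, as in the minimal sketch below; the attribute names are invented for illustration.

  # Minimal sketch of assessing a candidate standard data model against the
  # attributes actually found during metadata and profiling activities.
  standard_model_attributes = {"customer_number", "name", "postcode", "country_code"}
  profiled_attributes = {"customer_number", "name", "postcode", "vat_number", "segment"}

  missing_from_model = profiled_attributes - standard_model_attributes   # model needs extension
  unused_in_practice = standard_model_attributes - profiled_attributes   # in model, not yet observed

  print("Extend model with:", sorted(missing_from_model))
  print("In model but not observed:", sorted(unused_in_practice))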

Enterprise Information Architecture

MIKE2.0 takes the approach of building out the Enterprise Information Architecture over time for each new increment that is implemented as part of the overall programme. The scope for building the Enterprise Information Architecture is defined by the in-scope Key Data Elements (KDEs) that must be integrated against different systems in the MDM architecture.

Solution Architecture Definition

Due to the complexity of the implementation, the Solution Architecture Definition/Revision must incorporate advanced techniques such as the definition of a Services Oriented Architecture and an integrated Operational Data Store. Some architectural techniques that may be employed, such as Active Metadata Integration, can have a significant impact on the overall development approach. The design for testing will also need to be sophisticated and should be incorporated into the Solution Architecture.

Prototype the Solution Architecture

Due to the solution complexity, Prototyping the Solution Architecture can be an effective mechanism for testing the conceptual design. For an MDM initiative it can be particularly effective for testing complex areas such as application co-existence, the functionality of the integrated Operational Data Store and ongoing Data Quality Monitoring.

Design Increment (Phase 4)

All design activities of MIKE2.0 will be required for MDM. The Data Integration process for MDM initiatives is generally complex enough to warrant a comprehensive design process that covers conceptual, logical and physical design. These integration components are then constructed during Technology Backplane Development.

A Services Oriented Architecture design should typically be employed due to the ongoing nature of the interfaces and business capabilities that will be built. It will involve the delivery of Interface Services, Business Services and Data Management Services, implemented in such a way that they are discoverable from a common repository.
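
As a hedged illustration only, the sketch below shows how such discoverable data management services might be expressed as a programming interface; the service and method names are assumptions, not MIKE2.0 asset names.

  # Hypothetical interface for a discoverable customer data management service.
  from abc import ABC, abstractmethod
  from typing import Dict, Optional

  class CustomerDataService(ABC):
      # A data management service that would be registered in a common repository.

      @abstractmethod
      def lookup(self, master_id: str) -> Optional[Dict[str, str]]:
          """Return the golden record for a master identifier, if known."""

      @abstractmethod
      def match(self, candidate: Dict[str, str]) -> Optional[str]:
          """Return the master identifier matching a candidate record, if any."""

      @abstractmethod
      def upsert(self, record: Dict[str, str]) -> str:
          """Create or update a golden record and return its master identifier."""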

The MIKE2.0 approach to MDM recommends that a Business Intelligence environment be built in parallel with the delivery of the operational systems integration. Therefore the Business Intelligence Design activities from the Overall Implementation Guide should be conducted at this stage.

Incremental Development, Testing, Deployment and Improvement (Phase 5)

Development of the Technology Backplane and Business Intelligence Application will occur during this phase. The earlier activities around Solution Architecture and Incremental Design feed directly into the build process.

Testing

Testing MDM solutions is very complex, and this complexity is commonly underestimated. In addition to Functional Testing, End-to-End Testing and UAT, System Integration Testing and SVT will also be required. Testing needs to be performed against historical data loads as well as against the ongoing feeds of data into the system.

Deployment and Final Verification

Data is loaded into the production system where some further data quality cleanup may be required. Production Verification Testing is conducted, which should also include functional testing of features that are environment specific. After testing is complete, the system is activated as a live production system. Ongoing activities will be required to monitor all systems in the architecture that share common data.

Continuous Improvement

The Continuous Improvement activities will be important for MDM due to the complexity of the programme and its long-running nature. Of particular importance will be the Continuous Improvement of Data Quality and Infrastructure.

Mapping to Supporting Assets

Strategy Blueprint Definition

To support Phase 1 of the Strategy Blueprint, a number of conceptual artefacts help define and clarify the MDM blueprint:

Logical Architecture, Design and Development Best Practices

A number of artefacts help support the MIKE2.0 Solution for MDM:

The following MIKE2.0 Solutions should also be referenced:

Complementary MIKE2.0 Solutions related to IT Transformation and Data Migration may also prove useful.

Product-Specific Implementation Techniques

IBM DataStage (formerly Ascential)

SAP Netweaver MDM

Product Selection Criteria

The product selection criteria are derived from three sets of requirements (a weighted scoring sketch follows the list):

  • Data-specific functional requirements: derived from the master data object, e.g. material, customer, supplier, employee, chart of accounts, etc.
  • General functional requirements: relevant for all master data objects; these include workflow capabilities, ID management capabilities, lifecycle status management capabilities, and so on.
  • Non-functional requirements: for these, the deliverable of the strategic non-functional requirements is used. To ensure the integration of the MDM products into the Enterprise Information Architecture, the non-functional requirements are essential to avoid a standalone solution.
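
One simple way to combine the three requirement sets in a selection is a weighted scoring, as in the illustrative Python sketch below; the weights, candidate names and scores are placeholders, not recommendations.

  # Illustrative weighted scoring of candidate products against the three
  # requirement sets; all figures are invented placeholders.
  weights = {"data_specific": 0.4, "general_functional": 0.35, "non_functional": 0.25}

  candidates = {
      "Product A": {"data_specific": 4, "general_functional": 3, "non_functional": 5},
      "Product B": {"data_specific": 3, "general_functional": 5, "non_functional": 4},
  }

  for name, scores in candidates.items():
      total = sum(weights[c] * scores[c] for c in weights)   # weighted total per candidate
      print(f"{name}: {total:.2f}")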

Estimating Project Complexity

Estimating project complexity for an MDM initiative is quite challenging, as there are many factors influencing the effort and risk. A complexity estimation model for MDM groups the dimensions of governance complexity, architecture complexity and data complexity. In addition, the complexity estimation model for data migration can be used to estimate specific master data objects.
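
As an illustration of the grouping described above, the sketch below combines the three complexity dimensions into a single indicator; the rating scale and the equal weighting are assumptions made for the example.

  # Simple sketch of a grouped complexity estimate; each dimension is rated
  # 1 (low) to 5 (high), and the result is an indicator, not an effort estimate.
  def mdm_complexity(governance: int, architecture: int, data: int) -> float:
      return (governance + architecture + data) / 3

  print(mdm_complexity(governance=4, architecture=3, data=5))   # -> 4.0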

Relationships to other Solution Offerings

Similar MIKE2.0 Solutions include:

  • The MIKE2.0 Solution for Data Migration provides an approach for conducting different types of migration programmes, from the simple to the complex. The solution approach for a complex, application co-existence scenario is similar to that for Master Data Management.

Used in conjunction with these Solutions, the MIKE2.0 Solution for Master Data Management provides an approach that can be used to help solve a very complex business problem.

Extending the Open Methodology through Solution Offerings

Listed below are proposed extensions to the Core MIKE2.0 Methodology to meet the requirements for Reference and Master Data Management:


Integration to Business Processes

The MDM Solution Offering is focused strictly on the management of information. A more complete solution offering for MDM might focus more extensively on the business process and application aspects of the solution. Examples include:

Product Data Management

For product or item data management, there are several key business processes that can be tied to the Master Data Management solution:

  • The first is the product lifecycle management process which is highly industry specific. 
  • A more generic item management process is the integration of the item master into the sourcing and procurement processes. 
  • A third key business process is the integration of product master data into sales and marketing processes, especially with a focus on managing the definition and distribution of the product offering towards channels and accounts via catalogues.

Vendor Data Management

  • Explicitly tie Vendor Master data management to the key business processes in sourcing and procurement. Key elements here are the harmonisation of vendor master data across the different procurement processes, such as strategic sourcing, supplier relationship management, purchasing and goods receipt, and accounts payable.
  • The combination of vendor master data management with the business intelligence solution offering results in business solutions for strategic supplier management, spend analysis and management.

Customer Data Management

  • The extension of the master data management solution for customer master data is described in the Customer Data Integration solution offering.

Employee Data Management

  • Explicitly tie the Employee data Management to the employee lifecycle process. 
  • In particular, focus on the integration aspects with the access, monitoring and control solution as described in the MIKE2.0 methodology for managing information assets.
  • Tying employee master data management to the employee lifecycle (hiring, transfer, promotion, separation) allows integration of the master data with concerns such as identity management, user provisioning and access control, and process control and compliance.

Potential Activity Changes
