Information Governance Solution Offering


This Solution Offering currently receives Major Coverage in the MIKE2.0 Methodology. Most required activities are provided through the Overall Implementation Guide and SAFE Architecture, but some Activities are still missing and there are only a few Supporting Assets. In summary, aspects of the Solution Offering can be used, but it cannot be used as a whole.
A Creation Guide exists for this article; contributors should reference it to help complete the remaining content.


Introduction


The Information Governance Solution Offering (IGSO) provides a comprehensive approach to improving Information Governance (IG): how information is managed across the organization. To this end, IGSO addresses staff skill sets, policies, procedures, processes, organizational structures, and technology. It is one of the main foundations of the MIKE2.0 Methodology.

Executive Summary

There are varying definitions of the term "Information Governance" (IG). There is broad consensus that IG includes Data Quality Management (DQM), but beyond that a consensus is difficult to reach, for three primary reasons:

  • Data Quality (DQ) is a complex topic involving far more than just the accuracy of data. DQ is typically measured across seven quantitative dimensions and a number of qualitative ones. What's more, composite data management issues such as referential integrity problems could also be considered data quality issues; a minimal profiling sketch follows this list.
  • Data Quality Management (DQM) involves more than just addressing historical data quality issues through data profiling and re-engineering. It involves being proactive--i.e., preventing these issues from occurring in the first place. Issue prevention is complex and sometimes involves changes to source systems, business processes, and technology. Finally, what constitutes an issue for some systems or users may not present a problem for others.
  • Data Governance (DG) is about much more than managing DQ. DG often encompasses a broader collection of best practices around the management of information. These include the ability to secure data, to provide real-time access to data, and to deal with complex integration issues. Organisational efficiency and agility also fall under the DG umbrella.
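To make the first point concrete, the minimal Python sketch below scores a few quantitative DQ dimensions, including a simple referential integrity check. The field names, sample records, and validity rules are invented for illustration; this is a sketch, not a MIKE2.0 deliverable.

    # Minimal sketch: scoring a few common data quality dimensions on in-memory
    # records. The dimension names, sample data, and rules are illustrative
    # assumptions, not MIKE2.0-defined values.
    import re

    customers = [
        {"id": 1, "email": "ann@example.com", "country": "AU"},
        {"id": 2, "email": "", "country": "XX"},             # incomplete value
        {"id": 3, "email": "bob@example", "country": "NZ"},  # invalid format
    ]
    orders = [{"order_id": 10, "customer_id": 1}, {"order_id": 11, "customer_id": 9}]

    def completeness(rows, field):
        """Share of rows where the field is populated."""
        return sum(1 for r in rows if r.get(field)) / len(rows)

    def validity(rows, field, pattern):
        """Share of rows whose field matches an expected pattern."""
        return sum(1 for r in rows if re.fullmatch(pattern, r.get(field, "") or "")) / len(rows)

    def referential_integrity(child, fk, parent, pk):
        """Share of child rows whose foreign key resolves to a parent row."""
        keys = {r[pk] for r in parent}
        return sum(1 for r in child if r[fk] in keys) / len(child)

    print("email completeness:  ", completeness(customers, "email"))
    print("email validity:      ", validity(customers, "email", r"[^@]+@[^@]+\.[^@]+"))
    print("order -> customer RI:", referential_integrity(orders, "customer_id", customers, "id"))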

The MIKE2.0 Methodology embraces this broader definition of DG, ultimately referring to this overall approach as Information Development. We believe that organizations have traditionally not focused sufficiently on this area, and that many of the problems they face today stem from that neglect. MIKE2.0 allows for the implementation of a comprehensive DG programme designed to address the business problems that, at their core, are data management problems.

Why is a Comprehensive Approach Required for Data Quality Improvement?

[Figure: Comprehensive data governance programme]
Despite--or perhaps because of--the tremendous cost of DQ issues, most organisations are struggling to address them. We believe there are five primary reasons that they are failing:
  • Our systems are more complex than ever before. Many companies now hold more information than ever before, which requires greater integration. New regulations, M&A activity, globalisation, and increasing customer demands collectively mean that IM challenges are increasing--both in number and in complexity.
  • Silo-ed, short-term project delivery focus. Many projects are funded at a departmental level. As such, they typically don't account for the unexpected effects of how data will be used by others. Data flows among disparate systems--and the design of these connection points--must transcend strict project boundaries.
  • Traditional development methods do not place appropriate focus on data management. Many projects are focused more on functionality and features than on information. The desire to build new functionality--for the sake of new functionality--often results in information being left by the wayside.
  • DQ issues are often hidden and persistent. Lamentably, DQ issues can remain unnoticed for some time. Some end-users may suspect that the data in the systems on which they rely to make decisions is inaccurate, incomplete, out-of-date, invalid, and/or inconsistent, yet it is propagated to other systems as organisations increase connectivity. In the end, many organisations tend to underestimate the DQ issues in their systems.
  • DQ is not fit for purpose. Many DQ and IM professionals know all too well that it is difficult for end-users of downstream systems to improve the DQ of their systems. While the reasons vary, perhaps the biggest culprit is that the data is entered via customer-facing operational systems. The clerks who enter it do not have the same incentive to maintain high DQ; they are often focused on entering data quickly and without rejection by the system at the point of entry. Eventually, however, errors surface as data is integrated, summarized, standardized, and used in other contexts.

A comprehensive DG programme must be defined to meet these challenges.

Why is a New Competency Model Required?

Many organisations have struggled to meet these challenges for one fundamental reason: they fail to focus on the enterprise-wide nature of data management problems. They incorrectly see information as a technology or IT issue, rather than as a fundamental and core business activity. In many ways, information is the new accounting. Complex infrastructure and information issues cannot be tackled on a department-by-department basis.

Defining an enterprise-wide programme, while necessary, is also very difficult. Building momentum for these initiatives takes a long time, and it can easily lead to approaches that are out of sync with business needs. Attempts to enforce architectural governance, for example, can quite easily become ineffectual--a "toothless watchdog" that provides little value.

Organisations require an approach that can address all of the inherent challenges of

  • a federated business model
  • an often complex technology architecture

Fundamentally, this approach should be both effective and conducive to innovation. Admittedly, this is not an easy task. This is the rationale for MIKE2.0 and the need for a new competency of Information Development.

Moving from Data Governance to Information Governance

Because IG is a relatively new field, practitioners have focused on what they know. Typically, this focus has been on structured data. Early adopters have admittedly been wary of the more ambiguous aspects of governance. These include: 

  • Information Lifecycle Management (ILM)
  • Accountability
  • Monitoring
  • Information Return on Investment (ROI) management

MIKE2.0 has initially focused on an improved approach to DG. The ultimate objective is to extend this solution more broadly to encompass other forms of content. Through this collaborative method, Contributors are encouraged to help develop this solution in an emerging area.

The target scope for this offering is defined in the Solution Offering Definition section.

Solution Offering Purpose

This is a Core Solution Offering (CSO) and it brings together all assets in MIKE2.0 that pertain to solving a specific business and technology problem. Many of these assets may already exist. What's more, as the suite is built out over time, assets can be progressively added to an Offering.

A CSO contains all the elements required to define and deliver a go-to-market offering. It can use a combination of open, shared, and private assets.

The Information Governance Offering is also a Foundational Solution. Foundational Solutions are "background" solutions that support the Core Solution Offerings of the MIKE2.0 Methodology. They are the lowest-level assets within MIKE2.0 that are comprehensive in nature. They may tie together multiple Supporting Assets. Finally, they are referenced in the Overall Implementation Guide and via other Solution Offerings.

Solution Offering Relationship Overview

[Figure: MIKE2.0 solution groups]
The MIKE2.0 Solution Offering for Information Governance describes how the Activities and Supporting Assets of the MIKE2.0 Methodology can be used to deliver successful solutions for managing common master data across a number of systems in the enterprise.

MIKE2.0 Solutions provide a detailed and holistic way of addressing specific problems. They can be mapped directly to the Phases and Activities of the MIKE2.0 Overall Implementation Guide. To this end, they provide additional content to help users understand the overall approach.

The MIKE2.0 Overall Implementation Guide explains:

  • the relationships between the Phases, Activities, and Tasks of the overall methodology
  • how the Supporting Assets tie to the overall Methodology and MIKE2.0 Solutions.

Users of the MIKE2.0 Methodology should always start with the Overall Implementation Guide. The MIKE2.0 Usage Model also provides a good starting point for projects.

Solution Offering Definition

Listed below are the most important factors in a successful Information Governance (IG) programme. This definition provides the target scope for this solution offering, which also includes unstructured content.

  • Accountability. Because of the ways in which information is captured--and how it flows across the enterprise--everyone has a role to play in how it is governed. Many of the most important roles are played by individuals fairly junior in the organization. They typically play a key role at the data capture stage and often cause--or see--errors first-hand. Certain individuals need to be dedicated to IG. These roles are filled by senior executives such as the CIO, Information Architects, and Data and Content Stewards.
  • Efficient Operating Models. The IG approach should define an organizational structure that most effectively handles the complexities of both integration and IM across the whole of the organization. Of course, there will typically be some degree of centralization as information flows across the business. However, this organizational model need not be a single, hierarchical team. The common standards, methods, architecture, and collaborative techniques so central to IG allow this model to be implemented in a wide variety of models: physically central, virtual, or offshore. Organizations should provide assessment tools and techniques to progressively refine these new models over time.
  • A Common Methodology. An IG program should include a common set of activities, tasks, and deliverables. Doing so builds specific IM-based competencies. This enables greater reuse of artifacts and resources, not to mention higher productivity out of individuals. It also manifests the commonalities of different IM initiatives across the organization.
  • Standard Models. A common definition of terms, domain values, and their relationships is one of the fundamental building blocks of IG. This should go beyond a traditional data dictionary. It should include a lexicon of unstructured content. Defining common messaging interfaces allows for easy inclusion of “data in motion.” Business and technical definitions should be represented and, just as important, the lineage between them should be easy to navigate; a sketch of such an entry follows this list.
  • Architecture. An IM architecture should be defined for the current-state, transition points, and target vision. The inherent complexity of this initiative requires the architecture to be represented through multiple views, as in Kruchten's model. Use of architectural design patterns and common component models are key aspects of good governance. This architecture must accommodate dynamic and heterogeneous technology environments that, invariably, must quickly adapt to new requirements.
  • Comprehensive Scope. An IG approach should be comprehensive in its scope, covering structured data and unstructured content. It should also cover the entire lifecycle of information, from initial creation through integration across systems, archiving, and eventual destruction. This comprehensive scope can only be achieved with an architecture-driven approach and well-defined roles and responsibilities.
  • Information Value Assessment (IVA). Organizations (should) place a very high value on their information assets. As such, they will view their organization as significantly de-valued when these assets are unknown--or poorly defined. An IVA assigns an economic value to the information assets held by an organization. The IVA also measures how IG influences this value. It must also measure whether the return outweighs the cost, as well as the time required to attain this return. Current methods are particularly immature, although some rudimentary models do exist; industry models must greatly improve, much as they have over the past ten years in the infrastructure space.
  • Senior Leadership. Senior leaders must manage their information and deal with related issues. CIOs, for example, must face a host of business users who increasingly demand relevant, contextual information. At the same time, leadership teams often blame failures on "bad data." In the post Sarbanes-Oxley environment, CFOs are asked to sign off on financial statements. To this end, the quality of data and the systems that produce that data are being scrutinized now more than ever before. CMOs are being asked to grow revenues with fewer people. New regulations around the management of information have prevented many organizations from being effective. Senior leaders must work towards a common goal of improving information while concurrently appreciating that IM is still immature as a discipline. The bottom line is that there will be some major challenges ahead.
  • Historical Quantification. In the majority of cases, the most difficult aspect of IM can be stated very simply: most organizations are trying to fix decades of “bad behavior." The current-state is often unknown, even at an architectural or model level. The larger the organization, the more complex this problem typically becomes. Historical quantification through common architectural models and quantitative assessments of data and content are key aspects of establishing a known baseline. Only then can organisations move forward. For such a significant task, this assessment must be conducted progressively--not all at once.
  • Strategic Approach. An IG program will need to address complex issues across the organization. Improvements will typically be measured over months and years, not days. As a result, a strategic approach is required. A comprehensive program can be implemented over long periods of time through multiple release cycles. The strategic approach will allow for flexibility to change. However, the level of detail will still be meaningful enough to effectively deal with complex issues.
  • Continuous Improvement. It is not always cost-effective to fix all issues in a certain area. Sometimes, it is best instead to follow the “80/20 rule." An IG program should explicitly plan to revisit past activities. It should build on a working baseline through audits, monitoring, technology re-factoring, and personnel training. Organizations should look for opportunities to “release early, release often." At the same time, though, they should remember what this means from planning and budgeting perspectives.
  • Flexibility for Change. While an IG program involves putting standards in place, it must remain pragmatic and flexible about change. A strong governance process does not mean that exceptions can’t be granted. Rather, key individuals and groups need to know that exceptions are occurring--and why. The Continuous Improvement approach grants initial workarounds; these then have to be re-factored at a later point in order to balance short-term business priorities.
  • Governance Tools. Measuring the effectiveness of an IG program requires tools to capture assets and performance. Just as application development and service delivery tools exist, organizations need a way to measure information assets, actions, and their behaviors.
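As a concrete illustration of the Standard Models factor above, the sketch below shows one possible shape for a shared business term that carries both a business definition and its technical implementations, with navigable lineage between them. The class names, attributes, and example values are assumptions made for illustration only.

    # Minimal sketch of a "standard model" entry: a business term linked to its
    # technical representations so lineage is navigable. All names are invented.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class TechnicalElement:
        system: str          # e.g. source application or warehouse
        table: str
        column: str
        datatype: str

    @dataclass
    class BusinessTerm:
        name: str
        definition: str
        steward: str                                   # accountable Data/Content Steward
        allowed_values: List[str] = field(default_factory=list)
        implementations: List[TechnicalElement] = field(default_factory=list)

        def lineage(self):
            """Navigate from the business definition to its technical homes."""
            return [f"{t.system}.{t.table}.{t.column}" for t in self.implementations]

    customer_status = BusinessTerm(
        name="Customer Status",
        definition="Lifecycle state of a customer relationship.",
        steward="Retail Data Steward",
        allowed_values=["PROSPECT", "ACTIVE", "DORMANT", "CLOSED"],
        implementations=[TechnicalElement("CRM", "CUSTOMER", "CUST_STATUS", "CHAR(1)"),
                         TechnicalElement("EDW", "DIM_CUSTOMER", "STATUS_CODE", "VARCHAR(10)")],
    )
    print(customer_status.lineage())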

These capabilities are implemented across the five implementation phases.

Relationship to Solution Capabilities

This Solution Offering maps into the Solution Capabilities of MIKE2.0 as described below.

Relationship to Enterprise Views

The MIKE2.0 Solution for Information Governance covers all areas of Information Development, across people, process, organisation, technology, and strategy. It is a key enabler to delivering an Information Management competency and moving to the model of a sophisticated Information Governance Organisation.

Mapping to the Information Governance Framework

This Solution Offering provides the Information Governance Framework for MIKE2.0; it is both a Foundational Solution and a go-to-market offering. Foundational Solutions are used to support all Solution Offerings across the MIKE2.0 Methodology.

Mapping to the SAFE Architecture Framework

Developing a common architectural framework is an important part of a holistic approach to Information Governance. The SAFE Architecture provides a complementary Foundational Solution to the Information Governance Framework. Key aspects of the architecture in relation to this offering include:

  • A systematic approach that goes from the blueprint conceptual architecture to an incremental Solution Architecture
  • A detailed methodology for product selection, design, and construction
  • A standards-based, services-driven architecture
  • An approach that allows capabilities to be delivered progressively

Through this approach a consistent architecture can be defined and implemented over time that complements the people, process, and organisational aspects of Information Governance.

Mapping to the Overall Implementation Guide

A number of aspects make up the MIKE2.0 approach to improving Information Governance and the operating model for how it is delivered, crossing the five phases of the overall approach. The most critical activities and how they relate to improving Information Governance are described briefly below:

Business Assessment and Strategy Definition (Phase 1)

Improving Information Governance as part of MIKE2.0 is initiated from the onset of the programme, during the definition of the Business Blueprint. Phase 1 Activities for improving Information Governance include assessment of the current-state environment and establishing the initial Data Governance team.

Business Strategy for Overall Information Development

For Information Governance, the Overall Business Strategy for Information Development is required, but it is normally produced within the context of an Information Management Strategy engagement. If it has not been done at a strategic level, it should be done as part of this programme.

Organisational QuickScan

The Organisational QuickScan for Information Development is about quickly understanding the organisation's current environment for Information Governance and beginning to establish the vision for where it would like to go throughout the programme. Some of the key tasks within this Activity therefore involve capturing the current-state set of practices around Information Governance, which are often poorly documented. As MIKE2.0 uses a broad definition of Information Governance, this assessment process involves People, Process, Organisation, and Technology. QuickScan assessments are a core part of this activity as they not only provide a rich starter set of questions but also provide maturity guidelines for organisations. The gap between the current-state assessment and the envisioned future state gives an early indicator of the scope of the overall Information Governance programme.
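The fragment below is a minimal illustration of how such a gap can be summarised once current-state and future-state maturity scores have been captured. The dimensions, the 1-5 scale, and the scores themselves are assumptions for the example, not values prescribed by MIKE2.0.

    # Illustrative sketch: sizing the QuickScan gap per Information Governance
    # dimension. Dimensions, scale, and scores are assumed for the example.
    current = {"People": 2, "Process": 1, "Organisation": 2, "Technology": 3}
    target  = {"People": 4, "Process": 3, "Organisation": 4, "Technology": 4}

    gaps = {dim: target[dim] - current[dim] for dim in current}
    for dim, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{dim:13s} current={current[dim]} target={target[dim]} gap={gap}")
    # the largest gaps indicate where the programme scope should concentrate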

Data Governance Sponsorship and Scope

In order to conduct a successful Data Governance programme, it is important to have sponsorship at senior levels. Data Governance Sponsorship and Scope is focused on defining what this initial scope will be for improved Data Governance, based on the high-level information requirements and the results of the organisational assessment. This leadership team will play an ongoing role on the project.

Initial Data Governance Organisation

The Initial Data Governance Organisation is focused on establishing the larger Data Governance Organisation. Roles and Responsibilities are established and the overall Organisational structure is formalised. Communications models for Data Governance are also established, which become a critical aspect of issue resolution and prevention further down in the implementation process. The Data Governance Organisation that is established at this point will become more sophisticated over time. The continuous implementation phases of MIKE2.0 (phases 3,4,5) revisit organisational structure for each increment and there are specific improvement activities around moving to an Information Development Organisational model in Phase 5.

The Information Management Governance organisation should review all roles associated with Information Management, along with the required skills for each role and the activities performed by these roles. The Information Management Roles, Skills and Activities table shows typical roles associated with a generic Information Management setup. While the list of roles is extensive, it is not necessarily exhaustive; every organisation will have special requirements. In addition, a role does not equate to a unique physical person: one person can handle multiple roles, especially within smaller organisations, and a specific role can be performed by many people across the organisation. Skills and Activities associated with each role are also flexible, and each organisation can use the table as a guideline and then customize it to its own needs.
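The short sketch below simply illustrates the many-to-many relationship described above between people and Information Management roles; the names and role titles are invented for the example.

    # Hedged sketch: a person can hold several roles and a role can be held by
    # several people. Names and role titles below are invented.
    from collections import defaultdict

    assignments = [  # (person, role)
        ("A. Chen", "Data Steward"),
        ("A. Chen", "Metadata Administrator"),   # one person, multiple roles
        ("B. Singh", "Data Steward"),            # one role, multiple people
        ("C. Romero", "Information Architect"),
    ]

    roles_by_person = defaultdict(set)
    people_by_role = defaultdict(set)
    for person, role in assignments:
        roles_by_person[person].add(role)
        people_by_role[role].add(person)

    print(dict(roles_by_person))
    print(dict(people_by_role))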

Future State Vision for Information Management

Defining the overall Information Management Architecture and Standards is an important part of Information Governance. If this is not done within another programme, it should be done as part of an Information Governance programme by defining the Future State Vision for Information Management. The strategic logical and physical architectures may also be reviewed in later activities.

Return on Investment of Information Assets

The Return on Investment of Information Assets activity is under development. It will be an important activity in the Information Governance Solution Offering, measuring information value and providing an overall business case for Information Management.

Programme Review

The Programme Review activity is important for assessing that the Information Management programme is aligning with overall strategy and the methodological approach. It should be conducted on a periodic basis.

Technology Assessment and Selection Blueprint (Phase 2)

From a Data Governance perspective, two of the key Activities involve the development of policies and standards that will be used as part of the implementation phases of the project. As part of the Continuous Improvement approach introduced in Phase 5, audits will be conducted to enforce the use of standards and policies, and communication will be used as the basis of an improved culture. It is also at this point that we begin capturing information into a structured metadata repository. Metadata is captured when information assets are designed and built and is key for supporting a common understanding of the information assets. Data usage and dependencies are also identified in the metadata, to support change impact assessments and to fully reveal the business usage of information assets.
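A simple way to picture how captured usage and dependency metadata supports change impact assessment is sketched below. The asset names and "feeds" relationships are assumptions for illustration; the walk over recorded dependencies is the point.

    # Sketch (assumed structures, not a MIKE2.0 deliverable): recording data
    # dependencies in a metadata repository and walking them for a change
    # impact assessment.
    from collections import defaultdict

    # "X feeds Y" edges captured as assets are designed and built
    feeds = defaultdict(set)
    for source, target in [("CRM.CUSTOMER", "EDW.DIM_CUSTOMER"),
                           ("EDW.DIM_CUSTOMER", "MART.CUSTOMER_REPORT"),
                           ("EDW.DIM_CUSTOMER", "MDM.CUSTOMER_MASTER")]:
        feeds[source].add(target)

    def impact_of_change(asset):
        """All downstream assets affected if `asset` changes (simple graph walk)."""
        seen, frontier = set(), [asset]
        while frontier:
            current = frontier.pop()
            for downstream in feeds.get(current, ()):
                if downstream not in seen:
                    seen.add(downstream)
                    frontier.append(downstream)
        return seen

    print(impact_of_change("CRM.CUSTOMER"))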

Data Policies

Data Governance Policies are derived from the policies and guidelines developed in Phase 1. These high-level policies impact the definition of Data Standards, in particular data security, normalisation and auditing practices.

Data Standards

Data Standards are an important part of Data Governance, as standards take complexity out of the implementation process through common language, term definitions, and usage guidelines. The standards should be established before the implementation teams begin any detailed work. This will make sure that the team is using a common set of techniques and conventions and working within the overall policy framework for Data Governance. As part of an overall Data Governance programme, standards are typically developed for:

  • Data Specification
  • Data Modelling
  • Data Capture
  • Data Security
  • Data Reporting

Data Standards should be straightforward and follow a common set of best practices. Oftentimes, Data Standards will already exist that can be leveraged and extended.
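As an illustration of how such standards can be made checkable, the sketch below validates column names against an assumed naming convention and suffix rules. The rules and columns are invented for the example and would be replaced by the organisation's own standards.

    # Minimal sketch of enforcing a data modelling/specification standard.
    # The naming rule, suffixes, and columns are assumptions for illustration.
    import re

    NAMING_RULE = re.compile(r"^[A-Z][A-Z0-9]*(_[A-Z0-9]+)*$")   # e.g. CUST_BIRTH_DT
    REQUIRED_SUFFIXES = {"date": "_DT", "code": "_CD", "identifier": "_ID"}

    def check_column(name, semantic_type):
        issues = []
        if not NAMING_RULE.match(name):
            issues.append("does not match UPPER_SNAKE_CASE standard")
        suffix = REQUIRED_SUFFIXES.get(semantic_type)
        if suffix and not name.endswith(suffix):
            issues.append(f"missing standard suffix {suffix}")
        return issues

    for col, kind in [("CUST_BIRTH_DT", "date"), ("customerId", "identifier"), ("STATUS", "code")]:
        print(col, check_column(col, kind) or "OK")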

Metadata Driven Architecture

The strategy, design and implementation of a Metadata Driven Architecture is a key part of the approach to improving Data Governance. Metadata management is developed across multiple activities in MIKE2.0, eventually maturing to a more active approach to metadata integration. The metadata management architecture and supporting development practices are a critical aspect of MIKE2.0, evidenced by the metadata architecture overlay of the SAFE Architecture that goes across all components in the architecture.

It is important to get metadata management practices in place from the onset. As Phase 2 involves strategic technology requirements and product selection, the implementation team may not yet have the tools with which they plan to strategically manage metadata and ideally move to a metadata-driven environment. Therefore, projects will often need to take a tactical approach in the early stages. Regardless of whether a product or platform has been selected yet, MIKE2.0 recommends that some form of repository and base meta-model be in place from the onset. MIKE2.0 provides a starter set model for metadata management that encapsulates much of the core metadata that we want to capture. During the Blueprint phase, this model can be used to collect information that historically would have gone into documents or spreadsheets.
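One possible tactical starter repository is sketched below with SQLite, simply to show that a base meta-model can be stood up before strategic tooling is selected. The tables, columns, and sample rows are assumptions for illustration and are not the MIKE2.0 starter meta-model itself.

    # Sketch of a tactical metadata repository: systems, elements, and lineage.
    # Schema and rows are illustrative assumptions only.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE system  (system_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE element (element_id INTEGER PRIMARY KEY, system_id INTEGER REFERENCES system,
                          name TEXT, business_definition TEXT, steward TEXT);
    CREATE TABLE lineage (source_element INTEGER REFERENCES element,
                          target_element INTEGER REFERENCES element, rule TEXT);
    """)
    conn.execute("INSERT INTO system VALUES (1, 'CRM'), (2, 'EDW')")
    conn.execute("INSERT INTO element VALUES "
                 "(1, 1, 'CUST_STATUS', 'Customer lifecycle state', 'Retail Steward'), "
                 "(2, 2, 'STATUS_CODE', 'Customer lifecycle state', 'Retail Steward')")
    conn.execute("INSERT INTO lineage VALUES (1, 2, 'direct map with code translation')")
    print(conn.execute("SELECT s.name, e.name FROM element e JOIN system s USING (system_id)").fetchall())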

Roadmap and Foundation Activities (Phase 3)

The Foundation Activities of MIKE2.0 are arguably the most important aspects of the overall methodology for improving Data Governance. The focus in implementing the Foundation Activities is on those Key Data Elements that are deemed the most crucial to the business.

Business Scope for Improved Data Governance

Definition of the Key Data Elements as part of the Business Scope for Improved Data Governance is a key part of the MIKE2.0 approach to Data Governance. KDEs help focus the work to be done on the most critical data that impacts business users. Data valuation then assigns a value to each KDE, which is used to prioritize the scope of the Data Governance program. The Data Governance approach focuses primarily on these KDEs for each increment.
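The sketch below illustrates one way a valuation score could be assigned to candidate KDEs and used to rank them for an increment. The scoring factors and weights are assumptions, not a prescribed MIKE2.0 formula.

    # Illustrative only: a simple weighted score to prioritise candidate KDEs.
    kdes = [
        # name, business criticality (1-5), regulatory exposure (1-5), reuse across systems (1-5)
        ("Customer Name",    5, 4, 5),
        ("Customer Address", 4, 3, 4),
        ("Product Category", 3, 1, 3),
    ]
    WEIGHTS = (0.5, 0.3, 0.2)   # assumed weighting of the three factors

    def value_score(criticality, regulatory, reuse):
        return sum(w * f for w, f in zip(WEIGHTS, (criticality, regulatory, reuse)))

    ranked = sorted(kdes, key=lambda k: value_score(*k[1:]), reverse=True)
    for name, *factors in ranked:
        print(f"{name:18s} score={value_score(*factors):.2f}")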

Enterprise Information Architecture

Most organisations do not have a well-defined Enterprise Information Architecture. MIKE2.0 takes the approach of building out the Enterprise Information Architecture over time for each new increment that is implemented as part of the overall programme. The scope for building the Enterprise Information Architecture is defined by the in-scope Key Data Elements (KDEs). The Enterprise Information Architecture includes the model to support these KDEs, which systems they reside in, the mastering rules for this data, and how often it is to be mastered.

Root Cause Analysis of Data Governance Issues

Root Cause Analysis of Data Governance Issues is concerned with addressing root causes rather than symptoms: analysing which process activities or application automation would prevent Data Governance issues from occurring in the first place. In many cases, the root causes are organisational and process-related rather than technical in nature. Typical root causes include lack of senior management commitment, lack of an appropriate organisational structure to support governance, and non-compliance with governance processes even where those processes exist.

Data Governance Metrics

Data Governance Metrics include the data quality metrics to be measured for the KDEs. Baseline (current-state) metric levels should be assessed and suitable target levels for improvement defined. Each KDE is measured against the defined metric category through the appropriate measurement technique; a small tracking sketch follows the list below.

In addition to KDE quality metrics, consideration should be given to:

  • Metrics for the overall effectiveness of the data governance programme
  • Compliance metrics that measure adherence to data governance policies, processes and data standards
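The sketch below compares measured metric levels against baseline and target per KDE and metric category. The categories, values, and thresholds are invented for illustration.

    # Sketch under assumed metric categories and thresholds: flagging where a
    # measured value still falls short of its target.
    metrics = [
        # KDE, metric category, baseline, current measurement, target
        ("Customer Email", "completeness", 0.72, 0.81, 0.95),
        ("Customer Email", "validity",     0.88, 0.93, 0.98),
        ("Customer DOB",   "accuracy",     0.60, 0.74, 0.90),
    ]
    for kde, category, baseline, measured, target in metrics:
        status = "on target" if measured >= target else f"gap {target - measured:.2f}"
        print(f"{kde:15s} {category:12s} baseline={baseline:.2f} "
              f"measured={measured:.2f} target={target:.2f} -> {status}")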

Data Profiling

Data Profiling is used to quantitatively identify data quality issues. This is a key aspect to improving Data Governance as it provides clear results on actual data. MIKE2.0 recommends Data Profiling be conducted frequently as an approach to improve overall Data Governance.

Data Re-Engineering

Data Re-Engineering helps improve Data Governance by dealing with historical Data Quality issues that are typically identified in Data Profiling. MIKE2.0 recommends that Data Re-Engineering follows a serial process of standardisation, correction, matching and enrichment but that this process be conducted iteratively, following the "80/20 rule". This provides a model for improving Data Governance in the most cost-effective and expedient fashion.
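The toy sketch below walks a small batch through the standardise, correct, match, and enrich steps in that serial order. The reference data and rules are invented for illustration; a real implementation would typically use dedicated data quality tooling and iterate per the "80/20 rule".

    # Toy sketch of the serial standardise -> correct -> match -> enrich flow.
    # Reference data and rules are invented.
    RAW = [{"name": " acme pty ltd ", "country": "Aus"},
           {"name": "ACME Pty. Ltd", "country": "AU"}]
    COUNTRY_FIXES = {"AUS": "AU"}
    ENRICHMENT = {"AU": {"region": "APAC"}}

    def standardise(rec):
        rec["name"] = " ".join(rec["name"].upper().replace(".", "").split())
        rec["country"] = rec["country"].upper()
        return rec

    def correct(rec):
        rec["country"] = COUNTRY_FIXES.get(rec["country"], rec["country"])
        return rec

    def match(records):
        """Group records that now share a standardised name + country key."""
        groups = {}
        for rec in records:
            groups.setdefault((rec["name"], rec["country"]), []).append(rec)
        return groups

    def enrich(rec):
        rec.update(ENRICHMENT.get(rec["country"], {}))
        return rec

    cleaned = [correct(standardise(dict(r))) for r in RAW]
    groups = match(cleaned)
    for duplicates in groups.values():
        for rec in duplicates:
            enrich(rec)
    print(groups)   # both raw rows collapse into one matched, enriched group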

Develop, Test, Deploy and Improve Activities (Phase 5)

The latter Activities of Phase 5 are focused on Continuous Improvement of the overall Data Governance processes, technology environment and operating model.

Continuous Improvement – Compliance Auditing

Continuous Improvement - Compliance Auditing is conducted by an external group rather than the internal Data Governance team. Audits don't involve the technical aspects of data analysis (i.e. data profiling); instead, they involve inspecting results and looking at the overall processes for Data Governance.

Continuous Improvement – Standards, Policies and Processes

Continuous Improvement - Standards, Policies and Processes revisits the overall set of standards, metrics, policies and processes for Data Governance. Recommended changes feed into the next increment of work as part of the continuous implementation approach of the MIKE2.0 Methodology.

Continuous Improvement – Data Quality

Continuous Improvement - Data Quality involves identification of root causes and ongoing Data Quality monitoring. This allows a far more proactive approach to Data Governance, whereby the organisation can either address issues quickly or stop them from occurring altogether.

Continuous Improvement – Infrastructure

Continuous Improvement - Infrastructure involves closely monitoring the current environment and instituting tactical changes that are in line with the strategic vision of improved Data Governance.

Continuous Improvement – Information Development Organisation

MIKE2.0 recommends an Information Development Organisation as the most mature organisational model for improving Data Governance in the most efficient and effective fashion. Using the Information Maturity Model first introduced in the Organisational QuickScan Activity, the Continuous Improvement - Information Development Organisation approach progressively moves the organisation to the optimal approach for Data Governance.

Mapping to Supporting Assets

Improving Information Governance should go across people, process, organisation and technology. In addition to following the relevant Activities from the Overall Implementation Guide, the following artifacts from MIKE2.0 can be used to assist in this effort:

  • The SAFE Architecture provides a target architecture model at the conceptual level as well as best practice solution architecture options

  • Sample Customer Data Management Policies relate to customer privacy, data encryption, and information sharing
  • The Information Security Deliverable Template provides an output template that can be used to capture key information security standards

In addition, reference other Core Solution Offerings for best design standards and implementation processes.

Relationships to other Solution Offerings

  • The Information Governance Solution Offering is a Foundational Solution for the MIKE2.0 Methodology. Therefore, most Solution Offerings are dependent on an Information Governance programme being in place.

Extending the Open Methodology through Solution Offerings

Listed below are proposed extensions to the Overall Implementation Guide to meet the requirements for this Solution Offering:

Potential Activity Changes

Organisational QuickScan

The scope of QuickScan needs to be expanded to cover Information Management more broadly. At present, the IM QuickScan is primarily focused on Enterprise Data Management and, in particular, on Data Quality and Data Governance.

Data Governance Sponsorship and Scope

This activity will be expanded to also give representation to unstructured content for complete coverage for the Information Governance offering.

Initial Data Governance Organisation

This activity will be expanded to also give representation to unstructured content for complete coverage for the Information Governance offering. For unstructured data assessment, see ECM Maturity Model (ecm3).
