Open Framework, Information Management Strategy & Collaborative Governance | Data & Social Methodology - MIKE2.0 Methodology
Posts Tagged ‘Information Value’

by: Robert.hillard
20  Apr  2014

How CIOs can discuss the contribution of IT

Just how productive are Chief Information Officers or the technology that they manage?  With technology portfolios becoming increasingly complex it is harder than ever to measure productivity.  Yet boards and investors want to know that the capital they have tied up in the information technology of the enterprise is achieving the best possible return.

For CIOs, talking about value improves the conversation with executive colleagues.  Taking them aside to talk about the success of a project is, even for the most strategic initiatives, usually seen as a tactical discussion.  Changing the topic to increasing customer value or staff productivity through a return on technology capital is a much more strategic contribution.

What is the return on an IT system?

There are all sorts of productivity measures that can be applied to individual systems, but they are usually based on the efficiency of existing processes, which leads to behaviours that reduce flexibility.  The future of business and government depends on speed of response to change, not how efficiently they deal with a static present.

Businesses invest in information systems to have the right information at the right time to support decisions or processes.  Information that is used is productive while information that is collected, but poorly applied, is wasted or unproductive.

However, to work out what proportion of information is being used there needs to be a way to quantify it.

How much information is contained in the systems?

There is a formal way to measure the quantity of information.  I describe it extensively in chapter 6 of Information-Driven Business.

The best way to understand “quantity” in terms of information is to count the number of artefacts rather than the number of bits or bytes required to store them.  The best accepted approach to describing this quantity is called “information entropy” which, confusingly, uses a “bit” as its unit of measure: the number of bits is the base-2 logarithm of the number of permutations that the system can represent.

A system that holds 65,536 names has just 16 “bits” of unique information (log2 65,536 = 16).  That might sound strange given that the data storage of 65,536 names might use on the order of 6MB.

To understand why there are only 16 bits of unique information in a list of 65,536 names, consider whether the business uses the spelling of the names or whether any additional insight is being gained from the data that is stored.
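The calculation behind this is easy to sketch in code (the helper name is mine; only the 65,536-name figure comes from the example above):

```python
import math

def information_entropy_bits(distinct_values: int) -> float:
    """Quantity of information, in bits, for a system that can
    distinguish between `distinct_values` equally likely artefacts."""
    return math.log2(distinct_values)

# A list of 65,536 distinct names carries just 16 bits of unique
# information, however many megabytes are used to store the spellings.
print(information_entropy_bits(65_536))  # 16.0
```

The storage footprint never appears in the formula: only the number of states the business can actually distinguish matters.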

How much of that information is actually used?

Knowing how much information there is in a system opens up the opportunity to find out how much information is being productively used.  The amount of information being used to drive customer or management choices is perhaps best described as “decision entropy”.  The decision entropy is either equal to or less than the total information entropy.

An organisation using 100% of their available information is incredibly lean and nimble.  They have removed much of the complexity that stymies their competitors (see Value of decommissioning legacy systems).

Of course, no organisation productively uses all of the information that it holds.  Knowing that holding unproductive information comes at a cost to the organisation, the CIO can have an engaging conversation with fellow executives about extracting more value from existing systems without changing business processes.
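One way to frame that conversation is as a ratio of decision entropy to information entropy. A minimal sketch follows; the function name and the 256-outcome figure are illustrative assumptions of mine, not from the post:

```python
import math

def information_productivity(total_states: int, used_states: int) -> float:
    """Ratio of decision entropy (information actually driving choices)
    to total information entropy; always between 0 and 1."""
    return math.log2(used_states) / math.log2(total_states)

# If only 256 of 65,536 distinguishable states ever influence a
# decision, just 8 of the 16 available bits are productive.
print(information_productivity(65_536, 256))  # 0.5
```

A ratio of 1.0 corresponds to the incredibly lean organisation described above; most organisations sit well below it.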

When looking at how business reports are really used, and how many reports lie unread on management desks, there is a lot of low-hanging fruit to be picked just by improving the way existing business intelligence is used.

Similarly, customer systems seldom maximise their use of hints based on existing information to guide buyers to close the best available offer.  A few digital enhancements at the front line can bring to the surface a vast array of otherwise unused information.

Changing the conversation

Globally, CIOs are finding themselves pushed down a rung in the organisational ladder.  This is happening at the very same time that technology is moving from the back office to become a central part of the revenue story through digital disruption.

CIOs are not automatically entitled to be at the executive table.  They have to earn the right by contributing to earnings and business outcomes.  One of the best discussions for a CIO to focus on is increasing the productivity of the capital tied up in the investments that have already been made in the systems that support staff and customers.

Category: Information Development, Information Strategy, Information Value

by: Robert.hillard
18  Dec  2011

Value of decommissioning legacy systems

Most organisations reward their project managers for achieving scope within a given timeframe for a specified budget.  While scope is usually measured in terms of user functions, most projects also include the decommissioning of legacy systems.  Unfortunately it is the decommissioning step which is most often compromised in the final stages of any project.

I’ve previously written about the need to measure complexity (see CIOs need to measure the right things).  One of the pieces of feedback I have received from a number of CIOs over the past few months has been that it is very hard to get a business case for decommissioning over the line from a financial perspective.  What’s more, even when it is included in the business case for a new system, it is very hard to avoid it being removed during the inevitable scope and budget management that most major projects go through.

One approach to estimating the benefit of decommissioning is to list out the activities that will be simpler as a result of removing the system.  These can include duplicated user effort, reduced operational management costs and, most importantly, a reduction in the effort to implement new systems.  The problem is that the last of these is the most valuable but is very hard to estimate deterministically.  Worse, by the time you do know, it is likely to be too late to actually perform the decommissioning.  For that reason, it is better to take a modelling approach across the portfolio rather than try to prove the cost savings using a list of known projects.

The complexity that legacy systems introduce to new system development is largely proportional to the cost of developing information interfaces to those systems.  Because the number of interfaces grows rapidly with the number of systems, the complexity they introduce is a logarithmic function as shown below.

Figure 1

Any model is only going to provide a basis for estimating, but I outline here a simple and effective approach.

Start by defining c as the investment per new system and n as the number of system builds expected over 5 years.  Investment cost for a domain is therefore c times n.  For this example assume c as $2M and n as 3 giving an investment of $6M.

However the number of legacy systems (l) adds complexity at a rate that increases most rapidly initially before trailing off (logarithmically).  The complexity factor (f) depends on the ratio of the cost of software development (c) to the average interface cost (i), which sets the base of the logarithm:

f = log(l + 1), taken to the base c/i

For this example assume l as 2 and i as $200K, giving a base of 2,000,000 / 200,000 = 10:

f = log10(2 + 1) = log10(3) ≈ 0.48
The complexity factor can then be applied to the original investment:

c x n x (f+1)

In this example the five year saving of decommissioning the two legacy systems, in terms of future savings, would be of the order of $2.9M (c x n x f).  It is important to note that efficiencies in interfacing similarly provide benefit.  As the cost of interfacing drops, the logarithm base increases and the complexity factor naturally decreases.  Even arguing that the ratio needs to be adjusted doesn’t substantially affect the complexity factor.
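The formula images did not survive in this copy of the post, so the sketch below reconstructs the model from the worked numbers: a complexity factor of f = log(l + 1), taken to the base c/i, reproduces the quoted figure of roughly $2.9M. Treat this as an assumption consistent with the example rather than a definitive statement of the model:

```python
import math

def complexity_factor(legacy_systems: int, build_cost: float,
                      interface_cost: float) -> float:
    """Complexity factor f: log of (l + 1), with the base set by the
    ratio of cost per new system (c) to average interface cost (i)."""
    return math.log(legacy_systems + 1, build_cost / interface_cost)

c = 2_000_000   # investment per new system
n = 3           # new system builds expected over five years
l = 2           # legacy systems that could be decommissioned
i = 200_000     # average cost of one information interface

f = complexity_factor(l, c, i)       # log base 10 of 3, about 0.48
total_with_legacy = c * n * (f + 1)  # investment inflated by complexity
saving = c * n * f                   # complexity premium avoided
print(round(saving))                 # about 2.86M, i.e. "of the order of $2.9M"
```

Note how the model behaves as the post describes: lowering the interface cost i raises the logarithm base c/i, which shrinks f without any change to the project list.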

While this is only one method of identifying a savings trend, it provides a good starting point for more detailed modelling of benefits.  At the very least it provides the basis for an argument that the value of decommissioning legacy system shouldn’t be ignored simply because the exact benefits cannot be immediately identified.

Category: Information Governance

by: Robert.hillard
25  Sep  2011

Paying for value rather than activity

Since the 1980s the costs associated with functions that are shared have been increasingly allocated to business units in such a way as to drive accountability.

For information technology this was relatively easy in the late 1980s as the majority of costs were associated with the expense of infrastructure or processing.  Typically the cost of the mainframe was allocated based on usage.  Through the 1990s, costs moved increasingly to a project focus with a model that encouraged good project governance and the allocation of infrastructure based on functions delivered.

Arguably, the unfortunate side effect of the allocation of project costs has been that many business units see information technology as being unnecessarily expensive – whereas many of the costs are really just reflecting the sheer complexity of business technology (see my previous post: CIOs need to measure the right things).  Such an approach to cost allocation has allowed business units to execute projects of increasing sophistication; however it may not be ensuring that information technology is being used in the way that will achieve the greatest possible strategic impact.

The other problem with the project-focused approach to cost recovery is that the CIO’s role is diminished to being that of a service provider.  In some organisations this has gone so far as to result in the CIO seeing external service providers as competitors.

Refocusing the cost recovery to the value achieved has the potential to deal the CIO back into the strategic debate.  As I’ve said before, information technology is extremely complex and requires experience and insight in order to identify the real opportunities to use it effectively to support and differentiate any business.  During the next decade, we are likely to see the continued blurring of the lines between internal business technology, joint ventures and the products that consumers use.  For instance, joint venture partners expect to see detailed financial reports across boundaries and consumers are used to helping themselves through the same interfaces that were previously restricted to call centre operators.

Recently on the web there has been some discussion on whether information should be valued.  At the same time there has been good progress in the development of techniques to value information assets (for instance, see the MIKE2.0 article on Information Economics).  The value of information is a very good way of predicting likely business value, even when the way that value will be realised has not yet been determined.  The disruption to previously stable businesses, such as retail, telecommunications and manufacturing, is a very good example of why it is important to understand value beyond the horizon of current revenue models.

Allocating at least some of the cost of an effective information technology department based on value focuses the budget debate on the development of revenue earning products that will leverage these new capabilities.  It also ensures that those units receiving the greatest potential value are motivated to realise it.  Finally, the move away from a largely activity-based approach to measuring cost reduces the tendency to continually cut project scopes to keep within defined budget.

Category: Information Governance

by: Robert.hillard
24  Jul  2011

CIOs need to measure the right things

If you’re a Chief Information Officer (CIO) there are three things that your organization expects of you: 1) keep everything running; 2) add new capabilities; and 3) do it all as cheaply as possible.  The metrics that CIOs typically use to measure these things include keeping a count of the number of outages, number of projects delivered and budget variances.  The problem with this approach is that it fails to take account of complexity.

When I talk with senior executives, regardless of their role, the conversation inevitably turns to the frustration that they feel about the cost and complexity of doing even seemingly simple things such as preparing a marketing campaign, adding a self-service capability or combining two services into one.  No matter which way you look at it, it costs more to add or change even simple things in organisations due to the increasing complexity that a generation of projects have left behind as their legacy.  It should come as no surprise that innovation seems to come from greenfield startups, many of which have been funded by established companies whose own legacy stymies experimentation and agility.

This doesn’t have to be the case.  If a CIO accepts the assertion that complexity caused by the legacy of previous projects is the enemy of agility, then they should ask whether they are explicitly measuring the complexity that current and future projects are adding to the enterprise.  CIOs need to challenge themselves to engineer their business in such a way that it is both flexible and agile with minimal complexity compared to a startup.

The reason that I spend so much time writing about information rather than processes or business functions is that the modern enterprise is driven by information.  So much so that metrics tracking the management and use of information are very effective predictors of issues with the development of new processes and obstacles to the delivery of new functionality.  There appear to be no accurate measures of the agility of enterprise technology that focus on just processes or functions without information, but there are measures of information that safely ignore processes and functions, knowing that well organized information assets enable new processes and functions to be created with ease.

The CIO who wants to ensure their organisation has the capability to easily implement new functions in the future should look to measure how much information the organization can handle without introducing disproportionate complexity.  The key is in structuring information assets in such a way as to ensure that complexity is compartmentalized within tightly controlled units with well understood boundaries and properly defined interfaces to other information assets.  These interfaces act as dampeners of complexity or turbulence, allowing for problems or changes to be constrained and their wider impact minimized.

Creating such siloes may seem to go against the conventional wisdom of having an integrated atomic layer and an enterprise approach to data.  Nothing could be further from the truth: it is simply ensuring that the largest possible quantity of information is made available to all stakeholders with the minimum possible complexity.

The actual measures themselves are described in my book, Information-Driven Business, as the “small worlds data measure” for complexity and “information entropy” for the quantity.  Applying the measures is surprisingly easy; the question each CIO then needs to answer is how to describe these measures in a way that engages their business stakeholders.  If technology leaders hope to avoid difficult topics with their executive counterparts then this will be impossible, but if they are willing to share their inside knowledge of modern information technology then the “us and them” culture can start to be broken down.

Category: Information Governance, Information Strategy, Information Value

by: Sean.mcclowry
20  Jan  2008

Sustainable Development requires Information Development

Climate change is increasingly a mainstream issue. There is growing momentum to address the issue at a number of levels, in the corporate sector, with private citizens and within government.

There is, however, a real danger that the trend towards carbon neutrality focuses on a single measure in isolation, analogous to poor systems design. As described by the United Nations, climate change is just one of the 36 areas of Sustainable Development. An Information Development approach can help:

  • Understand how decisions in one area impact other areas of Sustainability (e.g. how a programme that helps reduce a retailer’s carbon footprint by stopping imports from Africa impacts other areas).

  • Collaboratively develop standards and share lessons learned. This will be a major benefit as the transformation organizations must go through is so significant. This isn’t necessarily about sharing “trade secrets” but about providing an open and transparent forum for organizations to share information.

  • Provide a collaborative Business Intelligence platform for making objective decisions. Decisions can be based on historical evidence and make use of historical models. Policy analysts, researchers and the public can share different forms of information as part of the process.

  • Share information across different organisations, involving both operational and analytical information. Enablers include open and common standards, search and collaboration. Some information can be anonymised while other content can be seen by both parties.

  • Measure progress based on standard metrics. We need standards because when we don’t account for what we produce, we may get unexpected issues. Information Quality issues are analogous to the pollutants we see from poor sustainability design.

The Information Development approach to Sustainable Development can be applied to design the interaction points between the different areas. It also means the ability to make fact-based decisions, share information between systems and provide easy access to information from complex, federated sources.

Category: Sustainable Development

by: Robert.hillard
18  Oct  2007

The Board, the C-suite and the Middle Manager

One of the key questions is who should sponsor Information Management.  The governance sections of MIKE2.0 describe operational organizations and how to get there, but make the assumption that the CEO, CIO, CFO etc. are supportive of the initiative and will act as sponsors.  What happens when they’re not?

Actually, it seems that this is the case more often than you would wish, with many senior executives unwilling to commit to the proper management of information.  It’s not hard to work out the reason why: in most companies (and increasingly in many government organizations) the CEO is only appointed for a short contract, with rapid rotation of new talent into the role.  No wonder the CEO acts like a politician, looking for the “quick fix” common sense answer that they can put in place within their term and position themselves to be extended (analogous to a politician seeking re-election).

There is hope, however, by looking at the board.  In most companies, board members have a longer tenure than CEOs and also feel more exposed to legal issues.  A quick conversation about the issues of ledger versus non-ledger data (discussed before in this blog) highlights to board members how great their exposure is if they don’t mandate better governance.  Judicious use of passionate middle managers can complete the pincer movement and before you know it the CEO sees Information Management as a mandatory activity and a quick win.

Category: Information Governance, Information Strategy
