Open Framework, Information Management Strategy & Collaborative Governance | Data & Social Methodology - MIKE2.0 Methodology


Archive for January, 2015

by: Robert.hillard
25  Jan  2015

The architecture after cloud

I think that Zach Nelson (Netsuite’s CEO) was wrong when he said that “cloud is the last computing architecture” but I also believe that his quote is a healthy challenge to take computing and business architectures to a new level.

Nelson went on to say “I don’t think there is anything after it (cloud). What can possibly be after being able to access all of your data any time, anywhere and on any device? There is nothing.”  His comments are available in full from an interview with the Australian Financial Review.

Aside from cloud, our industry has had a range of architectures over the decades including client/server, service oriented architecture (SOA) and thin client.  Arguably design patterns such as 4GLs (fourth generation languages) and object oriented programming are also architectures in their own right.

I think that we can predict attributes of the next architecture by looking at some of the challenges our technologies face today.  Some of these problems have defined solutions while others will require inventions that aren’t yet on the table.


Security

Near the top of any list is security.  Our Internet-centred technology is increasingly exposed to viruses, hackers and government eavesdroppers.

One reason that so much of our technology is vulnerable is that most operating systems share code libraries between applications.  The most vulnerable application can leave the door open for malicious code to compromise everything running on the same machine.  This is part of the reason that the Apple ecosystem has been less prone to viruses than the Windows platform.  Learning from this, it is likely that the inefficiency of duplicating code will be a small price to pay for siloing applications from each other to reduce the risk of cross-infection.

At the same time our “public key” encryption is regarded by many as being at risk from increasing computing power for brute force code cracking and even the potential of future quantum computers.

Because there is no mathematical proof that encryption based on the factoring of large numbers won’t be cracked in the future, it can be argued to be unsuitable for a range of purposes such as voting.  A future architecture might consider more robust approaches such as the physical sharing of ciphers.
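To see why this concern is more than theoretical, here is a toy sketch (deliberately tiny numbers, not real cryptography) of how an attacker who can factor the public modulus recovers an RSA-style private key from the public key alone:

```python
def factor(n):
    """Recover p and q by trial division -- feasible only for tiny n.
    Real RSA moduli use primes of 1024+ bits, far beyond brute force."""
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 2
    raise ValueError("n is prime")

# A deliberately tiny 'RSA' key pair for illustration.
p, q = 61, 53
n, e = p * q, 17                      # public key: (n, e)
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)                   # private exponent (Python 3.8+)

# An attacker who factors n rebuilds d from the public key alone:
p2, q2 = factor(n)
d_cracked = pow(e, -1, (p2 - 1) * (q2 - 1))
assert d_cracked == d
print("private exponent recovered:", d_cracked)
```

The security of the scheme rests entirely on `factor` being infeasible for large `n`; there is no proof that it must stay infeasible, which is the article’s point.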


Societies around the world are struggling with defining what law enforcement and security agencies should and shouldn’t have access to.  There is even a growing debate about who owns data.  As different jurisdictions converge on “right to be forgotten” legislation, and societies agree on what back-door keys to give to various agencies, future architectures will be key to simplifying the management of the agreed approaches.

The answers will be part of future architectures with clearer tracking of metadata (and indeed definitions of what metadata means).  At the same time, codification of access to security agencies will hopefully allow users to agree with governments about what can and can’t be intercepted.  Don’t expect this to be an easy public debate as it has to navigate the minefield of national boundaries.

Network latency

Another issue that is topical to application designers is network latency.  Despite huge progress in broadband across the globe, network speeds are not increasing at the same rate as other aspects of computing such as storage, memory or processing speeds.  What’s more, we are far closer to fundamental limits of physics (the speed of light) when managing the transmission from servers around the world.  Even the most efficient link between New York and London would mean the round-trip for an instruction response is 0.04 seconds at a theoretical maximum of the speed of light (with no latency for routers or a network path that is not perfectly straight).  In computing terms 0.04 seconds is the pace of a snail!
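The arithmetic behind that figure is easy to check; a quick sketch using an approximate New York-London great-circle distance:

```python
# Back-of-the-envelope check of the speed-of-light floor on latency.
# The 5,570 km figure is an approximate New York-London great-circle
# distance; a real fibre path would be longer and add router delays.

SPEED_OF_LIGHT_M_S = 299_792_458
NY_LONDON_M = 5_570_000

one_way = NY_LONDON_M / SPEED_OF_LIGHT_M_S
round_trip = 2 * one_way

print(f"one-way  : {one_way * 1000:.1f} ms")
print(f"round trip: {round_trip * 1000:.1f} ms")   # ~37 ms, i.e. ~0.04 s
```

No amount of faster hardware at either end can push the round trip below this floor, which is why architectures are moving work closer to the user.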

The architectural solution has already started to appear with increasing enthusiasm for caching and on-device processing.  Mobile apps are a manifestation of this phenomenon which is also sometimes called edge computing.

The cloud provides the means to efficiently synchronise devices, combining the reach of the network with the cheap, powerful processing and masses of data available right on the device.  What many people don’t realise is that Internet Service Providers (ISPs) are already doing this by caching popular YouTube and other content, which is why some videos seem to take forever while others play almost instantly.
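The caching idea itself can be sketched in a few lines; the names below are illustrative, not a real ISP or CDN API:

```python
# Minimal sketch of the edge-caching idea: the first request for a piece
# of content pays the full trip to the origin server; repeat requests are
# served from the local cache with no wide-area round trip.

class EdgeCache:
    def __init__(self, origin_fetch):
        self._fetch = origin_fetch      # expensive call back to origin
        self._store = {}                # content held at the edge
        self.hits = self.misses = 0

    def get(self, url):
        if url in self._store:
            self.hits += 1              # served locally: plays instantly
        else:
            self.misses += 1            # full trip to the origin server
            self._store[url] = self._fetch(url)
        return self._store[url]

cache = EdgeCache(origin_fetch=lambda url: f"<bytes of {url}>")
cache.get("youtube.com/v/abc")   # miss: fetched from origin
cache.get("youtube.com/v/abc")   # hit: popular content served locally
print(cache.hits, cache.misses)  # 1 1
```

Popular content ends up cached near the viewer, which is exactly why some videos start instantly while rarely watched ones stall.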


Identity

Trying to define the true nature of a person is almost a metaphysical question.  Am I an individual, spouse, family member or business?  Am I paying a bill on my own behalf or doing the banking for my business partners?  Any future architecture will build on today’s approaches and understanding of user identity.

Regardless of whether the answer will be biometrics, social media or two-factor authentication it is likely that future architectures will make it easier to manage identity.  The one thing that we know is that people hate username/password management and a distributed approach with ongoing verification of identity is more likely to gain acceptance (see Login with social media).

Governments want to own this space but there is no evidence that there is a natural right for any one trusted organisation.  Perhaps Estonia’s model of a digital economy and e-residency might provide a clue.  Alternatively, identity could become as highly regulated as national passports with laws preventing individuals from holding more than one credential without explicit permission.
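Whatever shape identity management takes, ongoing verification will likely build on primitives such as one-time passwords. As a concrete illustration, a minimal implementation of the standard TOTP algorithm (RFC 6238) used by most authenticator apps:

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32, at=None, digits=6, step=30):
    """Time-based one-time password (RFC 6238): client and server derive
    the same short-lived code from a shared secret, so no long-lived
    password ever crosses the network."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if at is None else at) // step)
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# The RFC 6238 test secret, checked against its published test vector.
SECRET = base64.b32encode(b"12345678901234567890").decode()
print(totp(SECRET, at=59))   # -> 287082 (matches the RFC test vector)
```

Schemes like this are one building block for the “ongoing verification of identity” mentioned above; biometrics and federated social login are others.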

Internet of Things

The Internet of Things brings a new set of challenges with an explosion of the number of connected devices around the world.  For anyone who is passionate about technology, it has been frustrating that useful interfaces have been so slow to emerge.  We were capable of connecting baby monitors, light switches and home security for years before they were readily available and even today they are still clumsy.

Arguably, the greatest factor in the lag between the technological possibility and market availability has been the challenge of interoperability and the assumption that we need standards.  There is a growing belief that market forces are more effective than standards (see The evolution of information standards).  Regardless, the evolution of new architectures is essential to enabling the Internet of Things marathon.

Complexity and Robustness

Our technology has become increasingly complex.  Over many years we have built layers upon layers of legacy, hidden under facades of modern interfaces.  Not only is this making IT orders of magnitude more expensive, it is also making it far harder to create bulletproof solutions.

Applications that lack robustness are frustrating when they stop you from watching your favourite TV programme, but they could be fatal when combined with the Internet of Things and increasingly autonomous machines such as trains, planes and automobiles.

There is an increasing awareness that the solution is to evolve to simplicity.  Future architectures are going to reward simplicity through streamlined integration of services to create applications across platforms and vendors.

Bringing it all together

The next architecture won’t be a silver bullet for all of the issues I’ve raised in this post, but to be compelling it will need to provide a platform to tackle problems that have appeared intractable to date.

Perhaps, though, the greatest accelerator of every generation of architecture has been a cool name (such as “cloud”, “thin client”, “service oriented architecture” or “client/server”).  Maybe ambitious architects should start with a great label and work back to try and tackle some of the challenges that technologists face today.

Category: Enterprise2.0

by: RickDelgado
21  Jan  2015

Where to Go to Learn About Network Security

The one thought on seemingly every business’s mind is network security. This is understandable given the damaging attacks launched on companies like Home Depot, Target, and Sony in the past year-and-a-half or so. After seeing the effects of data breaches involving large corporations, businesses are taking their security more seriously than ever before, and many are quickly realizing it’s far from an easy task. The number of cyber threats out there seems to multiply every day, and with it the range of methods and techniques attackers can use to breach a company’s security measures and obtain sensitive data from within. In this ever-evolving environment, it’s important that business executives and IT security personnel stay up to date on the latest security information and learn all they can as they train to combat these threats. The question then becomes where you should go for the best in network security information and training. Luckily, the resources available are many.


When looking at materials and sources that can provide the most helpful information in improving network security, many companies will take cost into account. Some resources do have a cost associated with them, while others are perfectly free. Determining which to use will largely come down to what your company’s budget is for security purposes. Don’t let the price tag affect your opinion on the quality of the source. There are plenty of helpful security resources on the internet that are available at no cost.


Take blogs dedicated to the topic of network security, for instance. Blogs are an excellent way to find out what the latest news on the security front is. Of course cyber attacks hitting mega businesses like Target or JPMorgan Chase will almost always be front page news, but what about smaller breaches affecting lesser known companies? That’s where network security blogs come in. With blogs as a frequent resource, you can find out what kind of new threats are out there, how they’re impacting companies, and what’s being done to stop them. Network security blogs can also help you learn about new industry trends you’ll probably want to try for your company. Blogs can also give overviews of new security products and software. Some blogs are maintained by experts in the network security field, like Schneier on Security, and others come from organizations and institutes that dedicate themselves to improving network security for all organizations (the SANS Security Blogs are good examples of this type). Companies in the security field can have their own blogs as well, and though they can provide helpful information, keep in mind they’re also aimed at pushing their products.


Another free resource that shouldn’t be overlooked is YouTube. It’s one thing to read about new effective security procedures; it’s another thing to see it in action via an easy-to-follow video. The number of YouTube videos dedicated to network security is vast, but choosing which ones to view is similar to choosing which blogs to follow. Most prominent are those from security companies with a lot to gain from the more eyes they attract. While it’s often assumed that security companies focus only on videos promoting their products, many do offer educational videos that can help you learn more about network security and what’s being done to improve it. Some YouTube channels you’ll want to keep an eye on include the free video tutorials offered by CBT Nuggets and the useful videos from McAfee, which is now part of Intel Security. One word of caution: always check the upload date of the video. A video from five years ago may not have the updated information needed to fight today’s security threats.


Beyond the free sources, there are also learning opportunities you can pay for. Many educational institutions offer online courses that will give you the skills and knowledge to know what to do for your company’s network security. Completing these courses and earning the associated certifications does require extra work, some or most of it likely done outside the office, but it can all be worth it in the end. If the goal is to prevent security breaches, combining courses with blogs and online videos can make that goal far more achievable. Look at all of these resources as tools you can add to your arsenal of knowledge and expertise. You and your company will be much better off because of your dedication to learning all you can.


Category: Information Development

by: Gil Allouche
15  Jan  2015

Keeping Big Data Secure: Should You Consider Data Masking?

Big data is a boon to every industry. And as data volumes continue their exponential rise, the need to protect sensitive information from being compromised is greater than ever before. The recent data breach of Sony Pictures, and new national threats from foreign factions serve as a cautionary tale for government and private enterprise to be constantly on guard and on the lookout for new and better solutions to keep sensitive information secure.

One security solution, “data masking”, is the subject of a November 2014 article on Nextgov.

In the article, Ted Girard, a vice president at Delphix Federal, defines what data masking is—along with its applications in the government sector. Since data masking also has non-government applications, organizations wondering whether this solution is something they should consider for protecting original production data should find the following takeaways from the Nextgov article helpful.

The information explosion
Girard begins by stating the plain and simple truth that in this day and age of exploding volumes of information, “data is central to everything we do.” That being said, he warns that, “While the big data revolution presents immense opportunities, there are also profound implications and new challenges associated with it.” Among these challenges, according to Girard, are protecting privacy, enhancing security and improving data quality. “For many agencies just getting started with their big data efforts,” he adds, “these challenges can prove overwhelming.”

The role of data masking
Speaking specifically of governmental needs to protect sensitive health, education, and financial information, Girard explains that data masking is, “a technique used to ensure sensitive data does not enter nonproduction systems.” Furthermore, he explains that data masking is, “designed to protect the original production data from individuals or project teams that do not need real data to perform their tasks.” With data masking, so-called “dummy data”— a similar but obscured version of the real data—is substituted for tasks that do not depend on real data being present.
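As an illustration of the idea (the field names and masking scheme here are hypothetical sketches, not a description of any vendor’s product), a deterministic mask replaces each sensitive value with a stable dummy value, so nonproduction teams never see real data yet masked records remain consistent across systems:

```python
import hashlib

# Sketch of deterministic data masking: sensitive fields are replaced
# with consistent pseudonyms before records reach nonproduction systems.

SECRET_SALT = b"rotate-me-and-keep-out-of-source-control"

def mask_value(value: str) -> str:
    """Replace a sensitive value with a stable pseudonym. The same input
    always yields the same mask, so masked data stays consistent (and
    joinable) across systems and data refreshes."""
    digest = hashlib.sha256(SECRET_SALT + value.encode()).hexdigest()
    return "MASK-" + digest[:10]

def mask_record(record: dict, sensitive: set) -> dict:
    """Mask only the sensitive fields before data leaves production."""
    return {k: mask_value(v) if k in sensitive else v
            for k, v in record.items()}

prod_row = {"name": "Jane Doe", "ssn": "123-45-6789", "state": "VA"}
masked = mask_record(prod_row, sensitive={"name", "ssn"})
print(masked["state"])     # non-sensitive fields pass through unchanged
print(masked["ssn"][:5])   # sensitive fields become dummy values: MASK-
```

Determinism is the property that matters later in the article: because the same real value always maps to the same dummy value, masked data sets can stay “in sync across systems”.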

The need for “agile” data masking solutions
As Girard points out, one of the problems associated with traditional data masking is that, “every request by users for new or refreshed data sets must go through the manual masking process each time.” This, he explains, “is a cumbersome and time-consuming process that promotes ‘cutting corners’– skipping the process altogether and using old, previously masked data sets or delivering teams unmasked versions.” As a result, new agile data masking solutions have been developed to meet the new demands associated with protecting larger volumes of information.

According to Girard, the advantage of agile data masking is that it, “combines the processes of masking and provisioning, allowing organizations to quickly and securely deliver protected data sets in minutes.”

The need for security and privacy
As a result of collecting, storing and processing sensitive information of all kinds, government agencies need to keep that information protected. Still, as Girard points out, “Information security and privacy considerations are daunting challenges for federal agencies and may be hindering their efforts to pursue big data programs.” The good news with “advance agile masking technology”, according to Girard, is that it helps agencies, “raise the level of security and privacy assurance and meet regulatory compliance requirements.” Thanks to this solution, Girard says that, “sensitive data is protected at each step in the life cycle automatically.”

Preserving data quality
Big data does not necessarily mean better data. According to Girard, a major cause of many big data project failures is poor data. In dealing with big data, Girard says that IT is faced with two major challenges:

1. “Creating better, faster and more robust means of accessing and analyzing large data sets…to keep pace.”
2. “Preserving value and maintaining integrity while protecting data privacy….”

Both of these challenges are formidable, especially with large volumes of data migrating across systems. As Girard explains, “…controls need to be in place to ensure no data is lost, corrupted or duplicated in the process.” He goes on to say that, “The key to effective data masking is making the process seamless to the user so that new data sets are complete and protected while remaining in sync across systems.”

The future of agile data masking
Like many experts, Girard predicts that big data projects will become a greater priority for government agencies over time. Although not mentioned in the article, the NSA’s recent installation of a massive $1.5 billion data center in Utah serves as a clear example of the government’s growing commitment to big data initiatives. In order to successfully analyze vast amounts of data securely and in real time going forward, Girard says that agencies will need to, “create an agile data management environment to process, analyze and manage data and information.”

In light of growing security threats, organizations looking to protect sensitive production data from being compromised in less-secure environments should consider data masking as an effective security tool for both on-premise and cloud-based big data platforms.

Category: Enterprise Data Management

by: Bsomich
10  Jan  2015

MIKE2.0 Community Update

Missed what’s been happening in the MIKE2.0 data management community? Read on!



What is an Open Methodology Framework? 

An Open Methodology Framework is a collaborative environment for building methods to solve complex issues impacting business, technology, and society.  The best methodologies provide repeatable approaches on how to do things well based on established techniques. MIKE2.0’s Open Methodology Framework goes beyond the standards, techniques and best practices common to most methodologies with three objectives:

  • To Encourage Collaborative User Engagement
  • To Provide a Framework for Innovation
  • To Balance Release Stability with Continuous Improvement

We believe that this approach provides a successful framework for accomplishing things in a better, more collaborative fashion. What’s more, it allows for concurrent focus on both method and detailed technology artifacts. The emphasis is on emerging areas in which current methods and technologies lack maturity.

The Open Methodology Framework will be extended over time to include other projects. Another example of an open methodology is open-sustainability, which applies many of these concepts to the area of sustainable development. Suggestions for other Open Methodology projects can be initiated on this article’s talk page.

We hope you find this of benefit and welcome any suggestions you may have to improve it.


MIKE2.0 Community

Popular Content

Did you know that the following wiki articles are most popular on Google? Check them out, and feel free to edit or expand them!

What is MIKE2.0?
Deliverable Templates
The 5 Phases of MIKE2.0
Overall Task List
Business Assessment Blueprint
SAFE Architecture
Information Governance Solution

Contribute to MIKE:

Start a new article, help with articles under construction or look for other ways to contribute

Update your personal profile to advertise yourself to the community and interact with other members.

Useful Links:
Home Page
Content Model
MIKE2.0 Governance


This Week’s Blogs for Thought:

Security Concerns for the IoT in Healthcare

It’s easy to get caught up in all the hype surrounding the latest buzzwords in technology. The Internet of Things (IoT) is one such term many insiders are talking about excitedly, and it’s already making a big difference in certain industries. The idea of objects that collect data, connect to the internet, and communicate with each other is an intriguing one that’s poised to explode. Gartner is even predicting that the number of devices connected to the IoT will grow to 26 billion by the year 2020. Industries all over the world are investigating how the Internet of Things can benefit them, and the healthcare industry is no exception.

Read more.

The Internet of Misfit Things

One of the hallmarks of the holiday season is Christmas television specials. This season I once again watched one of my favorite specials, Rudolph the Red-Nosed Reindeer. One of my favorite scenes is the Island of Misfit Toys, which is a sanctuary for defective and unwanted toys.

This year the Internet of Things became fit for things for tracking fitness. While fitness trackers were among the hottest tech gifts this season, it remains to be seen whether these gadgets are little more than toys at this point in their evolution. Read more.

Our Machines Will Not Outsmart Us

Over the millennia we have been warned that the end of the world is nigh.  While it will no doubt be true one day, warnings by Stephen Hawking in a piece he co-authored on artificial intelligence don’t fill me with fear (See Transcending Complacency on Superintelligent Machines).  I disagree with the commentators across the board who are warning that machines will outsmart us by the 2030s and that it could become a Terminator-style race between us and them.

Hawking and his co-authors argue that “[s]uccess in creating AI would be the biggest event in human history.  Unfortunately, it might also be our last unless we learn how to avoid the risks.”

Read more.


Forward to a Friend!

Category: Information Development

by: RickDelgado
08  Jan  2015

Security Concerns for the Internet of Things in Healthcare

It’s easy to get caught up in all the hype surrounding the latest buzzwords in technology. The Internet of Things (IoT) is one such term many insiders are talking about excitedly, and it’s already making a big difference in certain industries. The idea of objects that collect data, connect to the internet, and communicate with each other is an intriguing one that’s poised to explode. Gartner is even predicting that the number of devices connected to the IoT will grow to 26 billion by the year 2020. Industries all over the world are investigating how the Internet of Things can benefit them, and the healthcare industry is no exception. But as much as the IoT can lead to some notable advantages in healthcare, a number of concerns have arisen that could derail any momentum. These concerns need to be addressed if those in healthcare want to get the most out of the IoT.


While some ways the IoT can help in healthcare are still theoretical, a good number of devices and practices are already being used to positive effect. Health monitors have become much more versatile and common, such as the remote heart rate monitors used at hospitals. What sets these monitors apart is how, via their connection through the IoT, they can communicate with other medical devices and medical personnel to send out alerts in the event that vital signs fall into dangerous territory. Another new technology associated with the Internet of Things is smart beds. These beds detect when a patient is resting on them and can notify the hospital when the patient tries to get up. Smart beds can also take data and modify the pressure being put on the patient to allow for maximum comfort and recovery.
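The alerting logic such a monitor runs can be sketched simply; the thresholds and behaviour below are illustrative only, not taken from any real medical device:

```python
# Hypothetical sketch of the alerting logic an IoT heart-rate monitor
# might run: readings stream in as (seconds, bpm) pairs, and an alert is
# raised whenever vital signs leave the safe range.

SAFE_BPM = range(50, 121)   # illustrative safe range: 50-120 bpm

def check_readings(readings):
    """Return an alert message for each reading outside the safe range."""
    alerts = []
    for t, bpm in readings:
        if bpm not in SAFE_BPM:
            alerts.append(f"t={t}s: heart rate {bpm} bpm out of range")
    return alerts

stream = [(0, 72), (30, 68), (60, 39), (90, 135)]
for alert in check_readings(stream):
    print(alert)   # in practice, pushed to nearby devices and staff
```

The IoT part is what happens after `check_readings`: the alert travels over the network to other medical devices and on-call personnel rather than staying on the monitor itself.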


Much of the aim of IoT-connected devices lies in decreasing the amount of direct face-to-face interaction patients need with their doctors. IoT devices also help to keep doctors and nurses informed by collecting and supplying large amounts of data, helping them track health information at a level never before realized. The Internet of Things can also drastically reduce healthcare device downtime by alerting manufacturers, technicians, and managers when devices are about to break down. Parts can be shipped and repair technicians summoned almost immediately, thanks to sensors embedded within the devices themselves.


While all of these breakthroughs point to added convenience and improved patient health, there is still a significant danger many experts are warning about. In light of the hacks hitting major companies over the last few years, security issues remain a top concern that has made hospitals reluctant to truly dive into the IoT’s potential. Perhaps the main overriding concern is securing health data and patient privacy. By using devices that are constantly connected to the internet and having them store, process, and transfer health data, the danger of hackers gaining possession of sensitive information is very real. In some cases, health data is more valuable on the black market than even bank account information, and the knowledge that networked devices can be vulnerable to hacking likely has many in the healthcare industry approaching the issue cautiously. Even more serious is the potential for hackers to actually infiltrate devices and use them to attack the patient. This has already been demonstrated in certain environments, and while no actual attacks have been recorded yet, the danger is real and needs to be addressed.

The Internet of Things isn’t going away anytime soon, so the healthcare industry will need to focus on improving IT security as it adopts connected health devices. This may require constant updates and patching to the devices, ensuring that any security holes are plugged up and minimizing the possibility of cyber attacks. Even the federal government is getting involved in the form of the FDA issuing guidelines and standards that all networked medical devices need to meet before being used. Some hospitals are even going as far as internally hosting data to maintain more control, although that does somewhat limit IoT devices. Whatever method is used, the Internet of Things could revolutionize the healthcare industry. Caution is needed, however, to ensure patients’ security is protected every step of the way.

Category: Information Development
