Open Framework, Information Management Strategy & Collaborative Governance | Data & Social Methodology - MIKE2.0 Methodology
by: RickDelgado
30  Jul  2015

Enterprise Storage: Keeping Up with the Data Deluge

The increasing demands found in data centers can be difficult for most people to keep up with. We now live in a world where data is being generated at an astounding pace, which has led to experts coining the phrase “big data.” All that generated data is also being collected, which creates even bigger demands for enterprise data storage. Consider all the different trends currently at work, from video and music streaming to the rise of business applications to detailed financial information and even visual medical records. It’s no wonder that storage demands have risen around 50 percent annually in the past few years, and there appears to be nothing on the horizon that will slow that growth. Companies have reason for concern as current data demands threaten to stretch their enterprise storage to its breaking point, but IT departments aren’t helpless in this struggle. This data deluge can be managed; all that’s needed are the right strategies and technologies to handle it.
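To see what 50 percent annual growth actually means for capacity planning, here is a minimal sketch; the 100 TB starting footprint is a hypothetical figure chosen purely for illustration.

```python
# Compounding effect of ~50% annual storage growth (the figure cited above).
base_tb = 100.0  # hypothetical starting capacity in terabytes
growth = 1.5     # 50% year-over-year growth

for year in range(6):
    capacity = base_tb * growth ** year
    print(f"Year {year}: {capacity:,.1f} TB")
```

At that rate, the footprint grows roughly 7.6-fold in five years, which is why a storage plan sized only for today's data falls short so quickly.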

It isn’t just the fact that so much new data needs to be stored, it’s that all the data should be stored securely while still allowing authorized personnel to access it efficiently. Combine that with the rapidly changing business environment where needs can evolve almost on a daily basis and the demands for an agile and secure enterprise storage system can overwhelm organizations. The trick is to construct infrastructure that can manage these demands. A well designed storage network can relieve many of the headaches that are generated when dealing with large amounts of data, but such a network requires added infrastructure support.

Luckily, IT departments have many options they can choose from that can meet the demands of the data deluge. One of the most popular at the moment is storage virtualization. This technology works by combining multiple network storage devices so that they appear to be a single unit. Choosing the components for a virtualized storage system, however, can be a tough decision for companies. Network attached storage (NAS), for instance, helps people within an organization access the same data at the same time. Storage area networks (SAN) help make planning and implementing storage solutions much easier. Both carry certain advantages over the more traditional direct-attached storage (DAS) deployments seen in many businesses. DAS simply comes with too many risks and downsides, making it a poor choice when confronting the current data challenges many companies face. Whether choosing NAS or SAN, both can simplify storage administration, an absolute must when storage management has become so complex. They also reduce the amount of hardware needed thanks to converged infrastructure technology.

But these strategies aren’t the only ones companies can use to keep up with enterprise storage demands. Certain administrative tactics can be deployed to handle the growing volume and complexity of the current storage scene. Part of that strategy is avoiding certain mistakes, such as storing non-critical data on costly storage devices. There’s also the problem of storing too much. In some cases, business leaders ask IT workers to store multiple copies of information, even when the extra copies aren’t needed. IT departments need to work closely with the business side of the company to devise the right strategy to avoid these unnecessary complications. By streamlining the process, it can become easier to manage storage.

Other options are also readily available to meet enterprise storage demands. Cloud storage, for example, has quickly become mainstream and comes with attractive advantages, such as easy scalability when businesses need it and the ability to access data from almost anywhere. Concerns over data security have made some businesses reluctant to adopt the cloud, but many cloud storage vendors are trying to address those worries with greater emphasis on security features. Hybrid storage solutions are also taking off in popularity in part because they mix many of the advantages found in other storage options.

With the demands large amounts of data are placing on enterprise storage, IT departments are searching for the answers that can help them keep up with these challenges. The options are there that help meet these demands, but it’s up to companies to fully deploy those solutions. Data continues to be generated at a breakneck pace, and that trend won’t be slowing down anytime soon. It’s up to organizations to have the right strategies and technology in place to take full advantage of this ongoing data deluge.


Category: Enterprise Data Management
No Comments »

by: Robert.hillard
25  Jul  2015

Don’t seek to know everything about your customer

I hate customer service surveys. Hotels and retailers spend millions trying to speed our checkout or purchase by helping us avoid having to wait around. Then they undo all of that good work by pestering us with customer service surveys which take longer than any queue that they’ve worked so hard to remove!

Perhaps I’d be less grumpy if all of the data that organisations spend so much time, much of it ours, collecting was actually applied in a way that provided tangible value. The reality is that most customer data simply goes to waste (I argue this in terms of “decision entropy” in chapter 6 of my book, Information-Driven Business).

Customer data is expensive

Many years ago, I interviewed a large bank about their data warehouse. It was the 1990s and the era of large databases was just starting to arrive. The bank had achieved an impressive feat of engineering by building a huge repository of customer data, although they admitted it had cost a phenomenal sum of money to build.

The project was a huge technical success overcoming so many of the performance hurdles that plagued large databases of the time. It was only in the last few minutes of the interview that the real issue started to emerge. The data warehouse investment was in vain, the products that they were passionate about taking to their customers were deliberately generic and there was little room for customisation. Intimate customer data was of little use in such an environment.

Customer data can be really useful but it comes at a cost. There is the huge expense of maintaining the data and there is the good will that you draw upon in order to collect it. Perhaps most importantly, processes to identify a customer and manage the relationship add friction to almost every transaction.

Imagine that you own a clothing or electrical goods store. From your vantage point behind the counter you see a customer run up to you with cash in one hand and a product in the other. They look like they’re in a hurry and thrust the cash at you. Do you a) take the cash and thank them; or b) ask them to stop before they pay and register for your loyalty programme, often including a username and password? It’s obvious you should go with option a), yet so many retailers go with option b). At least online businesses have the excuse that they can’t see the look of urgency and frustration in their customers’ eyes; it is impossible to fathom why so many bricks-and-mortar stores make the same mistake!

Commoditised relationships aren’t bad

Many people argue that Apple stores are close to best practice when it comes to retail, yet for most of the customer interaction the store staff member doesn’t know anything about the individual’s identity. It is not until the point of purchase that they actually access any purchase history. The lesson is that if the service is commoditised it is better to avoid cluttering the process with extraneous information.

Arguably the success of discount air travel has been the standardisation of the experience. Those who spend much of their lives emulating the movie Up in the Air want to be recognised. For the rest of the population, who just want to get to their destination at the lowest price possible while keeping a small amount of comfort and staying safe, a commoditised service is ideal. Given the product is not customised there is little need to know much about the individual customers. Aggregate data for demand forecasting can often be gained in more efficient ways including third party sources.

Do more with less

Online and in person, organisations are collecting more data than ever about their customers. Many of these organisations would do better to focus on a few items of data and build true relationships by understanding everything they can from this small number of key data elements. I’ve previously argued for the use of a magic 150 or “Dunbar’s number” (see The rule of 150 applied to data). If they did this, not only would they be more effective in their use of their data, they could also be more transparent about what data they collect and the purposes to which they put it.

People increasingly have a view of the value of their information and they often end up resenting its misuse. Perhaps the only thing worse than misusing it is not using it at all. So much information is collected that then causes resentment when the customer doesn’t get the obvious benefit that should have been derived. Nothing frustrates people more than having to tell their providers things that are obvious from the information they have already been asked for, such as their interests, family relationships or location.

Organisations that don’t heed this will face a backlash as people seek to regain control of their own information (see You should own your own data).

Customers value simplicity

In this age of complexity, customers are often willing to trade convenience for simplicity. Many people are perfectly happy to be a guest at the sites they use infrequently, even though they have to re-enter their details each time, rather than having to remember yet another login. They like these relationships to be cheerfully transactional and want their service providers to respect them regardless.

The future is not just more data, it is more tailored data with less creepy insight and a greater focus on a few meaningful relationships.

Category: Enterprise Data Management, Enterprise2.0, Information Development, Master Data Management
No Comments »

by: RickDelgado
21  Jul  2015

Anxious About BYOD? Here are Some Tips for Success

Has your organization caved to the pressure of establishing a Bring Your Own Device (BYOD) policy and is now having second thoughts? Making company-wide policy changes and satisfying tech-savvy employees’ desires is just the beginning. Once BYOD is up and running, there are many challenges. The difference between success and failure lies in addressing key concerns and finding ways to overcome these issues.

Mobile Device Management

Security is undoubtedly the most pressing concern with BYOD. Even with a sound policy, the rapidly shifting security landscape is a challenge, as is the constant stream of device updates. You must continually adapt your threat defenses and corporate policies. Mobile Device Management (MDM) provides many benefits, including a centralized view of data stored on devices. There are many cases of unhappy employees misusing sensitive information or hackers accessing vulnerable mobile networks. The safest approach is one where administrators can see the first signs of a breach and take action.

An MDM system provides access control and monitoring of corporate data. Information on a stolen or lost device can be immediately erased. Mobile apps have caused challenges of their own: many of them collect personal data and store it in the cloud. An important feature to look for is Mobile Application Management, which keeps track of all the apps on your mobile network and can even block ones known to be particularly risky.

Vendor Managed Services

Not every company employs all the talent it needs. A cost-effective way to offset this imbalance is to pursue vendor managed services. Consulting organizations have emerged in the mobile era that employ the technology, tools, and methods to efficiently manage data. DataXoom, a mobile virtual network operator, provides MDM, asset management, and even assistance with procuring the best hardware and software. The ultimate goal is to manage the financial cost of BYOD and of the data stored on and accessed by mobile devices.

Stay Compliant

Compliance with the latest standards is essential for keeping BYOD in your company. The Payment Card Industry Data Security Standard 3.0 is one you should be following. It provides guidelines and testing procedures related to building a secure network, protecting cardholder data, and implementing effective access control. Also covered are monitoring and testing and maintaining an information security policy that includes all devices, systems, and personnel. The PCI DSS 3.0 standard is also a guideline for internal and external auditors.

Fine Tune Your Policy

A BYOD policy isn’t static. It needs to adapt to changing security risks and company requirements. For the policy to work, you need to identify what devices are permitted on the network, and control information access down to the individual device. Administrators also need to think about password complexity, screen locking, and other security measures.
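The device allow-list and access rules described above can be reduced to a simple enrollment check. This is a hypothetical sketch: the device models, passcode length, and rule set are illustrative, not drawn from any real MDM product.

```python
# Hypothetical sketch: enforcing a BYOD allow-list and passcode policy
# at enrollment time. Models and thresholds here are illustrative only.

ALLOWED_MODELS = {"iPhone 6", "Galaxy S6", "Nexus 6"}  # example allow-list
MIN_PASSCODE_LEN = 6

def may_enroll(model: str, passcode: str, screen_lock: bool) -> bool:
    """Admit a device to the corporate network only if it meets policy."""
    return (
        model in ALLOWED_MODELS          # device type is permitted
        and len(passcode) >= MIN_PASSCODE_LEN  # passcode complexity floor
        and screen_lock                  # screen locking must be enabled
    )

print(may_enroll("iPhone 6", "482915", True))   # True
print(may_enroll("Galaxy S6", "1234", True))    # False: passcode too short
```

A real policy engine would also track OS patch levels and jailbreak status, but even this minimal gate captures the "identify what devices are permitted" step.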

Other elements of your policy should outline how technical support operates. Also specify permitted apps, rules for acceptable websites and materials, and how all of these are monitored. In addition to governing usage, your leaders should also have a plan for what happens when an employee leaves the company. Do they return the phone, or do you just remove access to email, company apps, and data?

Some organizations have instead adopted a Choose Your Own Device (CYOD) policy. Users are issued corporate-owned devices, and may or may not have a choice among approved products. This gives the company more control over compliance and security, while it pays all costs related to the device.

What about Privacy?

Today’s employees have been outspoken about their right to keep personal data on the same device as their work. The challenge is that businesses must protect their mobile networks against unauthorized use. Employer access rules have drawn controversy amongst IT policy drafters. While work-related data could be subject to legal investigations down the road, personal information would be exposed as well. The level of control over personal data has been less than ideal for many workers. Yet privacy matters still need to be addressed.


These are just a few of the main issues regarding corporate BYOD. Implementing the policy takes work, and continual monitoring and adjustment are required to keep it successful. That means your company and stakeholders must adjust to change. Security challenges, compliance requirements, employee sentiment, and the devices themselves will certainly be in flux in the years to come.

Category: Enterprise Data Management
No Comments »

by: Jonathan
09  Jul  2015

Are We Missing the Mark with Real-Time Marketing?

If any press is good press, then Totino’s can chalk up its latest Super Bowl marketing antics as a win. However, it’s questionable whether the brand will gain any true business value from live-tweeting the game a day early. In fact, marketers should step back and consider whether our obsession with vanity metrics and viral campaigns is distracting us from the true potential of real-time and data-driven marketing.


Mainstream real-time marketing


In short, real-time marketing normally refers to the practice of engaging audiences with content that is relevant to a specific current event. For most brands, this content often takes the shape of “memes” shared through social media channels.


While Totino’s day-early tweets were revealed to be a gimmick, many initially thought the company had made a significant real-time gaffe. Pre-written tweets in an attempt to be clever reflect organizations’ desire to streamline their marketing using a pre-determined formula. Better brands understand that real-time marketing has to be organic, with an understanding of the target market. Oreo’s real-time tweet during the power outage of Super Bowl XLVII was held up as a genius example of real-time marketing. While brand engagement certainly has its place, true real-time marketing that has a long-term impact on ROI is much less sexy than a clever tweet in front of a large audience.


Where real-time marketing started


While many might associate real-time marketing with the rise of big data analytics and social media, the term rose to popularity well before social media marketing and data collection took off. In fact, the term first surfaced back in 2005. Back then it wasn’t about “memes” and on-the-spot tweets, but instead web personalization.


Initially, big brands wanted to find ways to personalize their website experiences in real-time. However, the technology and software weren’t at that level, and the available solutions were often expensive and underwhelming. This eventually led to customized email marketing approaches and other methods, while web personalization was put on the back burner. Fast forward to today, and that element has all but expired, with more effort being placed on social media.


Missing the mark and taking the easy road


This is precisely where most organizations are missing the mark. Sure, Oreo’s Super Bowl tweet was amazing and produced a tremendous amount of engagement, but as mentioned earlier, real-time marketing isn’t designed for engagement. It’s meant for finding ways to create substantial long-term impacts on ROI. But that’s hard, just as it was in 2005 with web personalization. People would rather take the easy way out and point to massive amounts of social impressions instead of using data and real-time analysis to produce more value in other areas. That needs to change.


Where can real-time marketing be implemented?


There are a number of different marketing approaches that stand to benefit from a real-time approach. Here are just a few examples to get your creative juices flowing.


1. Customized landing pages


What was difficult back in 2005 is becoming a lot easier today. E-commerce sites have made the most of this, by allowing users to create personal profiles, and then offering items based on searches in real-time. This may be more difficult for other sites, but not impossible. Creating personalized landing pages based on devices used or user preferences is becoming increasingly common. Real-time abilities allow programs to make these changes on the fly, reacting to clicks and searches almost instantly.
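The on-the-fly personalization described above can be sketched in a few lines. This is a hypothetical illustration, not a real e-commerce API: the device types, layout names, and fallback category are all invented for the example.

```python
# Hypothetical sketch: selecting landing-page content in real time from
# a visitor's device type and most recent on-site searches.

def pick_landing_variant(device: str, recent_searches: list[str]) -> dict:
    """Return a content variant tailored to the visitor's context."""
    # Mobile visitors get a lighter, single-column layout.
    layout = "single-column" if device == "mobile" else "three-column"

    # Promote whatever the visitor searched for most recently, if anything.
    hero = recent_searches[-1] if recent_searches else "bestsellers"
    return {"layout": layout, "hero_category": hero}

print(pick_landing_variant("mobile", ["running shoes", "rain jackets"]))
# {'layout': 'single-column', 'hero_category': 'rain jackets'}
```

In production this decision would run on every page request, which is exactly the real-time reaction to clicks and searches the paragraph describes.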


2. Location-based marketing


Thanks to mobile technology, primarily smartphones and their built-in location services, marketers have the ability to tailor messages based on location like never before. For example, if users are near a Walmart or Target, promotions could be pushed to their devices via notifications. Marketers can also use the technology to see where users shop most often, or use in-store beacons to attract shoppers. By utilizing real-time capabilities, marketers can craft individualized offers and have them activated at the right moment, when users are likely to act.
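The "push when nearby" trigger amounts to a geofence check: compute the distance between the user and the store and fire when it falls inside a radius. A minimal sketch, with coordinates and radius chosen purely for illustration:

```python
import math

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_push_offer(user_pos, store_pos, radius_km=0.5):
    """Trigger a promotion when the user is inside the store's geofence."""
    return distance_km(*user_pos, *store_pos) <= radius_km

# A user roughly 200 m from a (hypothetical) store falls inside a 500 m fence.
print(should_push_offer((40.7580, -73.9855), (40.7598, -73.9855)))  # True
```

Real deployments use the platform's geofencing services rather than polling raw coordinates, but the distance test is the core of the idea.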


3. Multi-channel marketing


The path to making a purchase is becoming increasingly complex. In times past, it simply involved a trip to the local store or maybe a catalogue. Today, it often includes visiting websites, browsing social media accounts, and viewing mobile sites. To meet these demands, brands are forced to customize approaches for each of their channels, allowing tailored marketing efforts for each channel while still maintaining a seamless experience as users jump from one to the other. Real-time analytics can provide organizations with constant details about which channels customers are using and what they’ll respond best to in order to increase conversion rates.


Category: Information Development
No Comments »

by: Bsomich
28  Jun  2015

MIKE2.0 Community Update

Interested in better data management? See what’s been happening in the MIKE2.0 community this month:



Have you seen our Open MIKE Series? 

The Open MIKE Podcast is a video podcast show which discusses aspects of the MIKE2.0 framework, and features content contributed to MIKE 2.0 Wiki Articles, Blog Posts, and Discussion Forums.

You can scroll through the Open MIKE Podcast episodes below:

For more information on MIKE2.0 or how to get involved with our online community, please visit


MIKE2.0 Community  

Contribute to MIKE: Start a new article, help with articles under construction, or look for other ways to contribute.

Update your personal profile to advertise yourself to the community and interact with other members.

Useful Links:
Home Page
Content Model
MIKE2.0 Governance

Did You Know?
All content on MIKE2.0 and any contributions you make are published under the Creative Commons license. This allows you free re-use of our content as long as you add a brief reference back to us.


This Week’s Blogs for Thought:

Key Considerations for a Big Data Strategy

Most companies by now understand the inherent value found in big data. With more information at their fingertips, they can make better decisions regarding their businesses. That’s what makes the collection and analysis of big data so important today. Any company that doesn’t see the advantages that big data brings may quickly find themselves falling behind their competitors.

Read more.

The Internet was a mistake. Now let’s fix it. 

Each generation over the last century has seen new technologies that become so embedded in their lives that their absence would be unimaginable. Early in the 20th century it was radio, which quickly became the entertainment of choice, then television, video and over the past two decades it has been the Internet. For the generation who straddles the implementation of each, there have been format and governance debates which are quickly forgotten. Today, few remember the colour television format choice every country made between NTSC and PAL, just as anyone who bought a video recorder in the early 1980s had to choose between VHS and Beta.

Read more.

Flash Quietly Taking Over Disk in a Big Data World

Right now, we live in the big data era. What was once looked at as a future trend is now very much our present reality. Businesses and organizations of all shapes and sizes have embraced big data as a way to improve their operations and find solutions to longstanding problems. It’s almost impossible to overstate just how much big data has impacted the world in such a short amount of time, affecting everyone’s life whether we truly comprehend how.

Read more.

Forward to a Friend!

Know someone who might be interested in joining the MIKE2.0 Community? Forward this message to a friend.


If you have any questions, please email us at



Category: Information Development
No Comments »

by: Robert.hillard
20  Jun  2015

The Internet was a mistake, now let’s fix it

Each generation over the last century has seen new technologies that become so embedded in their lives that their absence would be unimaginable. Early in the 20th century it was radio, which quickly became the entertainment of choice, then television, video and over the past two decades it has been the Internet.

For the generation who straddles the implementation of each, there have been format and governance debates which are quickly forgotten.  Today, few remember the colour television format choice every country made between NTSC and PAL just as anyone who bought a video recorder in the early 1980s had to choose between VHS and Beta.

It is ironic that arguably the biggest of these technologies, the Internet, has been the subject of the least debate on the approach to governance, standards and implementation technology.

Just imagine a world where the Internet hadn’t evolved in the way it did. Arguably the connectivity that underpins the Internet was inevitable. However, the decision to arbitrarily open up an academic network to commercial applications undermined well-progressed private sector offerings such as AOL and Microsoft’s MSN.

That decision changed everything and I think it was a mistake.

While the private sector offerings were fragmented, they were well governed and with responsible owners.

Early proponents of the Internet dreamed of a virtual world free of any government constraints.  Perhaps they were influenced by the end of the Cold War.  Perhaps they were idealists.  Either way, the dream of a virtual utopia has turned into an online nightmare which every parent knows isn’t safe for their children.

Free or unregulated?

The perception that the Internet is somehow free, in a way that traditional communications and sources of information are not, is misguided.

Librarians have long had extraordinary codes of conduct to protect the identity of borrowers from government eyes.  Compare that to the obligation in many countries to track metadata and the access that police, security agencies and courts have to the online search history of suspects.

Telephone networks have always been open to tapping, but the closed nature of the architecture meant that those points are governed and largely under the supervision of governments and courts.  Compare that to the Internet which does theoretically allow individuals to communicate confidentially with little chance of interception but only if you are one of the privileged few with adequate technical skill.  The majority of people, though, have to just assume that every communication, voice, text or video is open to intercept.

Time for regulation

We need government in the real world and we should look for it on the Internet.

The fact that it is dangerous to connect devices directly to the internet without firewalls and virus protection is a failure of every one of us who is involved in the technology profession.  The impact of the unregulated Internet on our children and the most vulnerable in our society reflects poorly on our whole generation.

It is time for the Internet to be properly regulated. There is just too much risk, and (poor) regulation is being put in place by stealth anyway. Proper regulation and security would add a layer of protection for all users. It wouldn’t remove all risk, but even the humble telephone has long been used as a vehicle for scams; however, remedies there have been easier to achieve and law enforcement more structured.

The ideal of the Internet as a vehicle of free expression need not be lost, and in fact can be enhanced by ethically motivated governance with the principle of free speech at its core.

Net neutrality is a myth

Increasing the argument for regulation is the reality of the technology behind the Internet.  Most users assume the Internet is a genuinely flat virtual universe where everyone is equal.  In reality, the technology of the Internet is nowhere near the hyperbole.  Net neutrality is a myth and we are still very dependent on what the Internet Service Providers (ISPs) or telecommunications companies do from an architecture perspective (see The architecture after cloud).

Because the Internet is not neutral, there are winners and losers just as there are in the real world.  The lack of regulation means that they come up with their own deals and it is simply too complicated for consumers to work out what it all means for them.

Regulation can solve the big issues

The absence of real government regulation of the Internet is resulting in a “Wild West” and an almost vigilante response.  There is every probability that current encryption techniques will be cracked in years to come, making it dangerous to transmit information that could be embarrassing in the future.  This is leading to investment in approaches such as quantum cryptography.

In fact, with government regulation and support, mathematically secure communication is eminently possible.  Crypto theory says that a truly random key that is as long as the message being sent cannot be broken without a copy of the key.  Imagine a world where telecommunication providers working under appropriate regulations issued physical media similar to passports containing sufficient random digital keys to transmit all of the sensitive information a household would share in a year or even a decade.
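The crypto theory invoked here is the one-time pad: XOR the message with a truly random key of equal length, used once. A minimal sketch of the idea (the "physical media of random digital keys" the paragraph imagines would simply be a large store of such key bytes):

```python
import secrets

def otp_encrypt(message: bytes, key: bytes) -> bytes:
    """XOR the message with a random key at least as long as the message.

    With a key that is truly random, kept secret, and never reused, the
    ciphertext is information-theoretically unbreakable; decryption is
    the same XOR applied again."""
    assert len(key) >= len(message), "key must be at least as long as the message"
    return bytes(m ^ k for m, k in zip(message, key))

msg = b"meet at noon"
key = secrets.token_bytes(len(msg))   # fresh random key, one byte per message byte
ciphertext = otp_encrypt(msg, key)
assert otp_encrypt(ciphertext, key) == msg  # round-trips back to the plaintext
```

The practical obstacle is exactly the one the paragraph addresses: distributing and safeguarding that much key material, which is why the author proposes passport-like physical media issued under regulation.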

We would effectively be returning to the model of traditional phone services where telecommunication companies managed the confidentiality of the transmission and government agencies could tap the conversations with appropriate (and properly regulated) court supervision.

Similarly, we would be mirroring the existing television and film model of rating all content on the Internet allowing us to choose what we want to bring into our homes and offices.  Volume is no challenge with an army of volunteers out there to help regulators.

Any jurisdiction can start

Proper regulation of the internet does not need to wait for international consensus.  Any one country can kick things off with almost immediate benefit.  As soon as sufficient content is brought into line, residents of that country will show more trust towards local providers which will naturally keep a larger share of commerce within their domestic economy.

If it is a moderately large economy then the lure of frictionless access to these consumers will encourage international content providers to also fall into line given the cost of compliance is likely to be negligible.  As soon as that happens, international consumers will see the advantage of using this country’s standards as a proxy for trust.

Very quickly it is also likely that formal regulation in one country will be leveraged by governments in others.  The first mover might even create a home-grown industry of regulation as well as supporting processes and technology for export!

Category: Information Governance
No Comments »

by: Jonathan
19  Jun  2015

Key Considerations for a Big Data Strategy

Most companies by now understand the inherent value found in big data. With more information at their fingertips, they can make better decisions regarding their businesses. That’s what makes the collection and analysis of big data so important today. Any company that doesn’t see the advantages that big data brings may quickly find themselves falling behind their competitors. To benefit even more from big data, many companies are employing big data strategies. They see that it is not enough to simply have the data at hand; it must be utilized in the most effective manner to maximize its potential. Coming up with the best big data strategy, however, can be difficult, especially since every organization has different needs, goals, and resources. When creating a big data strategy, it’s important for companies to consider several main issues that can greatly affect its implementation.

When first developing a big data strategy, businesses will need to look at the current company culture and change it if necessary. This essentially means encouraging employees throughout the whole organization to get into the spirit of embracing big data. That includes people on the business side of things along with those in the IT department. Big data can change the way things are done, and those who are resistant to those changes could be holding the company back. For that reason, they should be encouraged to be more open about the effect of big data and ready to accept any changes that come about. Organizations should also encourage their employees to be creative with their big data solutions, fostering an atmosphere of experimentation and a willingness to take more risks.

As valuable as big data can be, simply collecting it for its own sake will often result in failure. Every big data strategy needs to account for specific business objectives and goals. By identifying precisely what they want to do with their data, companies can enact a strategy that drives toward that objective. This makes the strategy more effective, allowing organizations to avoid wasting money and resources on efforts that won’t benefit the company. Knowing the business objectives of a big data strategy also helps companies identify which data sources to focus on and which to steer clear of.

It’s the value that big data brings to an organization that makes it so crucial to use it properly. When creating a big data strategy, businesses need to make sure they view big data as a company-wide asset, one which everyone can use and take advantage of. Too often big data is seen as something meant solely for the IT department, but it can, in fact, benefit the organization as a whole. Big data shouldn’t be exclusive to only one group within a company. On the contrary, the more departments and groups can use it, the more valuable it becomes. That’s why big data strategies need a bigger vision for how data can be used, looking ahead to the long term and avoiding narrowly defined plans. This allows companies to dedicate more money and resources toward using big data, which helps them innovate and create new opportunities.

Another point all organizations need to consider is the kind of talent present in their companies. Data scientists are sought by businesses the world over because they can provide a significant boost toward accomplishing established big data business goals. Data scientists differ from data analysts in that they can build new data models, whereas analysts can only use models that have been pre-built. As part of a big data strategy, the roles and responsibilities of data scientists need to be properly defined, giving them the opportunity to help the organization achieve the stated business objectives. Finding a good data scientist with skills in big data platforms and ad hoc analysis appropriate to the industry can be difficult with demand so high, but the value they can add is well worth it.

An organized and thoughtful big data strategy can often mean the difference between successful use of big data and a lot of wasted time, effort, and resources. Companies have a number of key considerations to account for when crafting their own strategies, but with the right mindset, they’ll know they have the right plans in place. Only then can they truly gain value from big data and propel their businesses forward.

Category: Business Intelligence
No Comments »

by: RickDelgado
18  Jun  2015

Flash Quietly Taking Over Disk in a Big Data World

Right now, we live in the big data era. What was once looked at as a future trend is now very much our present reality. Businesses and organizations of all shapes and sizes have embraced big data as a way to improve their operations and find solutions to longstanding problems. It’s almost impossible to overstate just how much big data has impacted the world in such a short amount of time, affecting everyone’s lives whether or not we fully realize it. That means we live in a world awash in data, and as companies pursue their own big data strategies, they’ve had to rethink how to store all that information. Traditional techniques have proven unable to handle the huge amount of data being generated and collected on a daily basis. What once was dominated by hard disk drives (HDD) is now rapidly changing into a world driven by solid-state drives (SSD), otherwise known as flash storage.

For years, when talking of big data analytics, the assumption was that a business was using disk. There were several reasons for this, the main one being cost. Hard disk drives were simply cheaper, and for the most part they could deal with the increasing workloads placed upon them. As more data was generated and measured, however, the limitations of HDDs were exposed. This new big data world needed a storage system capable of handling the workload, and thus the migration to flash storage began.

Many, including Gartner, peg 2013 as the year the switch really gained steam. Solid-state arrays had been available before then, but in 2013 flash storage manufacturers began building arrays with new features like thin provisioning, deduplication, and compression. Suddenly, the benefits gained from using flash storage outweighed some of the drawbacks, most notably the higher cost. In a single year, solid-state arrays saw a surge in sales, increasing by more than 180 percent from 2012. With the arrival of flash storage in the mainstream, organizations could begin to replace their hard disk drives with a system more capable of processing big data.
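The deduplication feature mentioned above can be understood with a toy sketch: incoming blocks are hashed, and only one copy of each unique block is actually stored, while a "recipe" of hashes is kept to reconstruct the original stream. This is a simplified illustration of the general technique, not any vendor's actual implementation; the function names are invented for this example.

```python
import hashlib

def dedupe(blocks, store=None):
    """Store each unique block once, keyed by its SHA-256 digest.

    Returns (recipe, store): the recipe is the ordered list of digests
    needed to reconstruct the original stream of blocks.
    """
    store = {} if store is None else store
    recipe = []
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # write the block only if unseen
        recipe.append(digest)
    return recipe, store

def rehydrate(recipe, store):
    """Reassemble the original stream of blocks from the recipe."""
    return [store[d] for d in recipe]

# Four blocks arrive, but only two are unique, so only two are stored.
blocks = [b"alpha", b"beta", b"alpha", b"alpha"]
recipe, store = dedupe(blocks)
```

In a real array the blocks would be fixed-size extents and the store would live on flash, but the space saving comes from the same idea: repeated blocks cost only the size of a hash entry.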

And that’s really a main reason why flash storage has caught on so quickly. SSDs provide much higher performance than traditional storage options. Of particular note is the reduction in the time it takes to process data. Just one example of this is the experience of the Coca-Cola Bottling Co., which began collecting big data but was soon met with long delays in production due to having to sort through loads of new information. When the company adopted flash storage solutions, the amount of time needed to process data was cut dramatically. For example, processing jobs that had taken 45 minutes now took only six. These kinds of results aren’t unique, which is why so many other businesses are choosing flash storage as their primary means of storing big data.

Many tech companies are responding to this increased demand by offering up more options in flash storage. SanDisk has recently unveiled new flash systems specifically intended to help organizations with their efforts in big data analytics. The new offerings are meant to be an alternative to the tiered storage often seen in data centers. Other major tech companies, such as Dell, Intel, and IBM, have shown similar support for flash storage, indicating the lucrative nature of offering flash solutions. The growth isn’t just being driven by private companies either; educational institutions have found a need for flash storage as well. MIT researchers announced last year that they would be switching from hard disks to flash storage in order to handle the demands of big data more effectively. The researchers determined that hard disk drives were too slow, so a better performing storage solution was needed.

As can be seen, flash storage has been quietly but surely taking over hard disk’s turf. That doesn’t mean hard disk drives will soon be gone for good. HDD will likely still be used for offline storage — mainly archiving purposes for data that doesn’t need to be accessed regularly. But it’s clear we’re moving into a world where solid-state drives are the most prevalent form of storage. The need to collect and process big data is making that happen, providing new, unique opportunities for all kinds of organizations out there.

Category: Enterprise Data Management
No Comments »

by: Robert.hillard
29  May  2015

Who are we leaving behind?

I was recently invited to deliver the keynote address at the University of Melbourne engineering and IT awards night.  I took the opportunity to challenge today’s students to think about the people being left behind in the move to a digital economy.

Some 25 years ago I had the privilege of attending this university.  In thinking about tonight and the achievements of so many of you, I couldn’t help reflecting on the challenges my generation faced entering professional life in the 1990s and comparing them to what you will see in the decades ahead.

Wherever you turn in the media you are hearing the term “digital disruption”.  For those of us lucky enough to be educated in the so-called STEM disciplines of Science, Technology, Engineering and Mathematics, we probably feel empowered and excited by the talk of “digital disruption”.  How could we not look forward to using the technology of our time to replace the cars we drive with electric vehicles, our plastic credit cards with mobile wallets or the work cubicle of my generation with working flexibly at a local café?

We must not forget that the same sense of optimism is not necessarily shared across societies around the world.  At a macro level, where there is technological disruption, business disruption follows.  But where there is business disruption, social disruption is all too often the result.

Already email has all-but killed traditional mail.  Mainstream bookstores are a shadow of their former selves.  I can’t remember the last time I was in a record or CD store.  Hardly any businesses worry about advertising in the telephone directory.  Each of these changes has cost jobs and not everyone who lost out has found a role in the changing economy.

We have a responsibility to make sure that change is not only good for those of us lucky enough to have access to the right skills, but also to make change good for society as a whole.  Future generations will judge us by how we navigate the next two decades.

Change leaves winners and losers

We should be under no illusion: the changes we are going to go through in transport, energy and, particularly, financial services will leave far more people at a disadvantage than any change we have seen so far in the era of the Internet.

Does it matter that the next wave of innovation could see the end of local media content rules?  Does it matter that department stores could be wiped out in Australia?  Does it matter that banks could be taken to the brink by a wave of fintech innovators, including peer-to-peer lending?

Take the last of these; replacing banks sounds an awful lot like risking the failure of the banks.  Whether it was in the nineteenth century, twentieth century or most recently in the GFC in this century, whenever banks have gone down many ordinary people have been badly hurt.

Disruption spurred on by digital technology is extending into mainstream engineering.  Batteries, for example, may finally fill the gap in solar energy by providing stored base-load power for when the sun isn’t shining, while electric vehicles hardly need servicing.  We could be just a few years away from houses going off-grid en masse.  These changes will leave car dealerships without a source of service income and power utilities without a market.

The concentration of wealth

The birth of the commercial Internet offered an opportunity for small business to compete with large companies with many of the advantages of scale and geography being removed.  At the turn of the century we saw an opportunity for a utopia of innovation spread across the globe with the rewards reaching far more people than ever before.

20 years in and the reality is not entirely aligned to this vision.  The network effect means that there are advantages to scale.  The more people that use the same search engine, the better its algorithms.  The more people that use the same music service the better its catalogue.  The more people that use the same social network, the greater its reach.

However, the evidence is that the groundswell of innovation is turning the tide, with new money pouring into start-ups regardless of location and value being added in more locations than ever before.  That is where this room can lift its sights.  Seek to add value to the local economy where you and your family want to live and where you can see a society that you want to build.  Don’t be afraid to resist the pull to traditional centres of innovation. Don’t just think of your personal wealth but also of where your effort will contribute to your community.

Using technology as part of the solution

We in this room have the ability to channel our knowledge, skills and innovative flair not only to develop new applications of technology but also to corral and encourage the application of technology in such a way as to minimise unintended consequences and potentially achieve new benefits for our society.

We can choose to enable a sharing economy, created through yet more innovations such as 99designs, Kaggle, Freelancer, Airbnb, DriveMyCar, and so many more, which provide income to a wider range of people using their available talent and resources.

We can choose to support local talent through the development of great ideas and then seeing them through to commercial success.  We can use automation and 3D printing to enable local manufacturing.  We can develop specialised services which are ready to be purchased by government and industry so that they are less dependent on imports.

I refuse to believe that we can’t use technology to improve access to capital while maintaining a safe financial system.  I am convinced that we can find better ways to access products and services without doing away with storefronts.  I know that we can make the move from fossil fuels to renewables while keeping a highly skilled engineering capability locally.

In conclusion

You will face different professional challenges to those that I faced as I approached the end of my education at this university.  You will be the shapers of society in the decades ahead.  You will help decide whether to throw ourselves headlong into technology-driven disruption or whether to keep a watch out for those who are left behind. You will decide whether you will be drawn into ever greater geographic concentration of innovation or if you will take the path to keeping value in the community you want to be part of.

I hope you will seek to make the right choices with the education you have worked so hard to earn.


Category: Enterprise2.0
No Comments »

by: Bsomich
18  May  2015

MIKE2.0 Community Update




Missed what’s been happening in the MIKE2.0 data management community? Check out our latest update:


How Do You Define Your Master Data? 

There are numerous definitions for “master data” ranging from one sentence to a few paragraphs.  This is perhaps the most straightforward one I’ve come across:

Master data is the core data that is essential to operations in a specific business or business unit. - via

A clear and simple definition, yet a lot of companies often struggle to adhere to it when identifying and qualifying master data for their organizations.

Why do you think this is?

Although data is often looked at on a transactional basis, master data typically makes up a large percentage of the data elements in any given transaction. Common examples of master data include:

  • Customer data (name, contact details, DOB, customer classification)
  • Locality data (physical address, postal address, geographical data)
  • Product data (item number, bill of materials, product codes)
  • Employee data (employee number, role, placement in organisational structure)
  • Partner data (partner name, classification)

It is not unusual for this same data to be held in dozens or even hundreds of applications across a large organization, where it may be difficult to isolate and collect.  Much of it has sat in legacy systems for years, often poorly integrated and at low levels of quality.  Many organizations also have poorly implemented data governance processes to handle changes in this data over time.
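To make the idea concrete, the entity types listed above can be sketched as simple records. This is purely illustrative: the field names are assumptions for this example, not a MIKE2.0 schema, and real master data models carry far more attributes and matching rules.

```python
from dataclasses import dataclass

@dataclass
class Customer:
    customer_id: str       # stable master identifier
    name: str
    contact_email: str
    date_of_birth: str
    classification: str    # e.g. retail, wholesale

@dataclass
class Product:
    item_number: str
    description: str
    product_code: str

# The same customer often arrives from several source systems with
# slightly different keys and formatting; even a naive match on a
# normalized attribute shows why consolidation rules are needed.
crm = Customer("C-001", "Jane Doe", "jane@example.com", "1980-01-01", "retail")
billing = Customer("BILL-77", "JANE DOE", "jane@example.com", "1980-01-01", "retail")
same_person = crm.contact_email.lower() == billing.contact_email.lower()
```

In practice a master data management tool would apply survivorship and matching rules across many attributes rather than a single email comparison, but the core problem is the same: the same real-world entity exists under different identifiers in different systems.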

MIKE2.0 offers an open source solution for managing master data that outlines many of the issues organizations have with identifying it.

We hope you find this offering of benefit and welcome any suggestions you may have to improve it.


MIKE2.0 Community

Popular Content

Did you know that the following wiki articles are most popular on Google? Check them out, and feel free to edit or expand them!

What is MIKE2.0?
Deliverable Templates
The 5 Phases of MIKE2.0
Overall Task List
Business Assessment Blueprint
SAFE Architecture
Information Governance Solution

Contribute to MIKE:

Start a new article, help with articles under construction or look for other ways to contribute

Update your personal profile to advertise yourself to the community and interact with other members.

Useful Links:
Home Page
Content Model
MIKE2.0 Governance


This Week’s Blogs for Thought:

5 Unusual Ways Businesses are Using Big Data

Big data is where it’s at. At least, that’s what we’ve been told. So it should come as no surprise that businesses are busy imagining ways they can take advantage of big data analytics to grow their companies. Many of these uses are fairly well documented, like improving marketing efforts, or gaining a better understanding of their customers, or even figuring out better ways to detect and prevent fraud. The most common big data use cases have become an important part of industries the world over, but big data can be used for much more than that. In fact, many companies out there have come up with creative and unusual uses for big data analytics, showing just how versatile and helpful big data can be.

Read more.

Cloud Computing and the Industries that Love It

Cloud computing provides greater security, virtually unlimited computing resources for research and development, cost savings, and advanced threat detection methods. With so many reasons to use cloud computing, it’s no wonder many industries have flocked to the new technology. Cloud technology serves as a form of outsourcing for companies, where some data is kept in house for better control, and other data is trusted to a third-party provider. Each industry that benefits from cloud computing has its own specific reasons for adopting the technology, but cloud computing is most profitable for companies that work with emerging markets and need quick and cost-effective scalability.

Read more.
Is Your Data Quality Boring? 

Let’s be honest here. Data Quality is good and worthy, but it can be a pretty dull affair at times. Information Management is something that “just happens”, and folks would rather not know the ins-and-outs of how the monthly Management Pack gets created. Yet I’ll bet that they’ll be right on your case when the numbers are “wrong.” Right?

So here’s an idea. The next time you want to engage someone in a discussion about data quality, don’t start by discussing data quality. Don’t mention the processes of profiling, validating or cleansing data. Don’t talk about integration, storage or reporting. And don’t even think about metadata, lineage or auditability.

Read more.

Questions?

If you have any questions, please email us at 


Category: Information Development
No Comments »
