by: Bsomich
20  Feb  2015

MIKE2.0 Community Update

Missed what’s been happening in the MIKE2.0 community? Read on!

 

  

Business Drivers for Better Metadata Management

There are a number of Business Drivers for Better Metadata Management that have caused metadata management to grow in importance over the past few years at most major organisations. These organisations are focused on more than just a data dictionary across their information – they are building comprehensive solutions for managing business and technical metadata.

Our wiki article on the subject explores many of the factors contributing to the growth of metadata and offers guidance on how to manage it better.

Feel free to check it out when you have a moment.

Sincerely,

MIKE2.0 Community

Contribute to MIKE:

Start a new article, help with articles under construction or look for other ways to contribute

Update your personal profile to advertise yourself to the community and interact with other members.

Useful Links:
Home Page
Login
Content Model
FAQs
MIKE2.0 Governance


Did You Know?
All content on MIKE2.0 and any contributions you make are published under the Creative Commons license. This allows you free re-use of our content as long as you add a brief reference back to us.

 

This Week’s Blogs for Thought:

Keeping Big Data Secure: Should You Consider Data Masking?  

Big data is a boon to every industry. And as data volumes continue their exponential rise, the need to protect sensitive information from being compromised is greater than ever before. The recent data breach of Sony Pictures, and new national threats from foreign factions serve as a cautionary tale for government and private enterprise to be constantly on guard and on the lookout for new and better solutions to keep sensitive information secure.

Read more.

The Architecture After Cloud

I think that Zach Nelson (Netsuite’s CEO) was wrong when he said that “cloud is the last computing architecture” but I also believe that his quote is a healthy challenge to take computing and business architectures to a new level. Nelson went on to say “I don’t think there is anything after it (cloud). What can possibly be after being able to access all of your data any time, anywhere and on any device? There is nothing.”  His comments are available in full from an interview with the Australian Financial Review. Aside from cloud, our industry has had a range of architectures over the decades including client/server, service oriented architecture (SOA) and thin client.  Arguably design patterns such as 4GLs (fourth generation languages) and object oriented programming are also architectures in their own right. I think that we can predict attributes of the next architecture by looking at some of the challenges our technologies face today.

Read more.

MSP Cloud Computing Strategies to Consider in 2015

Managed service providers (MSPs) have some difficult decisions to make in the coming year. Many of the pressing questions they’re facing revolve around cloud computing, as the cloud has become a technology now being embraced by mainstream businesses of all sizes and types. Years ago, the dilemma surrounding the cloud was a relatively easy one to address, especially when clients were asking what cloud computing is and how it could ultimately benefit their organizations.

Read more.

  Forward this message to a friend

 

 

Category: Information Development

by: RickDelgado
06  Feb  2015

MSP Cloud Computing Strategies to Consider in 2015

Managed service providers (MSPs) have some difficult decisions to make in the coming year. Many of the pressing questions they’re facing revolve around cloud computing, as the cloud has become a technology now being embraced by mainstream businesses of all sizes and types. Years ago, the dilemma surrounding the cloud was a relatively easy one to address, especially when clients were asking what cloud computing is and how it could ultimately benefit their organizations. That was then, but Gartner now predicts that by 2016, organizations will store more than a third of their content on the cloud. Most clients are serious about adopting the technology, and they want their MSPs to make it happen. That means MSPs need to make sure their clients get it in 2015, but there’s no foolproof way to do it. The following are several strategies and methods to consider for successfully fulfilling clients’ demands for the cloud.

 

The first thing every managed service provider needs is a plan. That may seem unnecessarily basic, but many MSPs don’t have a plan in place before delivering cloud computing to the client. MSPs need to craft a roadmap, something that will guide them through every step of the process. This isn’t just common sense; it saves plenty of time and resources the further into the strategy you go. In many cases, planning can mean the difference between a successful cloud strategy and one that fails the client.

 

Of course, a plan is only the first step MSPs need to take. The next is to identify the type of strategy that works best for the client. Whatever the choice, it’s important for managed service providers to make clear to the client what will happen, then deliver on those promises. In any scenario, the quality of service should still rank as a top priority, showing the client the worth of the MSP during a time of transition. When it comes to offering a cloud computing service, there are a number of different strategies to think about. The first is a risk-taking strategy: an MSP that takes this route forgoes partnering with any cloud provider and instead builds a private cloud. The main advantage of this strategy is that it allows the MSP to retain absolute control over every aspect of the cloud. The downside is financial in nature: building a cloud is a sizeable investment, and it will take a long time (around three years) to see a return on that investment.

 

A second strategy is more conservative in nature. The conventional method is to partner with a major cloud provider like Amazon or Microsoft to make the cloud a reality for the client. This strategy makes it easier for clients to buy into cloud computing, plus there’s very little risk. The disadvantage for the MSP is that competition among these cloud providers is intense and growing increasingly brutal. The third strategy is known as the trailblazer option, in which the MSP seeks out a white-label provider. This means the MSP can offer enterprise-grade infrastructure at a low cost while maintaining a more agile operation. This added flexibility is particularly attractive to clients.

 

Once a strategy is decided upon, it’s time to start the process of cloud onboarding. This part of the plan can also be difficult and fraught with pitfalls, but three steps can maximize the chance of success. First, MSPs need to properly analyze the business, technical, and application capabilities of the client, which will help the MSP know how to proceed. Second, the transition phase lays out a blueprint, mapping out expectations as the move to the cloud is made and pointing out the overall impact it will have on the organization. Third, once the onboarding is complete, MSPs should run ongoing performance analyses in order to optimize the applications that are now on the cloud.

 

Offering the cloud to clients can be a risky endeavor, but it’s a risk worth taking. Clients are much more interested in cloud computing than they were a few years ago, and managed service providers need to recognize this significant change. By following some of these steps and preparing the right strategy, MSPs can make sure the move to the cloud is a smooth and productive one.

 

Category: Information Development

by: Robert.hillard
25  Jan  2015

The architecture after cloud

I think that Zach Nelson (Netsuite’s CEO) was wrong when he said that “cloud is the last computing architecture” but I also believe that his quote is a healthy challenge to take computing and business architectures to a new level.

Nelson went on to say “I don’t think there is anything after it (cloud). What can possibly be after being able to access all of your data any time, anywhere and on any device? There is nothing.”  His comments are available in full from an interview with the Australian Financial Review.

Aside from cloud, our industry has had a range of architectures over the decades including client/server, service oriented architecture (SOA) and thin client.  Arguably design patterns such as 4GLs (fourth generation languages) and object oriented programming are also architectures in their own right.

I think that we can predict attributes of the next architecture by looking at some of the challenges our technologies face today.  Some of these problems have defined solutions while others will require inventions that aren’t yet on the table.

Security

Near the top of any list is security.  Our Internet-centred technology is increasingly exposed to viruses, hackers and government eavesdroppers.

One reason that so much of our technology is vulnerable is that most operating systems share code libraries between applications.  The most vulnerable application can leave the door open for malicious code to compromise everything running on the same machine.  This is part of the reason that the Apple ecosystem has been less prone to viruses than the Windows platform.  Learning from this, it is likely that the inefficiency of duplicating code will be a small price to pay for siloing applications from each other to reduce the risk of cross-infection.

At the same time our “public key” encryption is regarded by many as being at risk from increasing computing power for brute force code cracking and even the potential of future quantum computers.

Because there is no mathematical proof that encryption based on the factoring of large numbers won’t be cracked in the future, it can be argued to be unsuitable for a range of purposes such as voting.  A future architecture might consider more robust approaches such as the physical sharing of ciphers.

Privacy

Societies around the world are struggling with defining what law enforcement and security agencies should and shouldn’t have access to.  There is even a growing debate about who owns data.  As different jurisdictions converge on “right to be forgotten” legislation, and societies agree on what back-door keys to give to various agencies, future architectures will be key to simplifying the management of the agreed approaches.

The answers will be part of future architectures with clearer tracking of metadata (and indeed definitions of what metadata means).  At the same time, codification of access to security agencies will hopefully allow users to agree with governments about what can and can’t be intercepted.  Don’t expect this to be an easy public debate as it has to navigate the minefield of national boundaries.

Network latency

Another issue that is topical to application designers is network latency.  Despite huge progress in broadband across the globe, network speeds are not increasing at the same rate as other aspects of computing such as storage, memory or processing speeds.  What’s more, we are far closer to fundamental limits of physics (the speed of light) when managing the transmission from servers around the world.  Even the most efficient link between New York and London would mean the round-trip for an instruction response is 0.04 seconds at a theoretical maximum of the speed of light (with no latency for routers or a network path that is not perfectly straight).  In computing terms 0.04 seconds is the pace of a snail!
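
As a rough sanity check on that figure, here is a minimal back-of-the-envelope calculation; the great-circle distance used is an approximation, and real fibre paths are both longer and slower than light in a vacuum:

```python
# Back-of-the-envelope check on the New York to London round trip,
# assuming a perfectly straight path and light travelling in a vacuum.
SPEED_OF_LIGHT_KM_S = 299_792      # km per second
NY_TO_LONDON_KM = 5_570            # approximate great-circle distance

round_trip_s = (2 * NY_TO_LONDON_KM) / SPEED_OF_LIGHT_KM_S
print(f"Theoretical minimum round trip: {round_trip_s:.3f} seconds")
# Prints roughly 0.037 seconds -- about the 0.04 seconds quoted above,
# before adding any time for routers or a path that is not perfectly straight.
```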

The architectural solution has already started to appear with increasing enthusiasm for caching and on-device processing.  Mobile apps are a manifestation of this phenomenon, which is also sometimes called edge computing.

The cloud provides the means to efficiently synchronise devices across the network, combining cheap and powerful on-device processing with masses of data right on hand.  What many people don’t realise is that Internet Service Providers (ISPs) are already doing this by caching popular YouTube and other content, which is why some videos seem to take forever while others play almost instantly.
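
As a toy illustration of the caching idea described above, here is a minimal sketch; the URLs and the fetch_from_origin function are invented for the example, and the sleep simply stands in for the long-haul round trip:

```python
import time

# A toy edge cache: serve content from a local store when possible and only
# cross the long-latency link on a miss.
local_cache: dict = {}

def fetch_from_origin(url: str) -> bytes:
    """Hypothetical stand-in for a request to a distant server."""
    time.sleep(0.04)                      # simulate ~40 ms of round-trip latency
    return f"content of {url}".encode()

def get(url: str) -> bytes:
    if url in local_cache:                # cache hit: no round trip at all
        return local_cache[url]
    data = fetch_from_origin(url)         # cache miss: pay the latency once
    local_cache[url] = data
    return data

get("https://example.com/popular-video")  # slow the first time
get("https://example.com/popular-video")  # near-instant on the second request
```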

Identity

Trying to define the true nature of a person is almost a metaphysical question.  Am I an individual, spouse, family member or business?  Am I paying a bill on my own behalf or doing the banking for my business partners?  Any future architecture will build on today’s approaches and understanding of user identity.

Regardless of whether the answer will be biometrics, social media or two-factor authentication it is likely that future architectures will make it easier to manage identity.  The one thing that we know is that people hate username/password management and a distributed approach with ongoing verification of identity is more likely to gain acceptance (see Login with social media).

Governments want to own this space but there is no evidence that there is a natural right for any one trusted organisation.  Perhaps Estonia’s model of a digital economy and e-residency might provide a clue.  Alternatively, identity could become as highly regulated as national passports with laws preventing individuals from holding more than one credential without explicit permission.

Internet of Things

The Internet of Things brings a new set of challenges with an explosion of the number of connected devices around the world.  For anyone who is passionate about technology, it has been frustrating that useful interfaces have been so slow to emerge.  We were capable of connecting baby monitors, light switches and home security for years before they were readily available and even today they are still clumsy.

Arguably, the greatest factor in the lag between the technological possibility and market availability has been the challenge of interoperability and the assumption that we need standards.  There is a growing belief that market forces are more effective than standards (see The evolution of information standards).  Regardless, the evolution of new architectures is essential to enabling the Internet of Things marathon.

Complexity and Robustness

Our technology has become increasingly complex.  Over many years we have built layers upon layers of legacy, hidden under facades of modern interfaces.  Not only is this making IT orders of magnitude more expensive, it is also making it far harder to create bulletproof solutions.

Applications that lack robustness are frustrating when they stop you from watching your favourite TV programme, but they could be fatal when combined with the Internet of Things and increasingly autonomous machines such as trains, planes and automobiles.

There is an increasing awareness that the solution is to evolve to simplicity.  Future architectures are going to reward simplicity through streamlined integration of services to create applications across platforms and vendors.

Bringing it all together

The next architecture won’t be a silver bullet to all of the issues I’ve raised in this post, but to be compelling it will need to provide a platform to tackle problems that have appeared intractable to-date.

Perhaps, though, the greatest accelerator of every generation of architecture is a cool name (such as “cloud”, “thin client”, “service oriented architecture” or “client/server”).  Maybe ambitious architects should start with a great label and work back to try to tackle some of the challenges that technologists face today.

Category: Enterprise2.0

by: RickDelgado
21  Jan  2015

Where to Go to Learn About Network Security

The one thought on seemingly every business’s mind is network security. This is understandable given the damaging attacks launched on companies like Home Depot, Target, and Sony in the past year and a half or so. After seeing the effects of data breaches involving large corporations, businesses are taking their security more seriously now than ever before, and many are quickly realizing it’s far from an easy task. The number of cyber threats out there seems to multiply every day, and with it the range of methods and techniques attackers can use to breach a company’s security measures and obtain sensitive data from within. In this ever-evolving environment, it’s important that business executives and IT security personnel stay up to date on the latest security information and learn all they can as they train to combat these threats. The question then becomes where you should go for the best network security information and training. Luckily, the resources available are many.

 

When looking at materials and sources that can provide the most helpful information for improving network security, many companies will take cost into account. Some resources have a cost associated with them, while others are completely free. Determining which to use will largely come down to your company’s security budget, but don’t let the price tag alone determine your opinion of a source’s quality. There are plenty of helpful security resources on the internet that are available at no cost.

 

Take blogs dedicated to the topic of network security, for instance. Blogs are an excellent way to find out what the latest news on the security front is. Of course cyber attacks hitting mega businesses like Target or JPMorgan Chase will almost always be front page news, but what about smaller breaches affecting lesser known companies? That’s where network security blogs come in. With blogs as a frequent resource, you can find out what kind of new threats are out there, how they’re impacting companies, and what’s being done to stop them. Network security blogs can also help you learn about new industry trends you’ll probably want to try for your company. Blogs can also give overviews of new security products and software. Some blogs are maintained by experts in the network security field, like Schneier on Security, and others come from organizations and institutes that dedicate themselves to improving network security for all organizations (the SANS Security Blogs are good examples of this type). Companies in the security field can have their own blogs as well, and though they can provide helpful information, keep in mind they’re also aimed at pushing their products.

 

Another free resource that shouldn’t be overlooked is YouTube. It’s one thing to read about effective new security procedures; it’s another to see them in action via an easy-to-follow video. The number of YouTube videos dedicated to network security is vast, but choosing which ones to view is similar to choosing which blogs to follow. Most prominent are those from security companies with a lot to gain from attracting more eyes. While it’s often assumed that security companies focus only on videos promoting their products, many do offer educational videos that can help you learn more about network security and what’s being done to improve it. Some YouTube channels you’ll want to keep an eye on include the free video tutorials offered by CBT Nuggets and the useful videos from McAfee, which is now part of Intel Security. One word of caution: always check the upload date of the video. A video from five years ago may not have the updated information needed to fight today’s security threats.

 

Beyond the free sources, there are also learning opportunities you can pay for. Many educational institutions offer online courses that will give you the skills and knowledge to manage your company’s network security. Gaining these certifications does require extra work, some or most of it likely to be done outside the office, but it can all be worth it in the end. If the goal is to prevent security breaches, combining courses with blogs and online videos can make that goal far more achievable. Look at all of these resources as tools you can add to your arsenal of knowledge and expertise. You and your company will be much better off because of your dedication to learning all you can.

 

Category: Information Development

by: Gil Allouche
15  Jan  2015

Keeping Big Data Secure: Should You Consider Data Masking?

Big data is a boon to every industry. And as data volumes continue their exponential rise, the need to protect sensitive information from being compromised is greater than ever before. The recent data breach of Sony Pictures, and new national threats from foreign factions serve as a cautionary tale for government and private enterprise to be constantly on guard and on the lookout for new and better solutions to keep sensitive information secure.

One security solution, “data masking”, is the subject of a November 2014 article on Nextgov.com.

In the article, Ted Girard, a vice president at Delphix Federal, defines what data masking is and describes its applications in the government sector. Since data masking also has applications outside government, organizations wondering whether this solution is something they should consider for protecting production data should find the following takeaways from the Nextgov article helpful.

The information explosion
Girard begins by stating the plain and simple truth that in this day and age of exploding volumes of information, “data is central to everything we do.” That being said, he warns that, “While the big data revolution presents immense opportunities, there are also profound implications and new challenges associated with it.” Among these challenges, according to Girard, are protecting privacy, enhancing security and improving data quality. “For many agencies just getting started with their big data efforts”, he adds, “these challenges can prove overwhelming.”

The role of data masking
Speaking specifically of governmental needs to protect sensitive health, education, and financial information, Girard explains that data masking is, “a technique used to ensure sensitive data does not enter nonproduction systems.” Furthermore, he explains that data masking is, “designed to protect the original production data from individuals or project teams that do not need real data to perform their tasks.” With data masking, so-called “dummy data”— a similar but obscured version of the real data—is substituted for tasks that do not depend on real data being present.
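
As a rough, hypothetical sketch of what masking a record might look like in practice (the field names and masking rules below are invented for illustration and are not taken from the article or from any particular product):

```python
import hashlib

def mask_record(record: dict) -> dict:
    """Return a 'dummy data' copy of a record with sensitive fields obscured."""
    masked = dict(record)
    # Deterministic pseudonym so the same person masks to the same value
    # across data sets, keeping joins intact.
    masked["name"] = "user_" + hashlib.sha256(record["name"].encode()).hexdigest()[:8]
    # Keep only the last four digits of an account-number-like field.
    masked["account"] = "****" + record["account"][-4:]
    # Coarsen the date of birth to the year to preserve rough demographics.
    masked["birth_date"] = record["birth_date"][:4] + "-01-01"
    return masked

original = {"name": "Jane Doe", "account": "12345678", "birth_date": "1980-06-15"}
print(mask_record(original))
# e.g. {'name': 'user_9c32...', 'account': '****5678', 'birth_date': '1980-01-01'}
```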

The need for “agile” data masking solutions
As Girard points out, one of the problems associated with traditional data masking is that, “every request by users for new or refreshed data sets must go through the manual masking process each time.” This, he explains, “is a cumbersome and time-consuming process that promotes ‘cutting corners’– skipping the process altogether and using old, previously masked data sets or delivering teams unmasked versions.” As a result, new agile data masking solutions have been developed to meet the new demands associated with protecting larger volumes of information.

According to Girard, the advantage of agile data masking is that it, “combines the processes of masking and provisioning, allowing organizations to quickly and securely deliver protected data sets in minutes.”

The need for security and privacy
As a result of collecting, storing and processing sensitive information of all kinds, government agencies need to keep that information protected. Still, as Girard points out, “Information security and privacy considerations are daunting challenges for federal agencies and may be hindering their efforts to pursue big data programs.” The good news with “advance agile masking technology”, according to Girard, is that it helps agencies, “raise the level of security and privacy assurance and meet regulatory compliance requirements.” Thanks to this solution, Girard says that, “sensitive data is protected at each step in the life cycle automatically.”

Preserving data quality
Big data does not necessarily mean better data. According to Girard, a major cause of many big data project failures is poor data. In dealing with big data, Girard says that IT is faced with two major challenges:

1. “Creating better, faster and more robust means of accessing and analyzing large data sets…to keep pace.”
2. “Preserving value and maintaining integrity while protecting data privacy….”

Both of these challenges are formidable, especially with large volumes of data migrating across systems. As Girard explains, “…controls need to be in place to ensure no data is lost, corrupted or duplicated in the process.” He goes on to say that, “The key to effective data masking is making the process seamless to the user so that new data sets are complete and protected while remaining in sync across systems.”

The future of agile data masking
Like many experts, Girard predicts that big data projects will become a greater priority for government agencies over time. Although not mentioned in the article, the NSA’s recent installation of a massive 1.5 billion-dollar data center in Utah serves as a clear example of the government’s growing commitment to big data initiatives. In order to successfully analyze vast amounts of data securely and in real time going forward, Girard says that agencies will need to, “create an agile data management environment to process, analyze and manage data and information.”

In light of growing security threats, organizations looking to protect sensitive production data from being compromised in less-secure environments should consider data masking as an effective security tool for both on-premise and cloud-based big data platforms.

Category: Enterprise Data Management

by: Bsomich
10  Jan  2015

MIKE2.0 Community Update

Missed what’s been happening in the MIKE2.0 data management community? Read on!

 


What is an Open Methodology Framework? 

An Open Methodology Framework is a collaborative environment for building methods to solve complex issues impacting business, technology, and society.  The best methodologies provide repeatable approaches on how to do things well based on established techniques. MIKE2.0’s Open Methodology Framework goes beyond the standards, techniques and best practices common to most methodologies with three objectives:

  • To Encourage Collaborative User Engagement
  • To Provide a Framework for Innovation
  • To Balance Release Stability with Continuous Improvement

We believe that this approach provides a successful framework for accomplishing things in a better and more collaborative fashion. What’s more, this approach allows for concurrent focus on both method and detailed technology artifacts. The emphasis is on emerging areas in which current methods and technologies lack maturity.

The Open Methodology Framework will be extended over time to include other projects. Another example of an open methodology is open-sustainability, which applies many of these concepts to the area of sustainable development. Suggestions for other Open Methodology projects can be initiated on this article’s talk page.

We hope you find this of benefit and welcome any suggestions you may have to improve it.

Sincerely,

MIKE2.0 Community

Popular Content

Did you know that the following wiki articles are most popular on Google? Check them out, and feel free to edit or expand them!

What is MIKE2.0?
Deliverable Templates
The 5 Phases of MIKE2.0
Overall Task List
Business Assessment Blueprint
SAFE Architecture
Information Governance Solution

Contribute to MIKE:

Start a new article, help with articles under construction or look for other ways to contribute

Update your personal profile to advertise yourself to the community and interact with other members.

Useful Links:
Home Page
Login
Content Model
FAQs
MIKE2.0 Governance


 

This Week’s Blogs for Thought:

Security Concerns for the IoT in Healthcare

It’s easy to get caught up in all the hype surrounding the latest buzzwords in technology. The Internet of Things (IoT) is one such term many insiders are talking about excitedly, and it’s already making a big difference in certain industries. The idea of having objects that collect data, connect to the internet, and communicate with each other is an intriguing one that’s poised to explode. Gartner is even predicting that the number of devices connected to the IoT will grow to 26 billion by the year 2020. Industries all over the world are investigating how the Internet of Things can benefit them, and the healthcare industry is no exception.

Read more.

The Internet of Misfit Things

One of the hallmarks of the holiday season is Christmas television specials. This season I once again watched one of my favorite specials, Rudolph the Red-Nosed Reindeer. One of my favorite scenes is the Island of Misfit Toys, which is a sanctuary for defective and unwanted toys.

This year the Internet of Things became fit for things for tracking fitness. While fitness trackers were among the hottest tech gifts this season, it remains to be seen whether these gadgets are little more than toys at this point in their evolution.

Read more.

Our Machines Will Not Outsmart Us

Over the millennia we have been warned that the end of the world is nigh.  While it will no doubt be true one day, warnings by Stephen Hawking in a piece he co-authored on artificial intelligence don’t fill me with fear (See Transcending Complacency on Superintelligent Machines).  I disagree with the commentators across the board who are warning that the machine will outsmart us by the 2030s and that it could become a Terminator-style race between us and them.

Hawking and his co-authors argue that “[s]uccess in creating AI would be the biggest event in human history.  Unfortunately, it might also be our last unless we learn how to avoid the risks.”

Read more.

 

Forward to a Friend!

Category: Information Development

by: RickDelgado
08  Jan  2015

Security Concerns for the Internet of Things in Healthcare

It’s easy to get caught up in all the hype surrounding the latest buzzwords in technology. The Internet of Things (IoT) is one such term many insiders are talking about excitedly, and it’s already making a big difference in certain industries. The idea of having objects that collect data, connect to the internet, and communicate with each other is an intriguing one that’s poised to explode. Gartner is even predicting that the number of devices connected to the IoT will grow to 26 billion by the year 2020. Industries all over the world are investigating how the Internet of Things can benefit them, and the healthcare industry is no exception. But as much as the IoT can lead to some notable advantages in healthcare, a number of concerns have arisen that could derail any momentum. These concerns need to be addressed if those in healthcare want to get the most out of the IoT.

 

While some of the ways the IoT can help in healthcare are still theoretical, a good number of devices and practices are already being used to positive effect. Health monitors have become much more versatile and common; remote heart rate monitors used at hospitals are one example. What sets these monitors apart is that, via their connection through the IoT, they can communicate with other medical devices and with medical personnel, sending out alerts in the event that vital signs fall into dangerous territory. Another new technology associated with the Internet of Things is the smart bed. These beds detect when a patient is resting on them and can notify the hospital when the patient tries to get up. Smart beds can also take data and modify the pressure being put on the patient to allow for maximum comfort and recovery.
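
As a minimal sketch of the alerting idea, assuming an invented safe range and a notify_staff stand-in (neither is taken from any real device or clinical standard):

```python
# Toy vital-signs monitor: flag readings that leave a safe range.
SAFE_HEART_RATE_BPM = (50, 120)   # illustrative bounds, not clinical guidance

def notify_staff(message: str) -> None:
    """Hypothetical stand-in for paging a nurse or another connected device."""
    print("ALERT:", message)

def check_heart_rate(patient_id: str, bpm: int) -> None:
    low, high = SAFE_HEART_RATE_BPM
    if not low <= bpm <= high:
        notify_staff(f"patient {patient_id}: heart rate {bpm} bpm outside {low}-{high}")

for reading in (72, 45, 130):      # one normal reading, one low, one high
    check_heart_rate("room-12", reading)
```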

 

Much of the aim of IoT-connected devices lies in decreasing the amount of direct face-to-face interaction patients need with their doctors. IoT devices also help to keep doctors and nurses informed by collecting and supplying large amounts of data, helping them track health information at a level never before realized. The Internet of Things can also drastically reduce the amount of healthcare device downtime by alerting manufacturers, technicians, and managers when devices are about to break down. Parts can be shipped and repair technicians can be summoned almost immediately, and it’s thanks to the sensors being embedded within the devices themselves.

 

While all of these breakthroughs point to added convenience and improved patient health, there is still a significant danger many experts are warning about. In light of the hacks that have hit major companies over the last few years, security issues remain a top concern and have made hospitals reluctant to truly dive into the IoT’s potential. Perhaps the main overriding concern is securing health data and patient privacy. Using devices that are constantly connected to the internet to store, process, and transfer health data creates a very real danger of hackers gaining possession of sensitive information. In some cases, health data is more valuable on the black market than even bank account information, and the knowledge that networked devices can be vulnerable to hacking likely has many in the healthcare industry approaching the issue cautiously. Even more serious is the potential for hackers to actually infiltrate devices and use them to attack the patient. This has already been demonstrated in certain environments, and while no actual attacks have been recorded yet, the danger is real and needs to be addressed.

The Internet of Things isn’t going away anytime soon, so the healthcare industry will need to focus on improving IT security as it adopts connected health devices. This may require constant updates and patching to the devices, ensuring that any security holes are plugged up and minimizing the possibility of cyber attacks. Even the federal government is getting involved in the form of the FDA issuing guidelines and standards that all networked medical devices need to meet before being used. Some hospitals are even going as far as internally hosting data to maintain more control, although that does somewhat limit IoT devices. Whatever method is used, the Internet of Things could revolutionize the healthcare industry. Caution is needed, however, to ensure patients’ security is protected every step of the way.

Category: Information Development

by: Ocdqblog
29  Dec  2014

The Internet of Misfit Things

One of the hallmarks of the holiday season is Christmas television specials. This season I once again watched one of my favorite specials, Rudolph the Red-Nosed Reindeer. One of my favorite scenes is the Island of Misfit Toys, which is a sanctuary for defective and unwanted toys.

This year the Internet of Things became fit for things for tracking fitness. While fitness trackers were among the hottest tech gifts this season, it remains to be seen whether these gadgets are little more than toys at this point in their evolution. Right now, as J.C. Herz blogged, they are “hipster pet rocks, devices that gather reams of largely superficial information for young people whose health isn’t in question, or at risk.”

Some describe big data as reams of largely superficial information. Although fitness and activity trackers count things such as steps taken, calories burned, and hours slept, as William Bruce Cameron remarked, in a quote often misattributed to Albert Einstein, “Not everything that can be counted counts, and not everything that counts can be counted.”

One interesting count is the use of these trackers. “More than half of US consumers,” Herz reported, “who have owned an activity tracker no longer use it. A third of them took less than six months from unboxing the device to shoving it in a drawer or fobbing it off on a relative.” By contrast, “people with chronic diseases don’t suddenly decide that they’re over it and the novelty has worn off. Tracking and measuring—the quantified self—is what keeps them out of the hospital.”

Unfortunately, even though these are the people who could most benefit from fitness and activity trackers, manufacturers, Herz lamented, “seem more interested in helping the affluent and tech-savvy sculpt their abs and run 5Ks than navigating the labyrinthine world of the FDA, HIPAA, and the other alphabet soup bureaucracies.”

In the original broadcast of Rudolph the Red-Nosed Reindeer in 1964, Rudolph failed to keep his promise to return to the Island of Misfit Toys and rescue them. After the special aired, television stations were inundated with angry letters from children demanding that the Misfit Toys be helped. Beginning the next year, the broadcast concluded with a short scene showing Rudolph leading Santa back to the island to pick up the Misfit Toys and deliver them to new homes.

For fitness and activity trackers to avoid the Internet of Misfit Things, they need to make—and keep—a promise to evolve into wearable medical devices. Not only would this help the healthcare system allay the multi-trillion dollar annual cost of chronic disease, but it would also allow manufacturers to compete in the multi-billion dollar medical devices market. This could bring better health and bigger profits home for the holidays next year.

 

Category: Information Development

by: Robert.hillard
20  Dec  2014

Our machines will not outsmart us

Over the millennia we have been warned that the end of the world is nigh.  While it will no doubt be true one day, warnings by Stephen Hawking in a piece he co-authored on artificial intelligence don’t fill me with fear (See Transcending Complacency on Superintelligent Machines).  I disagree with the commentators across the board who are warning that the machine will outsmart us by the 2030s and that it could become a Terminator-style race between us and them.

Hawking and his co-authors argue that “[s]uccess in creating AI would be the biggest event in human history.  Unfortunately, it might also be our last unless we learn how to avoid the risks.”  They go on to compare artificial intelligence (AI) to an alien life form of superior intelligence who would owe us no comfort or future on this planet.

These comments relate to the so-called “singularity”, a term popularised by writer Vernor Vinge for the point, sometime in the vicinity of the 2030s, when AI can out-think humans.

I have previously written about the limits of current AI research (see Your insight might protect your job).  Although current techniques (which I argue are the second generation of AI) cannot scale to provide the cognitive leaps that are necessary for real insight, it would be wrong to assume that a third generation isn’t on the horizon.

Despite the potential for a third (yet to be imagined) generation of technology for AI, there are three reasons why I disagree that such machines will take over the world or even outsmart us.

1. It’s all about the user interface

Simply applying Big Data analytics to the content of the internet will not create a machine that is smarter than us.  If a machine is to be of our world, it needs to be able to interact with it through a user interface.

The human brain only works well in conjunction with opposable thumbs, something that few other intelligent animals can compete with us on.  Regardless of how intelligent a dolphin is, the lack of a good interface means that it can’t manipulate the world around it and in turn learn from these interactions.

Like previous generations of computing, it is all about the user interface.  Robotics is likely to overcome these constraints but current predictions of the internet coming alive due to its complexity are fanciful.  Far from running out of our control, we are re-architecting our technology to remove the risk of runaway complexity by segmenting the systems that touch our physical world.  This segmentation is like cutting off the closest thing that the internet has to opposable thumbs.

2. We will become the machine

Information, knowledge and intelligence directly equate to power.  Humans never give up power easily and choose political alliances with adversaries rather than cede control.

Any competition between humans and machines is likely to follow the same lines.  Rather than cede to the machine, we will join with them.  I’ve previously written about what might become the first direct neural interfaces (see Will the bionic eye solve information overload?).  It is inconceivable that we won’t choose to augment our own brains with the internet in the coming decades.

Such a future virtually guarantees supremacy of our species against any machine competition, but it does paint a future which is perhaps uncomfortable from our vantage point today.

3. We aren’t their competitors

Despite what you might read, we live the majority of our lives in the physical world.  We eat food, enjoy socialising in person and interact with our hobbies in three dimensions.

Our computers live almost entirely in the memory of the machines that we have made.  They are creatures of the internet.  While we visualise the internet through our browsers, apps and other tools, we are visitors to this space.  We twist its contents to represent metaphors of the physical world (for example, paper for writing on and rooms for meeting in).

Some scientists argue that the virtual world is an entirely valid reality.  Nick Bostrom has even gone so far as to wonder whether our own universe is the virtual world of some supercomputer experiment being run by an alien life form (see Are you living in a computer simulation?).  If that is the case, we need to be very afraid of the alien “off” switch!

Regardless of the simulation argument, any virtual reality of the internet where AI may take shape is not our reality.  It is as if we are of different universes, but like the multiverse that is regaining popularity in theoretical physics, we do have an increasingly symbiotic relationship.

Category: Enterprise2.0

by: Bsomich
20  Dec  2014

MIKE2.0 Community Update

Missed what’s been happening in the MIKE2.0 data management community? Read on!

 


Available for Order: Information Development Using MIKE2.0

Have you heard? Our new book, “Information Development Using MIKE2.0”, is available for order.

The vision for Information Development and the MIKE2.0 Methodology have been available in a collaborative, online fashion since 2006, and are now made available in print publication to a wider audience, highlighting key wiki articles, blog posts, case studies and user applications of the methodology. 

Authors for the book include Andreas Rindler, Sean McClowry, Robert Hillard, and Sven Mueller, with additional credit due to Deloitte, BearingPoint and over 7,000 members and key contributors of the MIKE2.0 community. The book has been published in paperback as well as on all major e-book publishing platforms.

Get Involved:

To get your copy of the book, visit our order page on Amazon.com. For more information on MIKE2.0 or how to get involved with our online community, please visit www.openmethodology.org.

Sincerely,

MIKE2.0 Community  

Contribute to MIKE:

Start a new article, help with articles under construction or look for other ways to contribute

Update your personal profile to advertise yourself to the community and interact with other members.

Useful Links:
Home Page
Login
Content Model
FAQs
MIKE2.0 Governance


 

Did You Know?
All content on MIKE2.0 and any contributions you make are published under the Creative Commons license. This allows you free re-use of our content as long as you add a brief reference back to us.

 

This Week’s Blogs for Thought:

Keep Your Holidays Merry and Bright: Don’t Delay in Improving Security

With the holidays upon us, most businesses are dealing with what is usually their busiest time of the year. It’s a period of excitement and increased sales, but it’s also a time of worry and concern. In the wake of the recent data breaches at large retailers like Target and Home Depot, many businesses are approaching the holidays with a more cautious attitude, particularly toward security. Hackers have the potential to steal data and cause millions of dollars in damages, essentially crippling any business no matter its size. What’s even more alarming is that many companies haven’t responded effectively to the threat of security breaches.

Read more.

Pros and Cons of Hadoop and Cloud Providers

Selecting a big data solution can be tricky at times with the different options available for enterprises. Deciding between a cloud big data analytics provider and an on-premise Hadoop solution comes down to recognizing the pros and cons of both options and how they will affect the bottom line.

Read more.

Open Data Gray Areas 

In a previous post, I discussed some data quality and data governance issues associated with open data. In his recent blog post How far can we trust open data?, Owen Boswarva raised several good points about open data.

“The trustworthiness of open data,” Boswarva explained, “depends on the particulars of the individual dataset and publisher. Some open data is robust, and some is rubbish. That doesn’t mean there’s anything wrong with open data as a concept. The same broad statement can be made about data that is available only on commercial terms. But there is a risk attached to open data that does not usually attach to commercial data.”

Read more.

Forward to a Friend!

Know someone who might be interested in joining the MIKE2.0 Community? Forward this message to a friend

Questions?

If you have any questions, please email us at mike2@openmethodology.org

 

Category: Information Development
