Open Framework, Information Management Strategy & Collaborative Governance | Data & Social Methodology - MIKE2.0 Methodology
by: Bsomich
18  Oct  2014

MIKE2.0 Community Update

 

Missed what’s been happening in the MIKE2.0 information management community? Check out our bi-weekly update:

 

Getting Started with the Five Phases of MIKE2.0 

The MIKE2.0 Methodology has abandoned the traditional linear or waterfall approach to systems development in favor of an iterative, agile approach called continuous implementation. This approach divides the development and rollout of an entire system into a series of implementation cycles. These cycles identify and prioritize the portions of the system that can be constructed and rolled out before the entire system is complete. Each cycle also includes:

  • A feedback step to evaluate and prioritize the implementation results
  • Strategy changes
  • Improvement requests for future implementation cycles.

Following this approach, there are five phases to the MIKE2.0 Methodology.

Feel free to check them out when you have a moment to learn how they can help improve your enterprise information management program. 

Sincerely,

MIKE2.0 Community

Contribute to MIKE:

Start a new article, help with articles under construction or look for other ways to contribute.

Update your personal profile to advertise yourself to the community and interact with other members.

Useful Links:
Home Page
Login
Content Model
FAQs
MIKE2.0 Governance


Did You Know?
All content on MIKE2.0 and any contributions you make are published under the Creative Commons license. This allows you free re-use of our content as long as you add a brief reference back to us.

This Week’s Food for Thought:

5 of the Most Common IT Security Mistakes to Watch Out For

Securing the enterprise is no easy task. Every day it seems like there are dozens of new security risks out there, threatening to shut down your company’s systems and steal valuable data. Stories of large corporations suffering from enormous data breaches probably don’t help calm those fears, so it’s important to know the risks are real and businesses must be able to respond to them. Even though enhancing security is crucial, enterprises still make a lot of mistakes while trying to shore up their systems. Here’s a look at some of the most common IT security mistakes so you’ll be better aware of what to watch out for. Read more.

Data Integration is the Schema in Between

The third of the five biggest data myths debunked by Gartner is that big data technology will eliminate the need for data integration. The truth is that big data technology excels at data acquisition, not data integration. This myth is rooted in what Gartner referred to as the schema on read approach used by big data technology to quickly acquire a variety of data from sources with multiple data formats. This is best exemplified by the Hadoop Distributed File System (HDFS). Unlike the predefined, and therefore predictably structured, data formats required by relational databases, HDFS is schema-less.
Read more.

NoSQL vs SQL: An Overview

With the increase of big data in industries across the world through Hadoop and Hadoop Hive, numerous changes in how big data is stored and analyzed have occurred. It used to be that Structured Query Language (SQL) was the main method companies used to handle data stored in relational database management systems (RDBMS). This technology was first introduced in the 1970s and was extremely productive for its time. However, since 1970, the amount and types of information available have risen and changed dramatically.

Read more.

 


Category: Information Development

by: RickDelgado
17  Oct  2014

How Virtualization Can Improve Security

Virtualization can do a lot for a company. It can increase a business’s efficiency, doing more work with less equipment. Virtualization can also save on costs, particularly when it comes to cooling down servers and getting things back up and running after a technical disaster. That’s just scratching the surface of all the benefits virtualization technology has to offer, so it may come as a surprise that some business leaders are still hesitant to make virtualization a part of their companies. The main concern they have usually has to do with security. Moving sensitive data and programs to virtual machines can sound like a risky strategy, no matter what benefits can be provided. When utilized properly, however, virtualization may actually end up improving security, alleviating doubts about using the technology.

There are, of course, many ways to implement virtualization in an organization. Some of those ways include server virtualization, network virtualization, storage virtualization, and desktop virtualization. Many companies choose one or several of these methods to bring their businesses up to date with the latest technology, but each type presents its own challenges when confronting security risks. That’s why there are security solutions for each virtualization strategy. It’s important to note that while virtualization can improve security, it does not have the capability to stop all attacks. Threats that appear on physical machines can still pop up from time to time on virtual machines. With that said, here are just a few ways each virtualization type can minimize risks and improve security.

 

Server Virtualization

For server virtualization, it becomes even more necessary to provide adequate security. According to one report, more than 90% of records stolen by attackers come from servers, and it’s a number that’s only expected to rise over the coming years. Servers that are virtualized have a number of advantages to work with when it comes to security. For one thing, virtualized servers are able to identify and isolate applications that are compromised or unstable. This means that applications that may have been infected with malware are more likely to be identified and separated from the other applications, preventing malicious code from spreading. In addition, virtualized servers can make it easier to create cost-effective intrusion detection, protecting not just the server and the virtual machines themselves but the entire network. Server virtualization also allows easier monitoring by administrators: by deploying monitoring agents in one virtual location, administrators can more easily view traffic and deny access to suspicious users. Finally, server virtualization allows a master image of the server to be created, making it easy to determine if the server is acting abnormally or against set parameters.
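To illustrate the master-image point, here is a minimal sketch, independent of any hypervisor API, of comparing files on a running server against a manifest captured from its master image; the paths and digests below are invented for illustration.

```python
import hashlib
from pathlib import Path

# Hypothetical baseline captured from the master image: path -> SHA-256 digest.
MASTER_MANIFEST = {
    "/etc/myapp/app.conf": "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    "/usr/local/bin/myapp": "60303ae22b998861bce3b28f33eec1be758a213c86c93c076dbe9f558c11c752",
}

def file_digest(path):
    """SHA-256 of a file's contents, or None if the file is missing."""
    p = Path(path)
    if not p.is_file():
        return None
    return hashlib.sha256(p.read_bytes()).hexdigest()

def drift_report(manifest):
    """List files that are missing or differ from the master image baseline."""
    findings = []
    for path, expected in manifest.items():
        actual = file_digest(path)
        if actual != expected:
            findings.append((path, "missing" if actual is None else "modified"))
    return findings

if __name__ == "__main__":
    for path, status in drift_report(MASTER_MANIFEST):
        print(f"ALERT: {path} is {status} compared to the master image")
```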

Network Virtualization

Many of the security advantages that come from network virtualization are similar in nature to those found in server virtualization. One example of this is isolation. With network virtualization, virtual networks are separated from one another, which greatly minimizes the impact malware could have when infecting the system. The same philosophy applies when looking at another main feature of network virtualization, segmentation, where a virtual network is composed of multiple tiers. The entire network, and in turn each tier, can be protected through the distribution of firewalls. This makes for more effective security measures while employing consistent security models across all networks and software.

 

Desktop Virtualization

Though perhaps not as common as other forms of virtualization, desktop virtualization is still more than capable of making businesses more productive while still addressing security issues. IT departments are able to better secure virtualized desktops by controlling what users are able to do from a central location. Desktop virtualization also provides for customizing security settings and making changes to meet any new demands. In this way, not only are desktop computers more secure, but the IT department’s job also becomes a lot easier.

 

Whether going the desktop, network, or server virtualization route, IT security will always be high on the list of priorities. While at first seen as a potential security liability, virtualization can now be seen as a security enhancement. In the capable hands of the right experts, businesses should be able to prepare virtualized systems that can meet any security threat with a rapid and decisive response, thereby keeping valuable company data safe.

 

Category: Information Development

by: Gil Allouche
17  Oct  2014

NoSQL vs. SQL: An Overview

With the increase of big data in industries across the world through Hadoop and Hadoop Hive, numerous changes in how big data is stored and analyzed have occurred. It used to be that Structured Query Language (SQL) was the main method companies used to handle data stored in relational database management systems (RDBMS). This technology was first introduced in the 1970s and was extremely productive for its time. Over its more than four decades, SQL has proven very efficient in managing structured, predictable data. Using columns and rows with preselected schemas, an SQL database can gather and process the data to make it usable and understandable to the end party. It has proved very effective.

However, since 1970, the amount and types of information available have risen and changed dramatically. The prevalence of big data has drastically increased the amount of information available to companies, and it has changed what types of information are available. Much of the data available today is unstructured and unpredictable, which is very difficult for traditional SQL databases to handle. These changes have put increasing pressure on organizations to find a system capable of both gathering and analyzing huge amounts of unstructured and unpredictable data.

Not only is it difficult for SQL to process unstructured and unpredictable information, it is also more costly, and very large batches of data are harder still to handle. SQL isn’t very flexible or scalable. NoSQL was developed to solve these difficulties and do what SQL couldn’t do. NoSQL is short for “Not Only Structured Query Language,” and in the age of big data it is making data gathering and processing much easier for companies.

There are numerous differences between the two. I’ll mention a few of the advantages NoSQL has over SQL here.

 Speed

NoSQL doesn’t require schemas the way SQL does, meaning it can process information much more quickly. With SQL, schemas (in effect, the categories of data) have to be predetermined before information is entered. That makes dealing with unstructured information extremely difficult, because companies never know just what categories of information they will be dealing with. NoSQL doesn’t require schemas, so it can handle unstructured information more easily and much more quickly. NoSQL can also handle and process data in real time, something SQL doesn’t do.
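To make the contrast concrete, here is a minimal sketch, not tied to any particular NoSQL product, of the difference between writing into a predefined SQL schema and storing schema-less documents; the table, field names and sample records are invented for illustration.

```python
import sqlite3
import json

# SQL side: the schema must be declared before any rows can be inserted.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
conn.execute("INSERT INTO customers (name, city) VALUES (?, ?)", ("Acme Corp", "Boston"))
# A record carrying an unexpected field (say, loyalty_tier) would need an ALTER TABLE first.

# Document side: each record is just a JSON document; fields can vary from record to record.
documents = [
    {"name": "Acme Corp", "city": "Boston"},
    {"name": "Globex", "loyalty_tier": "gold", "tags": ["wholesale", "priority"]},
]
store = [json.dumps(doc) for doc in documents]  # nothing about the shape is declared up front

# Structure is applied when the data is read, not when it is written.
for raw in store:
    doc = json.loads(raw)
    print(doc.get("name"), doc.get("loyalty_tier", "n/a"))
```

The same idea is what lets document stores keep ingesting unpredictable records without pausing to restructure anything first.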

 Scalability

Another advantage of NoSQL is the scalability it provides. Unlike SQL, which tends to be very costly to scale and isn’t nearly as flexible, NoSQL makes scaling a breeze. Not only is it cheaper and easier, it also promotes increased data gathering. With SQL, companies had to be very selective in the information they gathered and how much of it they gathered. That placed restrictions on growth and revenue possibilities. Because of NoSQL’s flexibility and scalability, it promotes data growth. That’s good for businesses and it’s good for the consumer.

 Cloud Computing

NoSQL is also extremely valuable and important for cloud computing. One of the main reasons we’ve seen such a rise in big data’s prominence in the mainstream is cloud computing. Cloud computing has drastically reduced the startup costs of big data by eliminating the need for costly infrastructure, which has increased its availability to both big and small businesses. Cloud computing has also made the entire big data process, from the gathering stages to analyzing and implementing, easier for companies. Much of the process is now taken care of and monitored by the service providers. The increased availability of big data means that companies can better serve the general public.

So while SQL still has a future and won’t be going away anytime soon, NoSQL is really the key to future success with big data and cloud computing. Its flexibility, scalability and low cost make it a very attractive option. Additionally, its ability to gather and analyze unstructured and unpredictable data quickly and efficiently makes it a great option for companies with those needs.

Category: Enterprise Data Management

by: Ocdqblog
14  Oct  2014

Data Integration is the Schema in Between

The third of the five biggest data myths debunked by Gartner is that big data technology will eliminate the need for data integration. The truth is that big data technology excels at data acquisition, not data integration.

This myth is rooted in what Gartner referred to as the schema on read approach used by big data technology to quickly acquire a variety of data from sources with multiple data formats.

This is best exemplified by the Hadoop Distributed File System (HDFS). Unlike the predefined, and therefore predictably structured, data formats required by relational databases, HDFS is schema-less. It just stores data files, and those data files can be in just about any format. Gartner explained that “many people believe this flexibility will enable end users to determine how to interpret any data asset on demand. It will also, they believe, provide data access tailored to individual users.”
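As a rough illustration of schema on read (using ordinary in-memory strings rather than the Hadoop client APIs, and with invented file names and fields): the data is stored exactly as it was acquired, and a structure is imposed only at the moment it is read.

```python
import csv
import io
import json

# Acquired files are kept exactly as they arrived; no schema is enforced on write.
raw_files = {
    "orders_2014.json": '{"order_id": 1, "amount": 99.5}\n'
                        '{"order_id": 2, "amount": 12.0, "coupon": "FALL14"}',
    "orders_legacy.csv": "order_id,amount\n3,47.25\n4,8.10",
}

def read_orders(name, payload):
    """Decide how to interpret the data only at read time, based on what it turns out to be."""
    if name.endswith(".json"):
        return [json.loads(line) for line in payload.splitlines()]
    if name.endswith(".csv"):
        return [dict(row) for row in csv.DictReader(io.StringIO(payload))]
    raise ValueError(f"No reader defined for {name}")

for name, payload in raw_files.items():
    for record in read_orders(name, payload):
        print(name, record)
```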

While it was a great innovation to make data acquisition schema-less, more work has to be done to develop information because, as Gartner explained, “most information users rely significantly on schema on write scenarios in which data is described, content is prescribed, and there is agreement about the integrity of data and how it relates to the scenarios.”

It has always been true that whenever you acquire data in various formats, it has to be transformed into a common format before it can be further processed and put to use. After schema on read and before schema on write is the schema in between.
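A minimal sketch of that in-between step, with hypothetical field names: records acquired in two different shapes are mapped onto one agreed target format before anything downstream consumes them.

```python
from decimal import Decimal

# Two sources describe the same fact with different shapes and conventions.
source_a = {"cust_name": "Acme Corp", "total": "99.50", "currency": "USD"}
source_b = {"customer": {"name": "Acme Corporation"}, "amount_cents": 9950}

def to_common(record):
    """Map either source shape onto the agreed, schema-on-write target format."""
    if "cust_name" in record:
        return {"customer_name": record["cust_name"], "amount": Decimal(record["total"])}
    if "customer" in record:
        return {"customer_name": record["customer"]["name"],
                "amount": Decimal(record["amount_cents"]) / 100}
    raise ValueError("Unrecognised source format")

integrated = [to_common(r) for r in (source_a, source_b)]
print(integrated)
# Integration alone doesn't settle data quality or governance questions -- for example,
# whether "Acme Corp" and "Acme Corporation" are the same customer.
```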

Data integration is the schema in between. It always has been. Big data technology has not changed this because, as I have previously blogged, data stored in HDFS is not automatically integrated. And it’s not just Hadoop. Data integration is not a natural by-product of any big data technology, which is one of the reasons why technology is only one aspect of a big data solution.

Just as it has always been, in between data acquisition and data usage there’s a lot that has to happen. Not just data integration, but data quality and data governance too. Big data technology doesn’t magically make any of these things happen. In fact, big data just makes us even more painfully aware there’s no magic behind data management’s curtain, just a lot of hard work.

 

Category: Information Development

by: RickDelgado
08  Oct  2014

5 of the Most Common IT Security Mistakes to Watch Out For

Securing the enterprise is no easy task. Every day it seems like there are dozens of new security risks out there, threatening to shut down your company’s systems and steal valuable data. Stories of large corporations suffering from enormous data breaches probably don’t help calm those fears, so it’s important to know the risks are real and businesses must be able to respond to them. Even though enhancing security is crucial, enterprises still make a lot of mistakes while trying to shore up their systems. Here’s a look at some of the most common IT security mistakes so you’ll be better aware of what to watch out for.

 

Overlooking IT Security

It may sound surprising, but many companies don’t treat IT security as one of their top priorities. In the pursuit of making money, businesses see security as a costly endeavor, one which requires numerous resources, significant investment, and a substantial time commitment. When security is done right, business goes on as usual, which is why some company leaders don’t consider it high on the to-do list. For obvious reasons, this can be a disastrous approach to take. Too many companies become reactive to threats, dealing with them only after they have already occurred. Businesses that take IT security threats seriously need to be much more proactive, learning about the latest risks and taking the necessary steps to prevent them from infecting their systems.

 

Password Weaknesses

One of the first lines of defense preventing data leaks and theft is the password. Passwords make sure only authorized persons are able to access networks and systems. To make this effective, passwords need to be strong, but too often this is simply not the case. Many companies actually use default passwords for their network appliances, making for some attractive targets for prospective attackers. On the flip side, those that change passwords will often use weak ones that are vulnerable. Employees and managers need to make sure their passwords cannot simply be guessed by unauthorized users.
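As a simple illustration, here is a minimal sketch of the kind of audit an IT team might run to catch default or obviously weak passwords; the default-password list and the policy rules are invented for illustration, not drawn from any particular product.

```python
# Illustrative sample of vendor defaults; a real audit would use a maintained list.
KNOWN_DEFAULTS = {"admin", "password", "changeme", "cisco", "root", "12345678"}

def password_issues(password):
    """Return the reasons a candidate password would fail a basic policy check."""
    issues = []
    if password.lower() in KNOWN_DEFAULTS:
        issues.append("matches a known default password")
    if len(password) < 12:
        issues.append("shorter than 12 characters")
    if password.isalpha() or password.isdigit():
        issues.append("uses only letters or only digits")
    return issues

for candidate in ("admin", "Summer2014", "f7#Lq9!vTz2m"):
    problems = password_issues(candidate)
    print(candidate, "->", "; ".join(problems) if problems else "passes the basic checks")
```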

 

Lack of Patching

Security threats are constantly evolving. What was once a major risk several years ago is probably not a major concern today, but that only means other threats have taken its place. The best response companies can have to this evolving landscape is to always patch their IT systems, but this doesn’t happen often enough. One expert from Symantec Corp. says at least 75% of security breaches could be prevented if all the security software were patched with the latest updates. If equipped with patches, security systems will have a far better chance of detecting new threats and responding effectively.

 

Lack of Education

Employee behavior is one of the biggest concerns business leaders have. Even with updated systems and the latest software, security can only be as strong as the weakest link, and many times that weakest link ends up being end-users, or employees. Where businesses often make a mistake is in their failure to educate their employees about threats. Without the proper education about the current risks that are out there, it should come as no surprise that an employee will likely engage in activity that proves risky to company security. Some employees turn into “promiscuous clickers”, clicking on email attachments or links on suspicious and even trusted websites that can lead to malware infection. Employees need to be educated on the risky behaviors they might have so they can work to avoid them in the future. It also doesn’t hurt to place adequate endpoint security controls like anti-virus software and firewalls that can protect from risky clicking.

 

The Unprotected Cloud

Many companies are turning to the cloud to take care of many of their storage and computing needs, but that also opens up more possibilities for security problems. Businesses often don’t check on a cloud vendor’s security capabilities and end up paying for it in the end when data gets lost or stolen. The general rule is: the cheaper the cloud service, the fewer protections it will have. This is especially true for free services, which don’t offer the encryption and security measures that the more expensive services do. That’s why businesses need to make sure they’re doing everything on their end to secure their data while also evaluating cloud vendors.
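One way to “do everything on your end” is to encrypt data before it ever leaves the business. Below is a minimal sketch using the Fernet recipe from the third-party cryptography package; the payload is invented, and key management (the hard part in practice) is deliberately out of scope.

```python
# Requires the third-party package: pip install cryptography
from cryptography.fernet import Fernet

# In practice the key would live in a key-management system, never alongside the data.
key = Fernet.generate_key()
cipher = Fernet(key)

sensitive = b"customer ledger, Q3 2014"
encrypted = cipher.encrypt(sensitive)   # this ciphertext is what gets handed to the cloud vendor
restored = cipher.decrypt(encrypted)    # only a holder of the key can read it back

assert restored == sensitive
print("uploaded payload is ciphertext:", encrypted[:20], b"...")
```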

 

Security needs to be a top priority for businesses, but enhancing IT security often requires avoiding simple mistakes. Though it may require financial and technological resources, companies that make sure their systems are secure can rest easy knowing their data is protected. Some of these mistakes are easy to rectify, and with greater security comes greater confidence and more productivity.

 

Category: Information Development

by: Bsomich
04  Oct  2014

MIKE2.0 Community Update.

 

 

Have you seen our Open MIKE Series? 

The Open MIKE Podcast is a video podcast show which discusses aspects of the MIKE2.0 framework, and features content contributed to MIKE 2.0 Wiki Articles, Blog Posts, and Discussion Forums.

You can scroll through the Open MIKE Podcast episodes below:

For more information on MIKE2.0 or how to get involved with our online community, please visit www.openmethodology.org.

Sincerely,

MIKE2.0 Community  

Contribute to MIKE:

Start a new article, help with articles under construction or look for other ways to contribute.

Update your personal profile to advertise yourself to the community and interact with other members.

Useful Links:
Home Page
Login
Content Model
FAQs
MIKE2.0 Governance


 

Did You Know?
All content on MIKE2.0 and any contributions you make are published under the Creative Commons license. This allows you free re-use of our content as long as you add a brief reference back to us.

 

This Week’s Blogs for Thought:

The Rule of 150 Applied to Data

Anthropologist Robin Dunbar has used his research in primates over recent decades to argue that there is a cognitive limit to the number of social relationships that an individual can maintain and hence a natural limit to the breadth of their social group. In humans, he has proposed that this number is 150, the so-called “Dunbar’s number.”

In the modern organisation, relationships are maintained using data.  It doesn’t matter whether it is the relationship between staff and their customers, tracking vendor contracts, the allocation of products to sales teams or any other of the literally thousands of relationships that exist, they are all recorded centrally and tracked through the data that they throw off.

Read more.

What to Look for When Implementing a BYOD Policy

Businesses everywhere seem to be quickly latching onto the concept of Bring Your Own Device (BYOD) in the workplace. The idea is simple: have employees bring personal devices into work where they can use them for their jobs. For your average business, this seems to be a great way to improve productivity and job satisfaction, but could such a thing work for a hospital? Obviously hospitals are a very different kind of business, where physicians, nurses, and patients interact to get the best care possible. Having personal devices in hand can make the whole operation run much smoother. Many hospitals out there have seen BYOD as a great way to boost productivity and improve patient outcomes. In fact, one survey showed that 85% of hospitals have adopted some form of BYOD. For the few who have not yet made the transition but are looking into it, a number of tips have popped up that could prove helpful.

Read more.

An Open Source Solution for Better Performance Management 

Today, many organizations are facing increased scrutiny and a higher level of overall performance expectation from internal and external stakeholders. Both business and public sector leaders must provide greater external and internal transparency into their activities, ensure accounting data faces up to compliance challenges, and extract the return and competitive advantage out of their customer, operational and performance information. Managers, investors and regulators have a new perspective on performance and compliance.

Read more.

Forward to a Friend!

Know someone who might be interested in joining the Mike2.0 Community? Forward this message to a friend

Questions?

If you have any questions, please email us at mike2@openmethodology.org.

 

 

Category: Information Development

by: Robert.hillard
27  Sep  2014

The rule of 150 applied to data

Anthropologist Robin Dunbar has used his research in primates over recent decades to argue that there is a cognitive limit to the number of social relationships that an individual can maintain and hence a natural limit to the breadth of their social group.  In humans, he has proposed that this number is 150, the so-called “Dunbar’s number”.

In the modern organisation, relationships are maintained using data.  It doesn’t matter whether it is the relationship between staff and their customers, tracking vendor contracts, the allocation of products to sales teams or any other of the literally thousands of relationships that exist, they are all recorded centrally and tracked through the data that they throw off.

Social structures have evolved over thousands of years using data to deal with the inability of groups of more than 150 to effectively align. One of the best examples of this is the 11th century Domesday Book ordered by William the Conqueror. Fast forward to the 21st century and technology has allowed the alignment of businesses and even whole societies in ways that were unimaginable 50 years ago.

Just as a leadership team needs to have a group of people that they relate to that falls within the 150 of Dunbar’s number, they also need to rely on information which allows the management system to extend that span of control.  For the average executive, and ultimately for the average executive leadership team, this means that they can really only keep a handle on 150 “aspects” of their business, reflected in 150 “key data elements”.  These elements anchor data sets that define the organisation.

Key Data Elements

To overcome the constraints of Dunbar’s number, mid-twentieth century conglomerates relied on a hierarchy with delegated management decisions whereas most companies today have heavily centralised decision making which (mostly) delivers a substantial gain in productivity and more efficient allocation of capital.  They can only do this because of the ability to share information efficiently through the introduction of information technology across all layers of the enterprise.

This sharing, though, is dependent on the ability of an executive to remember what data is important. The same constraint that limits the human brain to roughly 150 relationships also applies to the use of that information. It is reasonable to argue that the information flows have the same constraint as social relationships.

Across hundreds of organisations observed over many years, the variety of key data elements is wide but their number is consistently in the range of one to a few hundred. Perhaps topping out at 500, the majority of well-run organisations have nearer to 150 elements dimensioning their most important data sets.

While decisions are made through metrics, it is the most important key data elements that make up the measures and allow them to be dimensioned.

Although organisations have literally hundreds of thousands of different data elements they record, only a very small number are central to the running of the enterprise.  Arguably, the centre can only keep track of about 150 and use them as a core of managing the business.

Another way of looking at this is that the leadership team (or even the CEO) can really only have 150 close relationships.  If each relationship has one assigned data set or key data element they are responsible for then the overall organisation will have 150.

Choosing the right 150

While most organisations have around 150 key data elements that anchor their most important information, few actually know what they are.  That’s a pity because the choice of 150 tells you a lot about the organisation.  If the 150 don’t encompass the breadth of the enterprise then you can gain insight into what’s really important to the management team.  If there is little to differentiate the key data elements from those that a competitor might choose then the company may lack a clear point of difference and be overly dependent on operational excellence or cost to gain an advantage.

Any information management initiative should start by identifying the 150 most important elements. If the team can’t narrow the set down below a few hundred, they should suspect they haven’t gotten to the core of what’s really important to their sponsors. They should then ask whether these key data elements span the enterprise or pick organisational favourites; whether they offer differentiation or are “me too”; and whether they are easy or hard for a competitor to emulate.

The identification of the 150 key data elements provides a powerful foundation for any information and business strategy, enabling a discussion on how the organisation is led and managed. While processes evolve quickly, the information flows persist. Understanding the 150 allows a strategist to determine whether the business is living up to its strategy or if its strategy needs to be adjusted to reflect the business’s strengths.

Category: Enterprise Data Management, Enterprise2.0, Information Development, Information Governance, Information Management, Information Strategy, Information Value, Metadata

by: RickDelgado
27  Sep  2014

What to Look for When Implementing a BYOD Policy at a Hospital

Businesses everywhere seem to be quickly latching onto the concept of Bring Your Own Device (BYOD) in the workplace. The idea is simple: have employees bring personal devices into work where they can use them for their jobs. For your average business, this seems to be a great way to improve productivity and job satisfaction, but could such a thing work for a hospital? Obviously hospitals are a very different kind of business, where physicians, nurses, and patients interact to get the best care possible. Having personal devices in hand can make the whole operation run much smoother. Many hospitals out there have seen BYOD as a great way to boost productivity and improve patient outcomes. In fact, one survey showed that 85% of hospitals have adopted some form of BYOD. For the few who have not yet made the transition but are looking into it, a number of tips have popped up that could prove helpful.

Keep It Simple (At First)

If the idea is to make physicians’ jobs easier and more convenient, then a BYOD policy is a great way to make that a reality, but it is a pretty big change for a hospital or any other healthcare facility. That’s why, when making this change, it’s important to keep things simple to start out. Doctors and staff need time to get accustomed to using their personal devices in their work environment. Often that means downloading helpful applications that can make doing their jobs easier, or getting used to any new security requirements that IT departments define. A simple start allows everyone to ease into the new policy, and any changes that are needed can be made without disrupting the entire program. More complex arrangements can be made later.

Create an Environment of Communication

When dealing with such a drastic change, people will have different ideas about what works and what doesn’t. That’s why everyone involved–from the top hospital administrators to the late night IT shift to the surgical team–will need to be open to new ideas. A doctor may have a unique perspective on how a smartphone or wearable device might be used to help patients. Or an IT worker may see potential in a new app that physicians haven’t thought about using yet. Being able to communicate these ideas can help in the forming of the new BYOD policy. If everyone collaborates together, they will end up with guidelines and suggestions that will keep most people happy, which is what BYOD is all about.

Address Security

One of the biggest concerns over BYOD in any business is that of security. That is certainly true for a hospital as well. When dealing with sensitive patient information, BYOD security must be at the top of any hospital’s priority list. The concerns over security don’t come from thin air. Studies from security companies have shown that around 40% of healthcare employees use personal devices that have no password protection. Many employees also access unsecured networks with their devices. Security can’t be overlooked if a hospital wants to successfully implement a BYOD policy. That means devices will need to be examined to make sure they’re compatible with the hospital’s security network. Some devices may even be rejected. While this limitation may hinder efforts to fully adopt BYOD, any measure taken to ensure tighter security will be worth it in the end.
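A minimal sketch of the kind of pre-admission check an IT department might script, assuming device attributes reported by an enrollment step; the attribute names and the policy thresholds are invented for illustration.

```python
# Hypothetical attributes reported by a mobile-device-management enrollment step.
devices = [
    {"owner": "dr_smith", "os_version": "8.1", "passcode_set": True, "encrypted": True},
    {"owner": "nurse_lee", "os_version": "7.0", "passcode_set": False, "encrypted": False},
]

MIN_OS_VERSION = (8, 0)  # illustrative policy threshold

def compliance_issues(device):
    """Return the policy rules a device fails, if any."""
    issues = []
    if not device["passcode_set"]:
        issues.append("no passcode or password protection")
    if not device["encrypted"]:
        issues.append("device storage is not encrypted")
    if tuple(int(x) for x in device["os_version"].split(".")) < MIN_OS_VERSION:
        issues.append("operating system is below the minimum supported version")
    return issues

for device in devices:
    problems = compliance_issues(device)
    verdict = "allowed on the network" if not problems else "rejected: " + "; ".join(problems)
    print(device["owner"], "->", verdict)
```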

Focus on Privacy

On that same note, patient privacy must still be taken into account, especially now that so many patient records are electronic. Of particular concern is making sure all records, devices, and procedures legally comply with the latest HIPAA rules. Hospitals need to work to make sure the data found in their networks and on their employees’ devices is protected. Even though workers may be using their own smartphones or tablets, the data they access on them is still the responsibility of the hospital. Should any of that data get lost or stolen, it will be the hospital that has to deal with subsequent legal proceedings. A BYOD policy that takes all of this into account and plans for the future can save hospitals a lot of headaches down the line.

Adopting a BYOD program for a hospital is no easy task. Numerous factors need to be taken into consideration to avoid potentially devastating consequences. Luckily, enough hospitals have made the leap to provide helpful lessons on how implementing these policies should be done. With enough expertise and innovation, a hospital creating a BYOD policy for the first time should be able to make that transition with minimal complications.

Category: Information Development

by: Ocdqblog
26  Sep  2014

Is Informed Consent outdated in the Information Age?

A Facebook experiment from late 2012 made news earlier this year and raised the ethical question of whether, by using free services provided via the Internet and mobile apps, we have granted informed consent to be experimented on for whatever purposes.

The On the Media TLDR audio podcast recently posted an interview with Christian Rudder, the co-founder of the free dating website OkCupid, who blogged about how OkCupid experiments on its users in a post with the intentionally provocative title We Experiment On Human Beings!

While this revelation understandably attracted a lot of attention, at least OkCupid is not trying to hide what it’s doing. Furthermore, as Rudder blogged, “guess what, everybody: if you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site. That’s how websites work.”

During the interview, Rudder made an interesting comparison between what websites like Facebook and OkCupid do and how psychologists and other social scientists have been experimenting on human beings for decades. This point resonated with me since I have read a lot of books that explore how and why humans behave and think the way we do. Just a few examples are Predictably Irrational by Dan Ariely, You Are Not So Smart by David McRaney, Nudge by Richard Thaler and Cass Sunstein, and Thinking, Fast and Slow by Daniel Kahneman.

Most of the insights discussed in these books are based on the results of countless experiments, most of which were performed on college students since they can be paid in pizza, course credit, or beer money. The majority of the time the subjects in these experiments are not fully informed about the nature of the experiment. In fact, many times they are intentionally misinformed in order to not skew the results of the experiment.

Rudder argued the same thing is done to improve websites. So why do we see hallowed halls when we envision the social scientists behind university research, but we see creepy cubicles when we envision the data scientists behind website experimentation? Perhaps we trust academic more than commercial applications of science.

During the interview, Rudder addressed the issue of trust. Users of OkCupid are trusting the service to provide them with good matches and Rudder acknowledged how experimenting on users can seem like a violation of that trust. “However,” Rudder argued, “doing experiments to make sure that what we’re recommending is the best job that we could possibly do is upholding that trust, not violating it.”

It’s easy to argue that the issue of informed consent regarding experimentation on a dating or social networking website is not the same as informed consent regarding government surveillance, such as last year’s PRISM scandal. The latter is less about experimentation and more about data privacy, where often we are our own worst enemy.

But who actually reads the terms and conditions for a website or mobile app? If you do not accept the terms and conditions, you can’t use it, so most of us accept them by default without bothering to read them. Technically, this constitutes informed consent, which is why it may simply be an outdated concept in the information age.

The information age needs enforced accountability (aka privacy through accountability), which is less about informed consent and more about holding service providers accountable for what they do with our data. This includes the data resulting from the experiments they perform on us. Transparency is an essential aspect of that accountability, allowing us to make an informed decision about what websites and mobile apps we want to use.

However, to Rudder’s point, we are fooling ourselves if we think that such transparency would allow us to avoid using the websites and mobile apps that experiment on us. They all do. They have to in order to be worth using.

 

Category: Information Governance

by: Bsomich
20  Sep  2014

Community Update.

Missed what’s been happening in the MIKE2.0 community? Check out our bi-weekly update:

 


When to Use a NoSQL Database 

If your company has complicated, large sets of data that it’s looking to analyze, and that data isn’t simple, structured or predictable, then SQL is not going to meet your needs. While SQL specializes in many things, handling large amounts of unstructured data is not one of them. There are other methods for gathering and analyzing your data that will be much more effective and efficient, and will probably cost you less too.

NoSQL, which stands for “Not Only Structured Query Language,” is what your company needs. It’s perfect for dealing with huge sets of information that aren’t always going to be structured, meaning every batch that’s gathered and analyzed could potentially be different. It’s a relatively new approach, but it’s becoming increasingly important in the big data world because of its immense capabilities.

NoSQL is known for its flexibility, scalability and relatively low cost, none of which SQL offers. SQL relies on traditional column and row processing with predefined schemas. When you’re dealing with unstructured information, it’s nearly impossible to work with schemas, because there’s no way to know what type of information you’ll be getting each time. Also, with SQL it can be costly to scale data and the actual processing can be very slow. NoSQL, on the other hand, makes the process cheaper, and it can also do real-time analysis on extremely large data sets. It gives companies the flexibility to gather as much or as little information as they need to be successful.
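A minimal sketch of the kind of schema-light, incremental aggregation described above; the event shapes are invented, and a production system would of course sit on an actual document or streaming store rather than an in-memory list.

```python
from collections import Counter

# Events arrive as loosely structured documents; no two need share the same fields.
incoming_events = [
    {"type": "page_view", "page": "/pricing"},
    {"type": "purchase", "amount": 49.0, "page": "/checkout"},
    {"type": "page_view", "page": "/pricing", "referrer": "newsletter"},
]

page_views = Counter()
revenue = 0.0

# Aggregates are updated as each document arrives, instead of waiting for a batch load
# into a fixed relational schema.
for event in incoming_events:
    if event.get("type") == "page_view":
        page_views[event.get("page", "unknown")] += 1
    elif event.get("type") == "purchase":
        revenue += event.get("amount", 0.0)

print(dict(page_views), "revenue:", revenue)
```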

We invite you to read more about this topic on our blog. And as always, we welcome any suggestions you may have to improve it.

Sincerely,

MIKE2.0 Community

New! Popular Content

Did you know that the following wiki articles are most popular on Google? Check them out, and feel free to edit or expand them!

What is MIKE2.0?
Deliverable Templates
The 5 Phases of MIKE2.0
Overall Task List
Business Assessment Blueprint
SAFE Architecture
Information Governance Solution

Contribute to MIKE:

Start a new article, help with articles under construction or look for other ways to contribute.

Update your personal profile to advertise yourself to the community and interact with other members.

Useful Links:
Home Page
Login
Content Model
FAQs
MIKE2.0 Governance


 

 

This Week’s Blogs for Thought:

What Movie Descriptions Can Teach Us About Metadata

In a previous post on this blog, I reviewed what movie ratings teach us about data quality. In this post I ponder another movie-related metaphor for information development by looking at what movie descriptions teach us about metadata.

Read more.

How CIOs Can Discuss the Contribution of IT

Just how productive are Chief Information Officers or the technology that they manage?  With technology portfolios becoming increasingly complex it is harder than ever to measure productivity.  Yet boards and investors want to know that the capital they have tied-up in the information technology of the enterprise is achieving the best possible return.
For CIOs, talking about value improves the conversation with executive colleagues.  Taking them aside to talk about the success of a project is, even for the most strategic initiatives, usually seen as a tactical discussion.  Changing the topic to increasing customer value or staff productivity through a return on technology capital is a much more strategic contribution.

Read more.

Let the Computers Calculate and the Humans Cogitate

Many organizations are wrapping their enterprise brain around the challenges of business intelligence, looking for the best ways to analyze, present, and deliver information to business users.  More organizations are choosing to do so by pushing business decisions down in order to build a bottom-up foundation.

However, one question coming up more frequently in the era of big data is what should be the division of labor between computers and humans?
Read more.

 

 

Forward to a Friend!

Know someone who might be interested in joining the Mike2.0 Community? Forward this message to a friend

Questions?

If you have any questions, please email us at mike2@openmethodology.org.

 

 

Category: Information Development