Open Framework, Information Management Strategy & Collaborative Governance | Data & Social Methodology - MIKE2.0 Methodology
by: Bsomich
25 Aug 2015

Where in MIKE2.0 Should Data Protection and Privacy Live?

With the ever-increasing demand for compliance with outside regulators and statutes, should MIKE2.0 look to expand data protection and privacy into a solution offering?

This question was posed by a community member in our forum this week with the context below:

While Information Security as a solution may cover some aspects, the wider requirements will need to involve the same information and data. In the UK, a subject access request means any data or information about an individual, in structured, semi-structured or unstructured form, can be demanded. Similarly, Freedom of Information requests need analysis and assignment. These need a cross-functional management system that deals with the same assets and resources as covered by the rest of information management.

It is a day-to-day, cross-functional management activity that needs responsibility assigned. Accountability can tie in with the governance part of MIKE 2.0, but does it need to be added to the solutions? And if so, where?

We’d love your feedback on this potential offering in the comments section below.

Category: Information Development
No Comments »

by: Jonathan
25 Aug 2015

How Big Data is Transforming Airports

South of Iran, east of Saudi Arabia, and north of Oman is Dubai, an emirate (political territory) of the U.A.E. (the United Arab Emirates). In addition to being the location of Burj Khalifa (“Khalifa Tower” in English), the current tallest building in the world, Dubai also hosts an international airport unlike any other. The Dubai International Airport (DXB) holds the record for the world’s busiest airport. This “mega-airport” expects to serve a staggering 120 million customers this year. Compare that to the measly 94 million passengers that the Hartsfield-Jackson Atlanta International Airport in Georgia handled in 2013, and the 72 million that passed through London-Heathrow Airport in the United Kingdom in 2014.

Any traveler who has missed a connecting flight because the gates were too far apart, or ended up standing in the wrong line because either the writing on the boarding pass or the announcements over the intercom were in a different language has to wonder how an airport of any size could handle 120 million people — successfully.

When asked about the likelihood of issues such as people getting lost and luggage being left behind, Dubai Airports CEO Paul Griffiths explained at the ATIS (Air Transport Industry Summit) that the efficient and intelligent analysis of real-time big data will keep the airport running as securely as a Boeing 717. “We keep increasing the size of the pipe but actually what our passengers want is to spend less time going through (the) process,” Griffiths said in an interview with Gulf Business. “This is where technology comes to the fore with more efficient operations. It’s about the quality of the personalized customer experience where people don’t have to walk more than 500 meters. That is the design goal and technology is central to that.”

The big data that Griffiths referred to is a massive collection of information about distances between airport gates, baggage handling efficiency, and flight durations, among other statistics. All of this information, interpreted by “intelligent systems,” will transform DXB and airports like it in three ways:

 1.    Increased Efficiency. The Dubai International Airport is an exception to the rule that larger airports are more difficult to traverse. Real-time calculations will allow the air traffic control tower to guide airplanes to terminals close to the connecting flights each passenger requires.

 2.    Improved Customer Experience. Everything’s getting smarter, including the boarding passes. Instead of printing everything only in Arabic and English, the analysis of big data information such as a person’s native language will result in better, readable, personalized boarding passes tailored to each individual.

 3.    Cost Reduction. An increase in customers, plus increased efficiency, plus an improved customer experience means that Dubai’s profits will soar. When the statistics are examined a year from now undoubtedly the money saved by not having to reroute passengers and pay for missed flights and hotel stays will be the final proof that big data analytics tools are transformative.

“I believe that technology will take center stage in the future of aviation,” Griffiths said. “Airports, for too long, have been considered just infrastructure businesses. Actually, we have a vital role to play in enabling a level of customer service that certain airlines have already got right in the air but some airports have let them down with on the ground.”

 

Category: Business Intelligence
No Comments »

by: RickDelgado
17 Aug 2015

Differences between Large and Small Companies Using BYOD

Regardless of company size, Bring-Your-Own-Device (BYOD) has become quite popular. According to Gartner, half of employers surveyed say they’re going to require workers to supply their own devices at work by 2017. Spiceworks did a similar study, finding about 61% of small to medium sized businesses were using a BYOD policy for employee devices. Businesses of all sizes are taking BYOD seriously, but are there differences in how large and small companies handle their policies?

Gaining experience is important in learning how to implement and manage a mobile device policy. Small companies are increasingly supporting smartphones and tablets, and companies with fewer than 20 employees are leading the way: Spiceworks found 69% of them supportive, compared with just 16% of employers with more than 250 employees.

 According to this study, small companies appear to be more flexible in adopting BYOD. There are certain aspects, however, where they may lag behind their larger counterparts. Here are some examples.

 Mobile Device Management

Larger corporations often have more resources available to implement Mobile Device Management (MDM) systems. Among the smaller businesses Spiceworks surveyed, 56% of respondents were not planning to use MDM, mainly because the company does not see a big enough threat: lost or stolen devices, or misuse by employees, are not regarded as substantial risks. Meanwhile, only 17% of the responding small businesses were engaging in active management, and just 20% said they would be within six months.

 The perks of MDM include barriers against data theft, intrusion, and unauthorized use and access. It also helps prevent malware infections.

 Larger businesses seem to be more understanding of the need for a proactive MDM system. They tend to possess more knowledge of the technology and the risks and face fewer budgetary hurdles. By comparison, many small companies lack knowledge, funds, and insight into the risks of connecting mobile devices to their network. Cloud-based MDM solutions are a growing alternative. The same Spiceworks study found 53% of respondents were going with a hosted device management solution.

Security

The risks are clearly great for a company of any size. A BYOD policy can swing both revenue and risk exposure by millions of dollars. Corporations usually have multiple layers of security; for a small business, it doesn’t take much to bring the company down. A single cyber-attack can be so costly that the company won’t survive.

Security, and the training that goes along with it, is costly for a small company, which might not be able to afford all the tools necessary for adequate protection. But whatever a company saves by skimping, a data breach will make those savings look like pennies: such events can cause millions of dollars in damages for even the smallest businesses.

Data leakage is another security risk beyond cost. Without a good MDM system, mobile devices are prone to data theft. Gartner highlights that mobile devices are designed to support data sharing but lack the application-level file systems of desktop platforms, which makes it easier for data to be duplicated and sent to applications in the cloud. IT must stay current with the latest technologies and their uses; larger companies, with their stronger security posture, clearly have the upper hand here.

 Conclusion

Both large and small companies are using BYOD. The differences lie in the willingness to adopt comprehensive Mobile Device Management systems and security policies, which come with costs that smaller businesses must wrestle with. It often comes down to weighing the daily cost of operating the policy against the cost of the risks. When a breach happens, a small business feels the pain and wishes it had had the right system in place. Cloud MDM systems are becoming more affordable, giving smaller entities the resources of larger organizations. Only time will tell whether small and medium-sized businesses become as accepting of mobile device security and management as larger organizations.

Category: Information Governance
No Comments »

by: Bsomich
30 Jul 2015

MIKE2.0 Community Update

Missed what’s been happening in the MIKE2.0 data management community this month? Read on!

 


Data Governance: How competent is your organization?

One of the key concepts of the MIKE2.0 Methodology is that of an Organisational Model for Information Development. This is an organisation that provides a dedicated competency for improving how information is accessed, shared, stored and integrated across the environment.

Organisational models need to be adapted as the organisation moves up the five maturity levels of the Information Development competencies below:

Level 1 Data Governance Organisation – Aware 

  • An Aware Data Governance Organisation knows that the organisation has issues around Data Governance but is doing little to respond to these issues. Awareness has typically come as the result of some major issues that have occurred that have been Data Governance-related. An organisation may also be at the Aware state if they are going through the process of moving to state where they can effectively address issues, but are only in the early stages of the programme.
Level 2 Data Governance Organisation – Reactive
  • A Reactive Data Governance Organisation is able to address some of its issues, but not until some time after they have occurred. The organisation is not able to address root causes or predict when issues are likely to occur. “Heroes” are often needed to address complex data quality issues, and the impact of fixes made at a system-by-system level is often poorly understood.
Level 3 Data Governance Organisation – Proactive
  • A Proactive Data Governance Organisation can stop issues before they occur, as it is empowered to address root-cause problems. At this level, the organisation also conducts ongoing monitoring of data quality so that issues that do occur can be resolved quickly.
Level 4 Data Governance Organisation – Managed
Level 5 Data Governance Organisation – Optimal

The MIKE2.0 Solution for the Centre of Excellence provides an overall approach to improving Data Governance through a Centre of Excellence delivery model for Infrastructure Development and Information Development. We recommend this approach as the most efficient and effective model for building this common set of capabilities across the enterprise environment.

Feel free to check it out when you have a moment and offer any suggestions you may have to improve it.

Sincerely,

MIKE2.0 Community

Contribute to MIKE:

Start a new article, help with articles under construction or look for other ways to contribute

Update your personal profile to advertise yourself to the community and interact with other members.

Useful Links:
Home Page
Login
Content Model
FAQs
MIKE2.0 Governance


Did You Know?
All content on MIKE2.0 and any contributions you make are published under the Creative Commons license. This allows you free re-use of our content as long as you add a brief reference back to us.

 

This Week’s Blogs for Thought:

Anxious About BYOD? Here are Some Tips for Success

Has your organization caved to the pressure of establishing a Bring Your Own Device (BYOD) policy and is now having second thoughts? Making company-wide policy changes and satisfying tech-savvy employees’ desires is just the beginning. Once BYOD is up and running, there are many challenges. The difference between success and failure means addressing key concerns and finding ways to overcome these issues.

Read more.

Are We Missing the Mark with Real-Time Marketing?

If any press is good press, then Totinos can chalk up its latest Super Bowl marketing antics for a win. However, it’s questionable whether the brand will gain any true business value from live-tweeting the game a day early. In fact, marketers should step back and consider whether our obsession with vanity metrics and viral campaigns is distracting us from the true potential of real-time and data-driven marketing.

Read more.

Don’t seek to know everything about your customer

I hate customer service surveys. Hotels and retailers spend millions trying to speed our checkout or purchase by helping us avoid having to wait around. Then they undo all of that good work by pestering us with customer service surveys which take longer than any queue that they’ve worked so hard to remove!
Perhaps I’d be less grumpy if all of the data that organisations spend so much time, much of it ours, collecting was actually applied in a way that provided tangible value. The reality is that most customer data simply goes to waste (I argue this in terms of “decision entropy” in chapter 6 of my book, Information-Driven Business).

Read more.

 

Forward to a Friend!

Know someone who might be interested in joining the MIKE2.0 Community? Forward this message to a friend.

Questions?

If you have any questions, please email us at mike2@openmethodology.org

 

 

Category: Information Development
No Comments »

by: RickDelgado
30 Jul 2015

Enterprise Storage: Keeping Up with the Data Deluge

The increasing demands found in data centers can be difficult for most people to keep up with. We now live in a world where data is being generated at an astounding pace, which has led to experts coining the phrase “big data.” All that generated data is also being collected, which creates even bigger demands for enterprise data storage. Consider all the different trends currently going around, from video and music streaming to the rise of business applications to detailed financial information and even visual medical records. It’s no wonder that storage demands have risen around 50 percent annually in the past few years, and there appears to be nothing on the horizon that will slow that growth. Companies have reason for concern as current data demands threaten to stretch their enterprise storage to its breaking point, but IT departments aren’t helpless in this struggle. This data deluge can be managed; all that’s needed are the right strategies and technologies to handle it.

It isn’t just the fact that so much new data needs to be stored, it’s that all the data should be stored securely while still allowing authorized personnel to access it efficiently. Combine that with the rapidly changing business environment where needs can evolve almost on a daily basis and the demands for an agile and secure enterprise storage system can overwhelm organizations. The trick is to construct infrastructure that can manage these demands. A well designed storage network can relieve many of the headaches that are generated when dealing with large amounts of data, but such a network requires added infrastructure support.

Luckily, IT departments have many options they can choose from that can meet the demands of the data deluge. One of the most popular at the moment is storage virtualization. This technology basically works by combining multiple network storage devices so that they appear to be only one unit. Choosing the components for a virtualized storage system, however, can be a tough decision for companies. Network attached storage (NAS), for instance, helps people within an organization access the same data at the same time. Storage area networks (SAN) help make planning and implementing storage solutions much easier. Both carry certain advantages over the more traditional direct-attach storage (DAS) deployments seen in many businesses. DAS simply comes with too many risks and downsides, making it a poor choice when confronting the current data challenges many companies face. Whether choosing NAS or SAN, both can simplify storage administration, an absolute must when storage management has become so complex. They also reduce the amount of hardware needed thanks to converged infrastructure technology.
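The “multiple devices appear as one unit” idea can be illustrated with a toy abstraction. Everything here (the class names and the allocation rule) is invented for illustration; real NAS/SAN virtualization lives in storage controllers and appliances, not application code.

```python
# Toy sketch of storage virtualization: several backing devices are
# presented to callers as a single logical volume with one capacity view.

class Device:
    def __init__(self, name: str, capacity_gb: int):
        self.name = name
        self.capacity_gb = capacity_gb
        self.used_gb = 0

class VirtualVolume:
    """Pools multiple devices behind a single logical unit."""
    def __init__(self, devices):
        self.devices = devices

    @property
    def capacity_gb(self) -> int:
        return sum(d.capacity_gb for d in self.devices)

    @property
    def free_gb(self) -> int:
        return sum(d.capacity_gb - d.used_gb for d in self.devices)

    def allocate(self, size_gb: int) -> str:
        # Place the allocation on the device with the most free space.
        target = max(self.devices, key=lambda d: d.capacity_gb - d.used_gb)
        if target.capacity_gb - target.used_gb < size_gb:
            raise ValueError("volume full")
        target.used_gb += size_gb
        return target.name
```

The caller only ever sees the pooled totals; which physical device actually holds the data is an internal decision, which is the essence of the virtualization layer.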

But these strategies aren’t the only ones companies can use to keep up with enterprise storage demands. Certain administrative tactics can be deployed to handle the growing volume and complexity of the current storage scene. Part of that strategy is avoiding certain mistakes, such as storing non-critical data on costly storage devices. There’s also the problem of storing too much: in some cases, business leaders ask IT workers to store multiple copies of information even when those copies aren’t needed. IT departments need to work closely with the business side of the company to devise the right strategy and avoid these unnecessary complications. Streamlining the process makes storage easier to manage.
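The multiple-copies problem above is one place where a common technique, content-addressed deduplication, helps: each unique blob is stored once, keyed by its hash, and duplicates become cheap references. This is a minimal sketch of the idea, not any particular vendor’s deduplication engine.

```python
import hashlib

def dedup_store(blobs):
    """Store each unique blob once, keyed by its SHA-256 digest.

    Returns (store, refs): `store` maps digest -> blob, and `refs`
    maps each input position to the digest of its content, so
    duplicate inputs share one stored copy.
    """
    store, refs = {}, []
    for blob in blobs:
        digest = hashlib.sha256(blob).hexdigest()
        store.setdefault(digest, blob)   # keep only the first copy
        refs.append(digest)
    return store, refs
```

For example, storing the same report twice alongside a summary keeps only two physical blobs while preserving three logical references.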

Other options are also readily available to meet enterprise storage demands. Cloud storage, for example, has quickly become mainstream and comes with attractive advantages, such as easy scalability when businesses need it and the ability to access data from almost anywhere. Concerns over data security have made some businesses reluctant to adopt the cloud, but many cloud storage vendors are trying to address those worries with greater emphasis on security features. Hybrid storage solutions are also taking off in popularity in part because they mix many of the advantages found in other storage options.

With the demands large amounts of data are placing on enterprise storage, IT departments are searching for the answers that can help them keep up with these challenges. The options are there that help meet these demands, but it’s up to companies to fully deploy those solutions. Data continues to be generated at a breakneck pace, and that trend won’t be slowing down anytime soon. It’s up to organizations to have the right strategies and technology in place to take full advantage of this ongoing data deluge.

 

Category: Enterprise Data Management
No Comments »

by: Robert.hillard
25 Jul 2015

Don’t seek to know everything about your customer

I hate customer service surveys. Hotels and retailers spend millions trying to speed our checkout or purchase by helping us avoid having to wait around. Then they undo all of that good work by pestering us with customer service surveys which take longer than any queue that they’ve worked so hard to remove!

Perhaps I’d be less grumpy if all of the data that organisations spend so much time, much of it ours, collecting was actually applied in a way that provided tangible value. The reality is that most customer data simply goes to waste (I argue this in terms of “decision entropy” in chapter 6 of my book, Information-Driven Business).

Customer data is expensive

Many years ago, I interviewed a large bank about their data warehouse. It was the 1990s and the era of large databases was just starting to arrive. The bank had achieved an impressive feat of engineering by building a huge repository of customer data, although they admitted it had cost a phenomenal sum of money to build.

The project was a huge technical success overcoming so many of the performance hurdles that plagued large databases of the time. It was only in the last few minutes of the interview that the real issue started to emerge. The data warehouse investment was in vain, the products that they were passionate about taking to their customers were deliberately generic and there was little room for customisation. Intimate customer data was of little use in such an environment.

Customer data can be really useful but it comes at a cost. There is the huge expense of maintaining the data and there is the good will that you draw upon in order to collect it. Perhaps most importantly, processes to identify a customer and manage the relationship add friction to almost every transaction.

Imagine that you own a clothing or electrical goods store. From your vantage point behind the counter you see a customer run up to you with cash in one hand and a product in the other. They look like they’re in a hurry and thrust the cash at you. Do you a) take the cash and thank them; or b) ask them to stop before they pay and register for your loyalty programme, often including a username and password? It’s obvious you should go with option a, yet so many retailers go with option b. At least online businesses have the excuse that they can’t see the look of urgency and frustration in their customers’ eyes; it is impossible to fathom why so many bricks-and-mortar stores make the same mistake!

Commoditised relationships aren’t bad

Many people argue that Apple stores are close to best practice when it comes to retail, yet for most of the customer interaction the store staff member doesn’t know anything about the individual’s identity. It is not until the point of purchase that they actually access any purchase history. The lesson is that if the service is commoditised it is better to avoid cluttering the process with extraneous information.

Arguably the success of discount air travel has been the standardisation of the experience. Those who spend much of their lives emulating the movie Up in the Air want to be recognised. For the rest of the population, who just want to get to their destination at the lowest price possible while keeping a small amount of comfort and staying safe, a commoditised service is ideal. Given the product is not customised there is little need to know much about the individual customers. Aggregate data for demand forecasting can often be gained in more efficient ways including third party sources.

Do more with less

Online and in person, organisations are collecting more data than ever about their customers. Many of these organisations would do better to focus on a few items of data and build true relationships by understanding everything they can from these small number of key data elements. I’ve previously argued for the use of a magic 150 or “Dunbar’s number” (see The rule of 150 applied to data). If they did this, not only would they be more effective in their use of their data, they could also be more transparent about what data they collect and the purposes to which they put it.

People increasingly have a view of the value of their information and they often end up resenting its misuse. Perhaps the only thing worse than misusing it is not using it at all. So much information is collected that then causes resentment when the customer doesn’t get the obvious benefit that should have been derived. Nothing frustrates people more than having to tell their providers things that are obvious from the information they have already been asked for, such as their interests, family relationships or location.

Organisations that don’t heed this will face a backlash as people seek to regain control of their own information (see You should own your own data).

Customers value simplicity

In this age of complexity, customers are often willing to trade convenience for simplicity. Many people are perfectly happy to be a guest at the sites they use infrequently, even though they have to re-enter their details each time, rather than having to remember yet another login. They like these relationships to be cheerfully transactional and want their service providers to respect them regardless.

The future is not just more data, it is more tailored data with less creepy insight and a greater focus on a few meaningful relationships.

Category: Enterprise Data Management, Enterprise2.0, Information Development, Master Data Management
No Comments »

by: RickDelgado
21 Jul 2015

Anxious About BYOD? Here are Some Tips for Success

Has your organization caved to the pressure of establishing a Bring Your Own Device (BYOD) policy and is now having second thoughts? Making company-wide policy changes and satisfying tech-savvy employees’ desires is just the beginning. Once BYOD is up and running, there are many challenges. The difference between success and failure means addressing key concerns and finding ways to overcome these issues.

 Mobile Device Management

 Security is undoubtedly the most pressing concern with BYOD. Even with a sound policy, the rapidly shifting security landscape is a challenge. The constant updating of devices is, too. You must constantly adapt your threat defenses and corporate policies. Mobile Device Management (MDM) provides many benefits, including a centralized view of data stored on devices. There are many cases of unhappy employees misusing sensitive information or hackers accessing vulnerable mobile networks. The safest approach is when administrators can see the first signs of a breach and take action.

An MDM system provides access control and monitoring of corporate data. Information on a stolen or lost device can be erased immediately. Mobile apps have created challenges of their own: many collect personal data and store it in the cloud. An important feature to look for is Mobile Application Management, which keeps track of all the apps on your mobile network and can even block ones known to be particularly risky.
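The app-inventory side of Mobile Application Management can be sketched as a simple blocklist check. The app names and the blocklist below are invented for the example; a real MAM product maintains its own threat intelligence feeds.

```python
# Hypothetical sketch of a Mobile Application Management inventory check:
# flag any installed app that appears on a risk blocklist.

RISKY_APPS = {"freeflashlight", "wallpaper-plus"}  # invented blocklist

def flag_risky(installed_apps):
    """Return the installed apps (original names) that match the blocklist,
    compared case-insensitively."""
    return sorted(app for app in installed_apps if app.lower() in RISKY_APPS)
```

An administrator dashboard would run this per enrolled device and surface any non-empty result for action.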

 Vendor Managed Services

Not every company employs all the talent it needs. A cost-effective way to offset this imbalance is to pursue vendor managed services. Consulting organizations have emerged in the mobile era that employ the technology, tools, and methods to efficiently manage data. DataXoom, a mobile virtual network operator, provides MDM, asset management, and even assistance with procuring the best hardware and software. The ultimate goal is to control the financial cost of BYOD and to manage the data stored on, and accessed by, mobile devices.

Stay Compliant

Compliance with the latest standards is essential for keeping BYOD in your company. The Payment Card Industry Data Security Standard 3.0 is one you should be following. It provides guidelines and testing procedures related to building a secure network, protecting cardholder data, and implementing effective access control. Also covered are monitoring and testing and maintaining an information security policy that includes all devices, systems, and personnel. The PCI DSS 3.0 standard is also a guideline for internal and external auditors.

Fine Tune Your Policy

A BYOD policy isn’t static. It needs to adapt to changing security risks and company requirements. For the policy to work, you need to identify what devices are permitted on the network, and control information access down to the individual device. Administrators also need to think about password complexity, screen locking, and other security measures.

Other elements of your policy should outline how technical support operates. Also include permitted apps and rules for acceptable websites, materials, and how all of these are monitored. In addition to governing usage, your leaders should also have a plan for what happens when an employee leaves the company. Do they return the phone or do you just remove access to email, company apps, and data?
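Policy elements like permitted devices, password complexity, and screen locking translate naturally into an automated compliance check. The policy values below are illustrative only, not drawn from any real standard or product.

```python
# Hedged sketch of how a written BYOD policy might become code:
# each rule in the policy becomes one check against a device record.
# All thresholds and field names here are hypothetical.

POLICY = {
    "allowed_platforms": {"ios", "android"},
    "min_passcode_len": 6,
    "require_screen_lock": True,
}

def compliance_issues(device: dict) -> list:
    """Return human-readable policy violations for one device record."""
    issues = []
    if device["platform"] not in POLICY["allowed_platforms"]:
        issues.append("platform not permitted")
    if device["passcode_len"] < POLICY["min_passcode_len"]:
        issues.append("passcode too short")
    if POLICY["require_screen_lock"] and not device["screen_lock"]:
        issues.append("screen lock disabled")
    return issues
```

A device with an empty issue list would be granted network access; anything else would be quarantined until the owner remediates.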

Some organizations have instead adopted a Choose Your Own Device (CYOD) policy: users are issued corporate-owned devices, and may or may not get to pick from a list of approved products. This gives the company more control over compliance and security, while it pays all costs related to the device.

What about Privacy?

Today’s employees have been outspoken about their right to keep personal data on the same device as their work. The challenge is that businesses must protect their mobile networks against unauthorized use. Employer access rules have drawn controversy amongst IT policy drafters: while work-related data could be subject to legal investigations down the road, personal information would be exposed as well. The level of control over personal data has been less than ideal for many workers. Yet privacy matters still need to be addressed.

Conclusion

These are just a few of the main issues regarding corporate BYOD. Implementing the policy takes work, and continual monitoring and adjustment are required to keep a mobile device policy successful. That means your company and stakeholders must adjust to change. Security challenges, compliance requirements, employee sentiment, and the devices themselves will certainly be in flux in the years to come.

Category: Enterprise Data Management
No Comments »

by: Jonathan
09 Jul 2015

Are We Missing the Mark with Real-Time Marketing?

If any press is good press, then Totinos can chalk up its latest Super Bowl marketing antics for a win. However, it’s questionable whether the brand will gain any true business value from live-tweeting the game a day early. In fact, marketers should step back and consider whether our obsession with vanity metrics and viral campaigns is distracting us from the true potential of real-time and data-driven marketing.

 

Mainstream real-time marketing

 

In short, real-time marketing normally refers to the practice of engaging audiences with content that is relevant to a specific current event. For most brands, this content often takes the shape of “memes” shared through social media channels.

 

While Totinos’ day-early tweets were revealed to be a gimmick, many initially thought the company had made a significant real-time gaffe. Pre-written tweets that strain to be clever reflect organizations’ desire to streamline their marketing with a pre-determined formula. Better brands understand that real-time marketing has to be organic and grounded in an understanding of the target market. Oreo’s real-time tweet during the power outage of Super Bowl XLVII was held up as a genius example of real-time marketing. While brand engagement certainly has its place, true real-time marketing that has a long-term impact on ROI is much less sexy than a clever tweet in front of a large audience.

 

Where real-time marketing started

 

While many might associate real-time marketing with the rise of big data analytics and social media, the term rose to popularity well before social media marketing and data collection took off. In fact, the term first surfaced back in 2005. Back then it wasn’t about “memes” and on-the-spot tweets, but instead web personalization.

 

Initially, big brands wanted to find ways to personalize their website experiences in real time. However, the technology and software weren’t at that level, and the available solutions were often expensive and underwhelming. This eventually led to customized email marketing approaches and other methods, while web personalization was put on the back burner. Fast forward to today, and that element has all but expired, with more effort being placed on social media.

 

Missing the mark and taking the easy road

 

This is precisely where most organizations are missing the mark. Sure, Oreo’s Super Bowl tweet was amazing and produced a tremendous amount of engagement, but as mentioned earlier, real-time marketing isn’t designed for engagement. It’s meant for finding ways to create substantial long-term impacts on ROI. But that’s hard, just as it was in 2005 with web personalization. People would rather take the easy way out and point to massive amounts of social impressions instead of using data and real-time analysis to produce more value in other areas. That needs to change.

 

Where can real-time marketing be implemented?

 

There are a number of different marketing approaches that stand to benefit from a real-time approach. Here are just a few examples to get your creative juices flowing.

 

  1. Customized landing pages

 

What was difficult back in 2005 is becoming a lot easier today. E-commerce sites have made the most of this by allowing users to create personal profiles and then recommending items based on their searches in real time. This may be more difficult for other sites, but it is not impossible. Creating personalized landing pages based on the device used or on user preferences is becoming increasingly common. Real-time capabilities allow programs to make these changes on the fly, reacting to clicks and searches almost instantly.
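As a minimal sketch of the idea, the variant names and selection rules below are entirely hypothetical; a real system would drive them from an analytics feed rather than hard-coded rules:

```python
# Hypothetical real-time landing-page selection: pick a page variant
# from the visitor's device type and their most recent on-site search.
def choose_variant(device, recent_searches):
    # Device determines the base layout served.
    base = "mobile-lite" if device == "mobile" else "desktop-full"
    # React to the latest search in real time: surface a matching hero section.
    if recent_searches and "shoe" in recent_searches[-1].lower():
        return base + "/footwear-hero"
    return base + "/default-hero"

print(choose_variant("mobile", ["running shoes"]))   # mobile-lite/footwear-hero
print(choose_variant("desktop", []))                 # desktop-full/default-hero
```

The point is less the rules themselves than the shape: the decision runs on every request, using the freshest signals available.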

 

  2. Location-based marketing

 

Thanks to mobile technology, primarily smartphones and their built-in location services, marketers can tailor messages by area like never before. For example, if users are near a Walmart or Target, promotions could be pushed to their devices via notifications. Marketers can also use the technology to see where users shop most often, or use in-store beacons to attract shoppers. By utilizing real-time capabilities, marketers can craft individualized offers and have them activated at the right moment, when users are likely to act.
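The proximity check behind this kind of push is essentially a geofence: compare the device’s reported coordinates against store locations and trigger when within a radius. A rough sketch, with store names and coordinates that are purely illustrative:

```python
# Illustrative geofence check using the haversine great-circle distance.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Distance in km between two (lat, lon) points on the Earth's surface."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def nearby_stores(user_lat, user_lon, stores, radius_km=0.5):
    """Return the stores within radius_km of the user's reported location."""
    return [name for name, (lat, lon) in stores.items()
            if haversine_km(user_lat, user_lon, lat, lon) <= radius_km]

stores = {"Target #1123": (41.8902, -87.6262),
          "Walmart #301": (41.8786, -87.6251)}
print(nearby_stores(41.8895, -87.6270, stores))  # → ['Target #1123']
```

In production the same check would run server-side against a stream of location updates, with the notification service handling the actual push.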

 

  3. Multi-channel marketing

 

The path to making a purchase is becoming increasingly complex. In times past, it simply involved a trip to the local store or maybe a catalogue. Today, it often includes visiting websites, browsing social media accounts, and viewing mobile sites. To meet these demands, brands are forced to customize their approach for each channel, tailoring marketing efforts to each while still maintaining a seamless experience as users jump from one to the other. Real-time analytics can provide organizations with constant details about which channels their customers are using, and what they will respond to best, in order to increase conversion rates.
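At its simplest, the routing decision this describes is just "send the next message down the channel this customer currently converts on best." A toy sketch, with channel names and rates invented for illustration:

```python
# Hypothetical channel router: given per-channel conversion rates
# (as they might arrive from a real-time analytics feed), pick the
# channel the customer currently responds to best.
def best_channel(conversion_rates):
    """Return the channel name with the highest conversion rate."""
    return max(conversion_rates, key=conversion_rates.get)

rates = {"email": 0.021, "mobile_push": 0.034, "social": 0.012}
print(best_channel(rates))  # mobile_push
```

A real system would weigh more than raw conversion rate (fatigue caps, time of day, message cost), but the core loop is the same: measure continuously, route accordingly.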

 

Category: Information Development
No Comments »

by: Bsomich
28  Jun  2015

MIKE2.0 Community Update

Interested in better data management? See what’s been happening in the MIKE2.0 community this month:

 

  

Have you seen our Open MIKE Series? 

The Open MIKE Podcast is a video podcast that discusses aspects of the MIKE2.0 framework and features content contributed to MIKE2.0 wiki articles, blog posts, and discussion forums.


For more information on MIKE2.0 or how to get involved with our online community, please visit www.openmethodology.org.

Sincerely,

MIKE2.0 Community  

Contribute to MIKE: Start a new article, help with articles under construction, or look for other ways to contribute.

Update your personal profile to advertise yourself to the community and interact with other members.

Useful Links:
Home Page
Login
Content Model
FAQs
MIKE2.0 Governance


Did You Know?
All content on MIKE2.0 and any contributions you make are published under the Creative Commons license. This allows you free re-use of our content as long as you add a brief reference back to us.

 

This Week’s Blogs for Thought:

Key Considerations for a Big Data Strategy

Most companies by now understand the inherent value found in big data. With more information at their fingertips, they can make better decisions regarding their businesses. That’s what makes the collection and analysis of big data so important today. Any company that doesn’t see the advantages that big data brings may quickly find themselves falling behind their competitors.

Read more.

The Internet was a mistake. Now let’s fix it. 

Each generation over the last century has seen new technologies become so embedded in their lives that their absence would be unimaginable.  Early in the 20th century it was radio, which quickly became the entertainment of choice, then television, video and, over the past two decades, the Internet. For the generation who straddles the implementation of each, there have been format and governance debates which are quickly forgotten.  Today, few remember the colour television format choice every country made between NTSC and PAL, just as few recall that anyone who bought a video recorder in the early 1980s had to choose between VHS and Beta.

Read more.

Flash Quietly Taking Over Disk in a Big Data World

Right now, we live in the big data era. What was once looked at as a future trend is now very much our present reality. Businesses and organizations of all shapes and sizes have embraced big data as a way to improve their operations and find solutions to longstanding problems. It’s almost impossible to overstate just how much big data has impacted the world in such a short amount of time, affecting everyone’s lives whether or not we truly comprehend how.

Read more.

Forward to a Friend!

Know someone who might be interested in joining the MIKE2.0 community? Forward this message to a friend.

Questions?

If you have any questions, please email us at mike2@openmethodology.org

 

 

Category: Information Development
No Comments »

by: Robert.hillard
20  Jun  2015

The Internet was a mistake, now let’s fix it

Each generation over the last century has seen new technologies become so embedded in their lives that their absence would be unimaginable.  Early in the 20th century it was radio, which quickly became the entertainment of choice, then television, video and, over the past two decades, the Internet.

For the generation who straddles the implementation of each, there have been format and governance debates which are quickly forgotten.  Today, few remember the colour television format choice every country made between NTSC and PAL, just as few recall that anyone who bought a video recorder in the early 1980s had to choose between VHS and Beta.

It is ironic that arguably the biggest of these technologies, the Internet, has been the subject of the least debate on the approach to governance, standards and implementation technology.

Just imagine a world where the Internet hadn’t evolved in the way it did.  Arguably the connectivity that underpins the Internet was inevitable.  However, the decision to arbitrarily open up an academic network to commercial applications undermined well-progressed private-sector offerings such as AOL and Microsoft’s MSN.

That decision changed everything and I think it was a mistake.

While the private sector offerings were fragmented, they were well governed and with responsible owners.

Early proponents of the Internet dreamed of a virtual world free of any government constraints.  Perhaps they were influenced by the end of the Cold War.  Perhaps they were idealists.  Either way, the dream of a virtual utopia has turned into an online nightmare which every parent knows isn’t safe for their children.

Free or unregulated?

The perception that the Internet is somehow free, in a way that traditional communications and sources of information are not, is misguided.

Librarians have long had extraordinary codes of conduct to protect the identity of borrowers from government eyes.  Compare that to the obligation in many countries to track metadata and the access that police, security agencies and courts have to the online search history of suspects.

Telephone networks have always been open to tapping, but the closed nature of the architecture meant that those tap points are governed and largely under the supervision of governments and courts.  Compare that to the Internet, which theoretically allows individuals to communicate confidentially with little chance of interception, but only if you are one of the privileged few with adequate technical skill.  The majority of people, though, have to assume that every communication, whether voice, text or video, is open to interception.

Time for regulation

We need government in the real world and we should look for it on the Internet.

The fact that it is dangerous to connect devices directly to the internet without firewalls and virus protection is a failure of every one of us who is involved in the technology profession.  The impact of the unregulated Internet on our children and the most vulnerable in our society reflects poorly on our whole generation.

It is time for the Internet to be properly regulated.  There is just too much risk, and (poor) regulation is being put in place by stealth anyway.  Proper regulation and security would add a layer of protection for all users.  It wouldn’t remove all risk, but even the humble telephone has long been used as a vehicle for scams; there, however, remedies have been easier to achieve and law enforcement more structured.

The ideal of the Internet as a vehicle of free expression need not be lost and in fact can be enhanced by ethically motivated governance with the principle of free speech at its core.

Net neutrality is a myth

Increasing the argument for regulation is the reality of the technology behind the Internet.  Most users assume the Internet is a genuinely flat virtual universe where everyone is equal.  In reality, the technology of the Internet is nowhere near the hyperbole.  Net neutrality is a myth and we are still very dependent on what the Internet Service Providers (ISPs) or telecommunications companies do from an architecture perspective (see The architecture after cloud).

Because the Internet is not neutral, there are winners and losers just as there are in the real world.  The lack of regulation means that they come up with their own deals and it is simply too complicated for consumers to work out what it all means for them.

Regulation can solve the big issues

The absence of real government regulation of the Internet is resulting in a “Wild West” and an almost vigilante response.  There is every probability that current encryption techniques will be cracked in years to come, making it dangerous to transmit information that could be embarrassing in the future.  This is leading to investment in approaches such as quantum cryptography.

In fact, with government regulation and support, mathematically secure communication is eminently possible.  Crypto theory says that a message encrypted with a truly random key as long as the message itself cannot be broken without a copy of the key.  Imagine a world where telecommunication providers working under appropriate regulations issued physical media similar to passports containing sufficient random digital keys to transmit all of the sensitive information a household would share in a year or even a decade.
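The scheme being described is the classic one-time pad: XOR each byte of the message with a byte of a truly random key that is used once and never reused. A minimal sketch (key distribution, the whole point of the "physical media" idea above, is deliberately out of scope here):

```python
# One-time pad sketch: XOR a message byte-by-byte with a truly random
# key at least as long as the message. Information-theoretically secure
# only if the key is random, secret, and never reused.
import secrets

def otp_encrypt(message, key):
    assert len(key) >= len(message), "key must be at least as long as the message"
    return bytes(m ^ k for m, k in zip(message, key))

# XOR is its own inverse, so decryption is the same operation.
otp_decrypt = otp_encrypt

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # cryptographically random, used once
ciphertext = otp_encrypt(message, key)
assert otp_decrypt(ciphertext, key) == message
```

Without the key, every plaintext of the same length is equally consistent with the ciphertext, which is why no amount of future computing power helps an interceptor.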

We would effectively be returning to the model of traditional phone services where telecommunication companies managed the confidentiality of the transmission and government agencies could tap the conversations with appropriate (and properly regulated) court supervision.

Similarly, we would be mirroring the existing television and film model of rating all content on the Internet allowing us to choose what we want to bring into our homes and offices.  Volume is no challenge with an army of volunteers out there to help regulators.

Any jurisdiction can start

Proper regulation of the Internet does not need to wait for international consensus.  Any one country can kick things off with almost immediate benefit.  As soon as sufficient content is brought into line, residents of that country will show more trust towards local providers, which will naturally keep a larger share of commerce within their domestic economy.

If it is a moderately large economy then the lure of frictionless access to these consumers will encourage international content providers to also fall into line given the cost of compliance is likely to be negligible.  As soon as that happens, international consumers will see the advantage of using this country’s standards as a proxy for trust.

Very quickly it is also likely that formal regulation in one country will be leveraged by governments in others.  The first mover might even create a home-grown industry of regulation as well as supporting processes and technology for export!

Category: Information Governance
No Comments »
