Open Framework, Information Management Strategy & Collaborative Governance | Data & Social Methodology - MIKE2.0 Methodology
by: Bsomich
30  Aug  2014

MIKE2.0 Community Update

Missed what’s been happening in the MIKE2.0 Community? Check out our bi-weekly update:


Have you seen our Open MIKE Podcast series? 

The Open MIKE Podcast is a video podcast show, hosted by Jim Harris, which discusses aspects of the MIKE2.0 framework, and features content contributed to MIKE 2.0 Wiki Articles, Blog Posts, and Discussion Forums.

We kindly invite any existing MIKE contributors to contact us if they’d like to contribute any audio or video segments for future episodes.

Check it out, and view our community overview video for more information on how to get involved with MIKE2.0.

On Twitter? Contribute and follow the discussion via the #MIKEPodcast hashtag.

Sincerely,

MIKE2.0 Community

New! Popular Content

Did you know that the following wiki articles are most popular on Google? Check them out, and feel free to edit or expand them!

What is MIKE2.0?
Deliverable Templates
The 5 Phases of MIKE2.0
Overall Task List
Business Assessment Blueprint
SAFE Architecture
Information Governance Solution

Contribute to MIKE:

Start a new article, help with articles under construction or look for other ways to contribute.

Update your personal profile to advertise yourself to the community and interact with other members.

Useful Links:
Home Page
Login
Content Model
FAQs
MIKE2.0 Governance


This Week’s Blogs for Thought:

RDF PROVenance Ontology

This post features a special 3-part focus on the RDF PROVenance Ontology, a Recommendation from the World Wide Web Consortium that is expected to be a dominant interchange standard for enterprise metadata in the immediate future, replacing the Dublin Core. Following an introduction, reviews and commentary about the ontology’s RDF classes and RDF properties should stimulate lively discussion of the report’s conclusions.

Read more. 
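
To make the ontology’s building blocks concrete, here is a minimal, purely illustrative sketch (not taken from the post) that declares the three core PROV-O classes, prov:Entity, prov:Activity and prov:Agent, and links them with the basic provenance properties using Python’s rdflib. The resource names are invented placeholders.

    # Minimal PROV-O sketch using rdflib (illustrative resource names).
    from rdflib import Graph, Namespace
    from rdflib.namespace import RDF

    PROV = Namespace("http://www.w3.org/ns/prov#")
    EX = Namespace("http://example.org/")  # placeholder namespace

    g = Graph()
    g.bind("prov", PROV)
    g.bind("ex", EX)

    # An entity (a report), the activity that produced it, and the responsible agent.
    g.add((EX.quarterlyReport, RDF.type, PROV.Entity))
    g.add((EX.compileReport, RDF.type, PROV.Activity))
    g.add((EX.analyst, RDF.type, PROV.Agent))

    # Core PROV properties linking them.
    g.add((EX.quarterlyReport, PROV.wasGeneratedBy, EX.compileReport))
    g.add((EX.quarterlyReport, PROV.wasAttributedTo, EX.analyst))
    g.add((EX.compileReport, PROV.wasAssociatedWith, EX.analyst))

    print(g.serialize(format="turtle"))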

Are There Alternatives to Data Hoarding? 

“In a microsecond economy,” Becca Lipman recently blogged, “most data is only useful in the first few milliseconds, or to an extent, hours after it is created. But the way the industry is collecting data, or more accurately, hoarding it, you’d think its value lasts a lifetime. Yes, storage costs are going down and selecting data to delete is no easy task, especially for the unstructured and unclassified sets. And fear of deleting something that could one day be useful is always going to be a concern. But does this give firms the go-ahead to be data hoarders?”

Read more.

Analogue Business with a Digital Facade

Everyone is talking about digital disruption and the need to transform their company into a “digital business”.  However, ask most people what a digital business is and they’ll talk in terms of online shopping, mobile channels or the latest wearable device.

In fact we have a long tradition of using the word “digital” as a prefix to new concepts while we adapt.  Some examples include the wide introduction from the 1970s of the “digital computer”, a term which no longer needs the digital prefix. Read more.

Category: Information Development

by: Ocdqblog
27  Aug  2014

Are there alternatives to Data Hoarding?

“In a microsecond economy,” Becca Lipman recently blogged, “most data is only useful in the first few milliseconds, or to an extent, hours after it is created. But the way the industry is collecting data, or more accurately, hoarding it, you’d think its value lasts a lifetime. Yes, storage costs are going down and selecting data to delete is no easy task, especially for the unstructured and unclassified sets. And fear of deleting something that could one day be useful is always going to be a concern. But does this give firms the go-ahead to be data hoarders?”

Whether we choose to measure it in terabytes, petabytes, exabytes, HoardaBytes, or how harshly reality bites, we have been hoarding data long before the data management industry took a super-sized tip from McDonald’s and put the word “big” in front of its signature sandwich. At least McDonald’s started phasing out their super-sized menu options in 2004, stating the need to offer healthier food choices, a move that was perhaps motivated in some small way by the success of Morgan Spurlock’s Academy Award-nominated documentary film Super Size Me.

Much like fast food is an enabler for our chronic overeating and the growing epidemic of obesity, big data is an enabler for our data hoarding compulsion and the growing epidemic of data obesity.

Does this data smell bad to you?

Are there alternatives to data hoarding? Perhaps we could put an expiration date on data, after which we could at least archive, if not delete, the expired data. One challenge with this approach is that with most data you cannot say exactly when it will expire. Even if we could, however, expiration dates for data might be as meaningless as the expiration dates we currently have for food.
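
Mechanically, such an expiration policy is trivial to sketch; the hard part, as discussed below, is choosing the date. The following is a minimal illustration in Python, with invented paths and an arbitrary 180-day retention window, that archives rather than deletes anything past its window.

    # Minimal sketch of an age-based archiving sweep (the paths and the
    # 180-day retention window are illustrative assumptions, not advice).
    import shutil
    import time
    from pathlib import Path

    DATA_DIR = Path("/data/landing")      # hypothetical source directory
    ARCHIVE_DIR = Path("/data/archive")   # hypothetical archive directory
    RETENTION_DAYS = 180                  # the "expiration date", chosen arbitrarily

    def archive_expired(data_dir: Path, archive_dir: Path, retention_days: int) -> None:
        cutoff = time.time() - retention_days * 86400
        archive_dir.mkdir(parents=True, exist_ok=True)
        for path in data_dir.rglob("*"):
            # Archive rather than delete anything not touched within the window.
            if path.is_file() and path.stat().st_mtime < cutoff:
                shutil.move(str(path), str(archive_dir / path.name))

    if __name__ == "__main__":
        archive_expired(DATA_DIR, ARCHIVE_DIR, RETENTION_DAYS)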

As Rose Eveleth reported, “these dates are—essentially—made up. Nobody regulates how long milk or cheese or bread stays good, so companies can essentially print whatever date they want on their products.” Eveleth shared links to many sources that basically recommended ignoring the dates and relying on seeing if the food looks or smells bad.

What about regulatory compliance?

Another enabler of data hoarding is concern about complying with future regulations. This is somewhat analogous to income tax preparation in the United States, where many people hoard boxes of receipts for everything in hopes of using them to itemize their tax deductions. Even though most of the contents are deemed irrelevant when filing an official tax return, some people still store the boxes in their attic just in case of a future tax audit.

How useful would data from this source be?

Although calculating the half-life of data has always been problematic, Larry Hardesty recently reported on a new algorithm developed by MIT graduate student Dan Levine and his advisor Jonathan How. By using the algorithm, Levine explained, “the usefulness of data can be assessed before the data itself becomes available.” Similar algorithms might be the best future alternative to data hoarding, especially when you realize that the “keep it just in case we need it” theory often later faces the “we can’t find it amongst all the stuff we kept” reality.

What Say You?

Have you found alternatives to data hoarding? Please share your thoughts by leaving a comment below.

 

Category: Enterprise Data Management, Information Governance

by: Robert.hillard
23  Aug  2014

Analogue business with a digital facade

Everyone is talking about digital disruption and the need to transform their company into a “digital business”.  However, ask most people what a digital business is and they’ll talk in terms of online shopping, mobile channels or the latest wearable device.

In fact we have a long tradition of using the word “digital” as a prefix to new concepts while we adapt.  Some examples include the wide introduction from the 1970s of the “digital computer”, a term which no longer needs the digital prefix.  Similarly the “digital mobile phone” replaced its analogue equivalent in the 1990s, introducing security and many new features including SMS.  The scandals caused when radio hams listened in on analogue calls made by politicians seem like a distant memory!

The term digital really just refers to the use of ones and zeros to describe something in a way that is exactly repeatable rather than an inexact analogue stream.  Consider, for instance, the difference between the ones and zeros used to encode the music in an audio file, which will replay with the same result now and forever in the future, and the physical fluctuations in the groove of an old vinyl record, which will gradually degrade.  Not only is the audio file more robust, it is also more readily able to be manipulated into new and interesting combinations.

Digital business

So it is with digital business.  Once the economy has successfully made the transition from analogue to digital, it will be natural for all business to be thought of in this way.  Today, however, too many people put a website and mobile app over the top of their existing business models and declare the job done.

Their reason for doing this is that they don’t actually understand what a digital business is.

Digital business separates its constituent parts to create independent data and processes which can then be rapidly assembled in a huge number of new and innovative ways.  The move to digital business is actually just a continuation of the move to the information economy.  We are, in fact, moving to the Information-Driven Business that puts information rather than processes at its core.

Airlines are a good example.  Not that many years ago, the process of ticketing through to boarding a flight was analogue, meaning that each step led to the next and could not be separated.  Today purchasing, ticketing and boarding a flight are completely independent and can each use completely different processes and digital technology without impacting each other.  Passenger handling for airlines is now a digital business.

What this means is that third parties or competing internal systems can work on an isolated part of the business and find new ways of adding value.  For retailers this means that the pictures and information supporting products are independent of the website that presents them and certainly the payment processes that facilitate customer transactions.  A digital retailer has little trouble sharing information with new logistics, payments and mobile providers to quickly develop more efficient or new routes to market.

The digital facade

In the 1970s and 1980s businesses were largely built up using thousands of processes.  Over time, automation has allowed these numbers to explode.  When processes required clerical functions, the number of options was limited by available labour.  With automation, every issue can apparently be solved by adding a process.

Where digital business is about breaking activities up into discrete parts which can be reassembled, analogue business tends to be made up of processes which are difficult or impossible to break apart.

The problem with this is that the organisation becomes a maze of processes.  They become increasingly interdependent, to the point where it is impossible to break them apart.

Many businesses have put mobile and web solutions over the top of this maze.  While the result can look fantastic, it doesn’t take long before the wheels fall off.  Customers experience inconsistent product, delivery or price options depending on whether they ring the call centre or look online.  They find that the promised stock is no longer available because the warehouse processes are not properly integrated with the online store.  Without seamless customer information, they aren’t addressed with the same premium privileges or priority across all channels.

In many cases, the process maze means that the addition of a digital façade can do more harm than good.

Ironically, the reverse model of having a truly digital business with an analogue interface to the customer is not only valid but often desirable.  Many customers, particularly business clients, are themselves dependent upon complex processes in their own organisations and it makes perfect sense to work with them the way that they want to work.  An airline that is entirely digital in its core can still deal with corporate travel agents who operate in an analogue world.

Digital transformation from the inside out

I have previously argued that the technology infrastructure that supports digital business, cloud, is actually the foundation for business transformation (see Cloud computing should be about new business models).

Every twenty-first century business needs to define itself in terms of its information assets and differentiated intellectual property.  Business transformation should focus on applying these to the benefit of all of its stakeholders including customers, staff, suppliers and shareholders.

Starting from the core and building a new enterprise around discrete, digital, business capabilities is a big exercise.  The alternative, however, is to risk being pulled under in the long-term by the weight of complexity.

No matter how many extra interfaces, interim processes or quick fixes are put in place, any business that fails in this transformation challenge will ultimately be seen as offering no more than a digital façade on a fading analogue business.

Category: Enterprise2.0, Information Development, Web2.0

by: Gil Allouche
20  Aug  2014

Lessons from How Amazon Uses Big Data

Entering the field of big data is no cake walk. Due to the many nuances of big data architecture, combined with the fact that it’s so new, relatively speaking, it can be overwhelming for many executives to implement. Or, if it isn’t overwhelming, there’s simply a lack of experience and understanding, which all too often leads management to use big data inefficiently.

One of the best ways for companies to learn about big data and how they can effectively implement it is by analyzing those who have used big data successfully for years. Amazon.com is one of those companies.

There’s no doubting the data expertise of Amazon.com. One of the key innovators in big data technology, the global giant has given us lesson after lesson on how to successfully gather, analyze and then implement data analytics. Not only has it effectively used big data for its own purposes, but with services like Amazon Elastic MapReduce, the company has successfully leveraged its own data use to help others.

Amazon is full of lessons on how to successfully use big data. Here are a few:

Focus on the Customer


One of Amazon’s premier uses of big data has been customer recommendations. If you have an Amazon account and you take a quick look at your Amazon home page, you’ll notice as you scroll down there are recommendations based on your browsing history, additional recommendations, and sale items based on what you’ve bought and searched for in the past. While this type of thing occurs frequently today, Amazon was one of the first companies to do it.

Amazon has put a focus on using its big data to give its customers a personalized and focused buying experience. Interestingly, by giving customers this personalized experience, the customer tends to buy more than they would otherwise. It’s a simple solution for many problems.
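
As an illustration only (this is not Amazon’s actual system), the core idea behind such recommendations, item-to-item collaborative filtering, can be sketched in a few lines of Python: count how often items appear together in customers’ histories, then suggest the closest neighbours of whatever a customer has browsed or bought. The data below is invented.

    # Toy item-to-item recommendation sketch (invented data, not Amazon's system).
    from collections import defaultdict
    from itertools import combinations
    from math import sqrt

    # Hypothetical browsing/purchase histories, one set of items per customer.
    histories = [
        {"laptop", "mouse", "usb_hub"},
        {"laptop", "mouse", "keyboard"},
        {"kettle", "teapot", "mug"},
        {"laptop", "keyboard", "usb_hub"},
    ]

    # Count how often each item, and each pair of items, appears in a history.
    pair_counts = defaultdict(int)
    item_counts = defaultdict(int)
    for basket in histories:
        for item in basket:
            item_counts[item] += 1
        for a, b in combinations(sorted(basket), 2):
            pair_counts[(a, b)] += 1

    def similarity(a, b):
        # Cosine similarity over co-occurrence counts.
        pair = tuple(sorted((a, b)))
        return pair_counts[pair] / sqrt(item_counts[a] * item_counts[b])

    def recommend(history, top_n=3):
        scores = defaultdict(float)
        for seen in history:
            for candidate in item_counts:
                if candidate not in history:
                    scores[candidate] += similarity(seen, candidate)
        return sorted(scores, key=scores.get, reverse=True)[:top_n]

    print(recommend({"laptop", "mouse"}))  # items most often seen alongside the history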

For companies implementing big data, a key focus needs to be the consumer. If companies want to succeed in big data or at all, the consumer has to come first. The more satisfied they are, the better off you’ll be.

Be Focused


It’s impossible to know all Amazon’s uses of big data. Still, though, another lesson we can learn from the online retailing giant is to have an extreme focus on big data gathering and use.

Amazon gathers extremely large amounts of data each second, let alone each day and year. At that rate it would be easy to lose focus on what data is being gathered, why it’s being gathered, and how exactly it can help the customer. But Amazon doesn’t let that happen. It’s very strategic both in gathering data and in implementing changes and upgrades because of that data.

Too many companies let big data overwhelm them. They don’t have a clear focus when they begin, and they never end up obtaining one. The data they’ve gathered goes to waste and they completely miss the opportunity and potential.

Big Data Works


Amazon is one of the most successful companies today. It’s a true global empire. Consumers shop on Amazon for everything. They are leaders in the e-book, e-reader and tablet industries, and they’ve recently made a foray into TV boxes and phones.

Behind all this success is a rock-solid determination to gather data and use it efficiently. It’s gone where other companies were afraid to go and it’s achieved success other companies wish they had.

Among many contributing factors, Amazon has leveraged its big data expertise in extremely innovative and effective ways. It’s taken big data to the next level. It has shown — time and again — that big data works. It’s shown that if companies want to take their operations and success to the next level then big data is a key component.

Make Big Data Work for You


Amazon is a great example of big data use. It’s not necessarily about the size of the company or the size of the data that’s most important. As Amazon has illustrated, it’s about tailoring to the customer’s needs, being focused, having a plan, and actually using the technology. Big data works for Amazon; now make it work for you.

Category: Business Intelligence

by: Bsomich
16  Aug  2014

Community Update.

Missed what’s been happening in the MIKE2.0 community? Check out our bi-weekly update:

 


Big Data Everywhere: How are you managing it? 

Our Big Data Solution Offering provides an approach for storing, managing and accessing data of very high volumes, variety or complexity.

Storing large volumes of data from a large variety of data sources in traditional relational data stores is cost-prohibitive, and regular data modeling approaches and statistical tools cannot handle data structures with such high complexity. This solution offering discusses new types of data management systems based on NoSQL database management systems and MapReduce as the typical programming model and access method.
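
For readers new to the programming model mentioned above, here is a minimal, purely illustrative sketch of MapReduce in Python: a map step emits key/value pairs, a shuffle groups the values by key, and a reduce step aggregates each group. Real jobs would run on a distributed framework such as Hadoop rather than in memory.

    # Minimal in-memory illustration of the MapReduce programming model
    # (a word count; real workloads would run on a distributed framework).
    from collections import defaultdict

    documents = [
        "big data needs new data stores",
        "mapreduce is the typical access method",
    ]

    def map_phase(doc):
        # Map: emit a (key, value) pair for every word.
        for word in doc.split():
            yield word, 1

    def reduce_phase(word, counts):
        # Reduce: aggregate all values that share a key.
        return word, sum(counts)

    # Shuffle: group every emitted value by its key.
    grouped = defaultdict(list)
    for doc in documents:
        for word, count in map_phase(doc):
            grouped[word].append(count)

    results = dict(reduce_phase(word, counts) for word, counts in grouped.items())
    print(results)  # e.g. {'big': 1, 'data': 2, ...}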

Read our Executive Summary for an overview of this solution.

We hope you find this new offering of benefit and welcome any suggestions you may have to improve it.

Sincerely,

MIKE2.0 Community

Popular Content

Did you know that the following wiki articles are most popular on Google? Check them out, and feel free to edit or expand them!

What is MIKE2.0?
Deliverable Templates
The 5 Phases of MIKE2.0
Overall Task List
Business Assessment Blueprint
SAFE Architecture
Information Governance Solution

Contribute to MIKE:

Start a new article, help with articles under construction or look for other ways to contribute.

Update your personal profile to advertise yourself to the community and interact with other members.

Useful Links:
Home Page
Login
Content Model
FAQs
MIKE2.0 Governance


 

This Week’s Blogs for Thought:

How Big Data Can Help the Sales Team

With technology playing such a prominent role in businesses today, people in all fields are being impacted in new, exciting ways. Perhaps one field that is dealing with some particularly big challenges is sales. For decades, sales teams have operated under certain strategies that have proven effective, but with the rise of the internet and social media, the balance of power between salesperson and customer has subtly yet noticeably shifted.

Read more.

What Movie Ratings Teach Us About Data Quality

In previous posts on this blog, I have discussed aspects of data quality using travel reviews, the baseball strike zone, Christmas songs, the wind chill factor, and the bell curve.
Grab a big popcorn and a giant soda since this post discusses data quality using movie ratings.

Note: The Following Post has been Approved for All Audiences

Read more.
A Look at Cyber Security Trends for 2014

We’re now more than halfway through 2014, and as with any year, the world of technology has been rapidly progressing and evolving. This year, there’s been more discussion than ever about numerous topics such as the benefits of big data, the Internet of Things, mobile technology, and how to make the most of cloud computing. There’s plenty of excitement to be had so far and much more on the way, but in the fast moving technological environment we now live in, there’s also reason to worry. Security in particular, whether it’s network security, computer security, or IT security, is foremost on many business leaders’ minds. To prepare for what the future may hold, it’s important to look back at some of the recent trends to see the threats and solutions having the biggest impact on cyber security.

Read more.

Forward this message to a friend.

Questions?

If you have any questions, please email us at mike2@openmethodology.org.

 

Category: Information Development

by: RickDelgado
14  Aug  2014

A Look at Cyber Security Trends for 2014

We’re now more than halfway through 2014, and as with any year, the world of technology has been rapidly progressing and evolving. This year, there’s been more discussion than ever about numerous topics such as the benefits of big data, the Internet of Things, mobile technology, and how to make the most of cloud computing. There’s plenty of excitement to be had so far and much more on the way, but in the fast moving technological environment we now live in, there’s also reason to worry. Security in particular, whether it’s network security, computer security, or IT security, is foremost on many business leaders’ minds. To prepare for what the future may hold, it’s important to look back at some of the recent trends to see the threats and solutions having the biggest impact on cyber security.

 

Securing Internet Connections

 

Perhaps one of the biggest movements to happen in recent months is the expansion of devices now connected to the internet. While this can be seen through the adoption of smartphones and tablets all over the world, it also applies to other everyday objects that now find themselves with web access. That expansion is only expected to increase over time, with the number of internet-connected objects predicted to explode from 10 billion today to 50 billion by 2020. Many are using the term the Internet of Things to explain the phenomenon, and while it opens up innovative new options for making life easier and more connected, it does lead to a greater attack surface for attackers to take advantage of. That’s why companies are looking to make the Internet of Things more secure, but not by simply expanding traditional security procedures, which would prove ineffective. One method aims to reduce the attack surface, limiting the possibilities of an infiltration. The method includes using some basic defensive measures such as frequent software patching, advanced user identity and network management, and the elimination of infrastructure dark space. These strategies can end up reducing the attack surface by as much as 70%.

 

Cloud Security

 

In the past few years, businesses have begun truly utilizing cloud computing in new ways. Now more than ever, cloud providers are offering new services that can help companies be more efficient and productive. But as businesses move to the cloud, so do attackers. The reason for this is that with the move to the cloud, businesses will often send their corporate data there as well. Cloud security is very much a work in progress, and attackers have been eager to infiltrate the cloud to steal not just business data but people’s personal data as well. Attackers may in fact hold sensitive data for ransom, sort of like blackmail, in order to extract value of their own from it. Cloud vendors will need to provide stronger password capabilities and reinforced cloud data access policies to ensure this doesn’t happen.

 

Increased Mobile Malware

 

Nearly everyone has a smartphone these days, and this fact has not gone unnoticed by attackers. While smartphones are certainly convenient, they are also frighteningly vulnerable. One study shows 80% of smartphones have no malware protection at all, which makes them a prime target for cybercriminals looking to gain access to them. The amount of malware aimed at iPhones and Android devices is growing exponentially, as is the number of devices that have been infected. Of particular concern is the increase in Android malware, but whatever device you use, securing mobile technology will take time. Improvements are already being made, but it will take time before they become a common feature on smartphones.

 

(Tweet This: One study shows 80% of smartphones have no malware protection. #mobile #security)

 

Third Party Security

 

Cyber security is also being made a much more important priority for third party organizations. You’ve likely heard of the massive security breach that hit Target, costing the mega-corporation tens of millions of dollars, not to mention compromising sensitive information for millions of customers. The attackers were able to gain access to Target’s systems by infiltrating a third party organization, which already had access to the Target network. Breaching the third party made access to the larger internal system much easier. With this damaging breach, companies are now working harder than ever to secure their supply chains, with more emphasis being placed on increasing security for third parties. The process to do this won’t be easy, but as seen in Target’s case, the alternative is simply too costly.

 

Security will never be perfect. Businesses will have to be constantly vigilant as they search for attackers intending to inflict harm and steal data. While no security measure can deal with all present and future threats flawlessly, companies are working hard to make sure cyber security is ready to meet these challenges. As security improves, businesses and individuals can rest a little easier knowing their information is protected.

 

Category: Business Intelligence

by: Ocdqblog
12  Aug  2014

What Movie Ratings teach us about Data Quality

In previous posts on this blog, I have discussed aspects of data quality using travel reviews, the baseball strike zone, Christmas songs, the wind chill factor, and the bell curve.

Grab a big popcorn and a giant soda since this post discusses data quality using movie ratings.

The Following Post has been Approved for All Audiences

In the United States prior to 1922, every state and many cities had censorship boards that prevented movies from being shown in local theaters on the basis of “immoral” content. What exactly was immoral varied by censorship board.

In 1922, the Motion Picture Association of America (MPAA), representing all the major American movie studios, was formed in part to encourage the movie industry to censor itself. Will Hays, the first MPAA president, helped develop what came to be called the Hays Code, which used a list of criteria to rate movies as either moral or immoral.

For three decades, if a movie failed the Hays Code most movie theaters around the country would not show it. After World War II ended, however, views on movie morality began to change. Frank Sinatra received an Oscar nomination for his role as a heroin addict in the 1955 drama The Man with the Golden Arm. Jack Lemmon received an Oscar nomination for his role as a cross-dressing musician in the 1959 comedy Some Like It Hot. Both movies failed the Hays Code, but were booked by movie theaters based on good reviews and became big box office hits.

Then in 1968, a landmark Supreme Court decision (Ginsberg v. New York) ruled that states could “adjust the definition of obscenity as applied to minors.” Fearing the revival of local censorship boards, the MPAA created a new rating system intended to help parents protect their children from obscene material. Even though the ratings carried no legal authority, parents were recommended to use it as a guide in deciding what movies their children should see.

While a few changes have occurred over the years (most notably adding PG-13 in 1984), these are the same movie ratings we know today: G (General Audiences, All Ages Admitted), PG (Parental Guidance Suggested, Some Material may not be Suitable for Children), PG-13 (Parents Strongly Cautioned, Some Material may be Inappropriate for Children under 13), R (Restricted, Under 17 requires Accompanying Parent or Adult Guardian), and NC-17 (Adults Only, No One 17 and Under Allowed). For more on these ratings and how they are assigned, read this article by Dave Roos.

What Movie Ratings teach us about Data Quality

Just like the MPAA learned with the failure of the Hays Code to rate movies as either moral or immoral, data quality cannot simply be rated as good or bad. Perspectives about the quality standards for data, like the moral standards for movies, change over time. For example, consider how big data challenges traditional data quality standards.

Furthermore, good and bad, like moral and immoral, are ambiguous. A specific context is required to help interpret any rating. For the MPAA, the specific context became rating movies based on obscenity from the perspective of parents. For data quality, the specific context is based on fitness for the purpose of use from the perspective of business users.

Adding context, however, does not guarantee everyone will agree with the rating. Debates rage over the rating given to a particular movie. Likewise, many within your organization will disagree with the quality assessment of certain data. So the next time your organization calls a meeting to discuss its data quality standards, you might want to grab a big popcorn and a giant soda since the discussion could be as long and dramatic as a Peter Jackson trilogy.

 

Category: Data Quality

by: RickDelgado
07  Aug  2014

How Big Data Can Help the Sales Team

With technology playing such a prominent role in businesses today, people in all fields are being impacted in new, exciting ways. Perhaps one field that is dealing with some particularly big challenges is sales. For decades, sales teams have operated under certain strategies that have proven effective, but with the rise of the internet and social media, the balance of power between salesperson and customer has subtly yet noticeably shifted. Not only is there more information available to customers, but sales teams now have access to unprecedented amounts of data. In fact, in one survey more than 80% of salespeople said they feel challenged by how much data is out there and the amount of time it takes to research what they need. With these challenges out there, sales teams need help in establishing relationships with more promising customers and getting more sales opportunities. Big data may in fact be the key to reaching these goals, while more answers may lie in unexpected places.

 

(Tweet This: One survey says 82% of salespeople are challenged by the amount of #bigdata out there. #sales)

 

The way salespeople interact with prospective customers has changed drastically in the past few years. Traditionally, customers would contact a business while in search of a product or service and be put in touch with a sales representative, who would in turn have all the information and answers for the customer. That’s rarely how things work anymore. Customers are now able to get all the information they need through internet sites, social media, and other networks that weren’t available in previous decades. That means they are aware of competitors’ information, company or product weaknesses, and what close friends and associates are saying. That puts salespeople at a disadvantage, but big data can help with this problem. In this case, the answers can come from the marketing team. For years, marketers have been collecting data on customers, discovering their interests, passions, and values, all while gathering what insights they can on what motivates them. Relaying this data to sales teams can help salespeople better understand potential customers, getting to know them and being able to respond to their concerns even if customers don’t voice them.

 

Big data can also help sales teams identify the prospects that are the most promising. This can be done through predictive analytics–applying big data to better predict which customers are most interested and will respond more positively to a sales pitch. Big data allows businesses to analyze each account they have and correctly pick the right time and method for dealing with them. This strategy has the potential to lead to some impressive results. In one case, business outsourcing company ADP’s sales team used big data tools for a year. The result was 52% more sales opportunities for the company along with an increase in sales productivity of 29%.
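
To make the idea of predictive lead scoring concrete, here is a minimal sketch using scikit-learn’s logistic regression. The features, data and lead names are invented for illustration; a real deployment would draw them from the CRM platform and validate the model properly.

    # Toy predictive lead-scoring sketch (invented features and data).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical features per lead: [web visits, emails opened, days since last contact]
    X = np.array([
        [12, 5, 2],
        [1, 0, 40],
        [8, 3, 5],
        [0, 1, 60],
        [15, 7, 1],
        [2, 0, 30],
    ])
    # 1 = the lead eventually converted, 0 = it did not.
    y = np.array([1, 0, 1, 0, 1, 0])

    model = LogisticRegression().fit(X, y)

    # Score new leads by probability of conversion, highest first.
    new_leads = np.array([[10, 4, 3], [1, 1, 45]])
    scores = model.predict_proba(new_leads)[:, 1]
    for score, name in sorted(zip(scores, ["lead_a", "lead_b"]), reverse=True):
        print(f"{name}: {score:.2f}")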

 

The results from using big data have many companies rushing to implement their own big data strategies. Luckily, there are plenty of big data sales apps and tools to choose from. Most of them revolve around integrating with a customer relationship management (CRM) platform designed specifically for sales divisions. These sales apps have plenty of advantages, including capabilities in predictive lead prioritization, predictive lead scoring, and other areas. As more companies utilize advances in flash array storage technology, the tools will become more available and cost-effective. These apps and tools, when utilizing big data properly, can help develop and foster sales relationships and identify changes in a buyer’s behavior, helping sales teams respond in real time to these changes along with fluctuations in the sales cycle. Only through big data can these apps parse the data and recognize the patterns to make these accurate predictions.

 

Another helpful tool that big data provides is the ability to evaluate members of the sales team. In much the same way big data can determine the most promising customers and how to approach them, the performance of a sales representative can also be analyzed to better see who is meeting expectations, who is doing better, and who is not meeting the established standards. By identifying where salespeople need improvement, managers can create more personalized training programs, effectively getting rid of a blanket approach that isn’t the best way to handle an entire sales team. This contextual coaching can even be evaluated based on performances in real time.

 

There is still much progress to be made for sales teams looking to make use of big data. The potential is there, and those that have taken advantage of it are already seeing great results. With a more effective sales team, businesses will prosper and be more productive, gaining new customers all the time while keeping the customers they already have.

 

Category: Business Intelligence, Enterprise Data Management

by: Bsomich
02  Aug  2014

Community Update.

Missed what’s been happening in the MIKE2.0 community? Check out our latest update:


Why MIKE2.0? The Need for Better Information Management

The era of ubiquitous technology has led to a new challenge – managing ubiquitous information. We now find that almost every organisation has a major challenge with managing their data:

  • Data quality is at the core of regulatory and customer pressures
  • Many system failures are largely data-driven, not technology-driven, and can only be mitigated by data rectification
  • Better business intelligence is at the core of almost every major business recommendation
  • Data Warehousing is now mainstream, not an afterthought, and is at the core of data convergence strategies
  • A successful approach to IT Transformation is data-dependent

The MIKE2.0 Methodology has been built to support our belief that information really is one of the most crucial assets of a business. We believe meaningful, cost-effective Business and Technology processes can only be achieved with a successful approach to Information Management that we call Information Development. As a starting point to understanding MIKE2.0 you may want to review the following:

We hope that you will choose to become a Contributor in this project and help make MIKE2.0 a truly collaborative and successful initiative.

Sincerely,

MIKE2.0 Community

New! Popular Content

Did you know that the following wiki articles are most popular on Google? Check them out, and feel free to edit or expand them!

What is MIKE2.0?
Deliverable Templates
The 5 Phases of MIKE2.0
Overall Task List
Business Assessment Blueprint
SAFE Architecture
Information Governance Solution

Contribute to MIKE:

Start a new article, help with articles under construction or look for other ways to contribute.

Update your personal profile to advertise yourself to the community and interact with other members.

Useful Links:
Home Page
Login
Content Model
FAQs
MIKE2.0 Governance


 

 

This Week’s Blogs for Thought:

Do You Know Good Information When You Hear It? 

When big data is discussed, we rightfully hear a lot of concerns about signal-to-noise ratio. Amidst the rapidly expanding galaxy of information surrounding us every day, most of what we hear sounds like meaningless background static. We are so overloaded but underwhelmed by information that perhaps we are becoming tone deaf to signal, leaving us hearing only noise.
Before big data burst onto the scene, bursting our eardrums with its unstructured cacophony, most of us believed we had data quality standards and therefore we knew good information when we heard it. Just as most of us believe we would know great music when we hear it.

Read more.

 

Your Insight Might Protect Your Job

Technology can make us lazy.  In the 1970s and 80s we worried that the calculator would rob kids of insight into the mathematics they were learning.  There has long been evidence that writing long-hand and reading from paper are far superior vehicles for absorbing knowledge than typing and reading from a screen.  Now we need to wonder whether that ultimate pinnacle of humanity’s knowledge, the internet, is actually a negative for businesses and government.

Read more.

How Machine Learning is Improving Computer Security

If there’s one thing that keeps business leaders awake at night, it’s worries over data security. Nowadays, every company, no matter the size, uses technology in its operations, whether it’s using cloud systems for email, massive server rooms for handling online transactions, or simply allowing employees to access company information on their smartphones. One misstep could end up leading to data loss or even data theft, which could end up costing the company some big money. Even mega-corporations like Target aren’t immune to this unfortunate trend. Businesses are looking for ways to make their information more secure, and to do that, many security systems are turning to big data, or more specifically to machine learning, as a way to prevent and combat threats.

Read more.
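
As a purely illustrative aside (and not any particular vendor’s product), one common machine learning approach to this problem is anomaly detection: train a model on normal activity and flag sessions that look unusual. A minimal sketch with scikit-learn’s IsolationForest, using invented session features, follows.

    # Toy anomaly-detection sketch (invented session features).
    import numpy as np
    from sklearn.ensemble import IsolationForest

    # Hypothetical per-session features: [login hour, MB transferred, failed logins]
    normal_sessions = np.array([
        [9, 12, 0], [10, 8, 0], [11, 15, 1], [14, 20, 0], [16, 10, 0],
    ])
    model = IsolationForest(random_state=0).fit(normal_sessions)

    # Predict on new sessions: -1 flags a likely anomaly, 1 looks normal.
    new_sessions = np.array([[10, 14, 0], [3, 900, 7]])
    print(model.predict(new_sessions))  # e.g. [ 1 -1]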

Forward to a Friend!

Know someone who might be interested in joining the MIKE2.0 Community? Forward this message to a friend.

Questions?

If you have any questions, please email us at mike2@openmethodology.org.

 

 

 

Category: Information Development

by: Ocdqblog
29  Jul  2014

Do you know good information when you hear it?

When big data is discussed, we rightfully hear a lot of concerns about signal-to-noise ratio. Amidst the rapidly expanding galaxy of information surrounding us every day, most of what we hear sounds like meaningless background static. We are so overloaded but underwhelmed by information that perhaps we are becoming tone deaf to signal, leaving us hearing only noise.

Before big data burst onto the scene, bursting our eardrums with its unstructured cacophony, most of us believed we had data quality standards and therefore we knew good information when we heard it. Just as most of us believe we would know great music when we hear it.

In 2007, Joshua Bell, a world-class violinist and recipient of that year’s Avery Fisher Prize, an award given to American musicians for outstanding achievement in classical music, participated in an experiment for The Washington Post columnist Gene Weingarten, whose article about the experiment won the 2008 Pulitzer Prize for feature writing.

During the experiment, Bell posed as a street performer, dressed in jeans and a baseball cap, and played violin for tips in a Washington, D.C. metro station during one Friday morning rush hour. For 45 minutes he performed 6 classical music masterpieces, some of the most elegant music ever written, on one of the most valuable violins ever made.

Weingarten described it as “an experiment in context, perception, and priorities—as well as an unblinking assessment of public taste: In a banal setting at an inconvenient time, would beauty transcend? Each passerby had a quick choice to make, one familiar to commuters in any urban area where the occasional street performer is part of the cityscape: Do you stop and listen? Do you hurry past with a blend of guilt and irritation, aware of your cupidity but annoyed by the unbidden demand on your time and your wallet? Do you throw in a buck, just to be polite? Does your decision change if he’s really bad? What if he’s really good? Do you have time for beauty?”

As Bell performed, over 1,000 commuters passed by him. Most barely noticed him at all. Very few stopped briefly to listen to him play. Even fewer were impressed enough to toss a little money into his violin case. Although three days earlier he had performed the same set in front of a sold out crowd at Boston Symphony Hall where ticket prices for good seats started at $100 each, at the metro station Bell earned only $32.17 in tips.

“If a great musician plays great music but no one hears,” Weingarten pondered, “was he really any good?”

That metro station during rush hour is an apt metaphor for the daily experiment in context, perception, and priorities facing information development in the big data era. In a fast-paced business world overcrowded with fellow travelers and abuzz with so much background noise, if great information is playing but no one hears, is it really any good?

Do you know good information when you hear it? How do you hear it amidst the noise surrounding it?

 

Category: Data Quality, Information Development
