Posts Tagged ‘Apple’
“One man’s data is another man’s metadata”
As I pen these words, the PRISM scandal continues to unfold. The questions raised by the National Security Agency’s formerly furtive program strike at the very heart of a free society.
The fallout will continue for months, if not years. Maybe it will spark a deeper conversation about data ownership. Perhaps more people will echo the words of Jim Harris, who wrote recently on this site:
The Metadata Cop-Out?
I for one noticed something interesting buried in many of the non-denial denials, the carefully scripted and lawyer-approved statements from Microsoft, Apple, Yahoo!, Facebook, and others. Many press releases claimed (truthfully, for all I know) that these companies didn’t provide data per se to the NSA. Rather, they provided metadata. In other words, Yahoo! didn’t give up the actual contents of any single email, just things like:
- the sender’s email address
- the receiver’s email address
- the subject of the email
- the time and date that the email was sent
So, what is this distinction between data and metadata? And does it ultimately matter?
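To make the distinction concrete, here is a minimal Python sketch (standard library only; the addresses, subject, and date are invented for illustration) that extracts exactly those header fields without ever reading the message body:

```python
from email import message_from_string

# A hypothetical raw email: headers, a blank line, then the body.
raw = """From: alice@example.com
To: bob@example.com
Subject: Lunch on Friday?
Date: Mon, 10 Jun 2013 09:15:00 -0400

Let's meet at noon."""

msg = message_from_string(raw)

# The metadata alone -- the body is never touched.
metadata = {
    "sender": msg["From"],
    "receiver": msg["To"],
    "subject": msg["Subject"],
    "sent": msg["Date"],
}

for field, value in metadata.items():
    print(f"{field}: {value}")
```

Even this toy example shows why the distinction is shaky: the sender, receiver, subject, and timestamp together reveal a great deal about a conversation without a single word of its contents.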
I discussed this very subject recently with my friend Melinda Thielbar, a real-life statistician and data scientist. She agreed with me that the distinction is becoming “essentially meaningless.” Equipped with enough of (the right) metadata, one can more or less figure out what’s going on–or at least identify potentially suspicious communications among persons of interest.
The quote at the beginning of this post is as true as it’s ever been. In a world of Big Data, metadata is increasingly important. It’s not just the video, picture, blog post, email, or customer record that matters. The data about or “behind” the data can be just as critical.
Is your organization paying attention to its metadata?
Half the money I spend on advertising is wasted; the trouble is I don’t know which half.
Executive turnover has always fascinated me, especially of late. HP’s CEO Leo Apotheker had a very short run and Yahoo! has been a veritable merry-go-round over the last five years. Beyond the CEO level, though, many executive tenures resemble those of Spinal Tap drummers. For instance, CMOs have notoriously short lifespans. While the average tenure of a CMO has increased from 23.6 months to 43 months since 2004, it’s still not really a long-term position. And I wonder if Big Data can change that.
In a recent article for Chief Marketer, Wilson Raj, the global customer intelligence director at SAS, writes about the potential impact of Big Data on CMOs. From the piece:
CMOs today are better poised than ever not only to retain their roles, but to deliver broad, sustainable business impact. CMOs who capitalize on big data will reap big rewards, both personally and professionally. Bottom line: Businesses that exploit big data outperform their competition.
Necessary vs. Sufficient Conditions
The potential of Big Data is massive. To realize it to an optimal level, however, organizations need to effectively integrate transactional and analytical data and systems. Lamentably, many organizations are nowhere close to being able to do this. That is, for every Quantcast, Amazon, Target, and Wal-Mart, I suspect that dozens or even hundreds of organizations continue to struggle with what should be fairly standard blocking and tackling. Data silos continue to plague many if not most mature organizations.
Utilizing Big Data in any meaningful way involves a great deal more than merely understanding its importance. Big Data requires deploying new solutions like NoSQL databases, Hadoop, Cassandra, and others. Only then will CMOs be able to determine the true ROI of their marketing efforts. That is, accessing and analyzing enterprise and external (read: social) information guarantees nothing. A CMO will not necessarily be able to move the needle just because s/he has superior data. (Microsoft may have all of the data in the world, but so what? Bing hasn’t made too many inroads in the search business and Surface isn’t displacing the iPad anytime soon.)
Think of access to information as a necessary but insufficient condition to ensure success. As I look five and ten years out, I see fewer and fewer CMOs being able to survive on hunches and standard campaigns. The world is just moving too fast and what worked six months ago may very well not work today.
Some believe that Big Data represents the future of marketing. I for one believe that Big Data and related analytics can equip organizations with extremely valuable and previously unavailable information. And, with that information, they will make better decisions. Finally, marketers will be able to see what’s really going on with their campaigns. Perhaps problems like the one mentioned at the beginning of this post can finally be solved.
What say you?
At least to me, Big Data often seems like a bit of an amorphous term. Just what exactly is it, anyway?
Consider the following statistics from the Gang of Four:
- Amazon: Excluding books, the company sells 160 million products on its website; Target sells merely 500,000. Amazon is reported to have credit cards on file for 300 million customers. 300 million. For more Amazon stats, click here.
- Apple: The company a few months ago passed 25 billion app downloads.
- Facebook: 954 million registered users share more than one billion pieces of content every day.
- Google: As of two years ago, Google handled 34,000 searches per second.
These numbers are nothing less than mind-blowing. While Facebook’s rate of growth seems to be waning, make no mistake: it’s still growing. (A slowing rate of growth shouldn’t be confused with decline.)
While we’re at it, let’s look at some of Twitter’s numbers. On its five-year anniversary, the company posted the following:
- 3 years, 2 months and 1 day. The time it took from the first tweet to the billionth tweet.
- 1 week. The time it now takes for users to send a billion tweets.
- 50 million. The average number of tweets people sent per day, one year ago.
- 140 million. The average number of tweets people sent per day, in the last month.
- Oddly, 80 percent of all tweets involve Charlie Sheen.
OK, I’m making the last one up, but you get my drift.
A few things strike me about these numbers. First, this is a staggering amount of data. Second, all of this data is kept somewhere. To varying extents, these companies and others are turning data into information and, ultimately, knowledge.
What they do with that knowledge varies, but no one can doubt the potential of so much data–even if much of it is noise. Another issue: will people continue to use ad-supported platforms? Will we become sick of having our data sold to the highest bidder? Or, will private, ad-free platforms like app.net flourish?
Even if the latter is true, those private platforms will still be generating data. So, in a way, the explosion of data does not hinge upon the continued growth of open or “somewhat-open” platforms.
If you think that consumers are going to be generating and using less data in the upcoming years, you’re living in an alternate reality. Take steps now to ensure that your organization has the software, hardware, and human capabilities to handle vastly increasing amounts of data.
What say you?
In Part V of this series, I provided an example of an organization in dire need of some Applefication. In this concluding part, I look at what could derail Apple’s charge in the enterprise.
Challenges and the Future
For a wide variety of reasons, not every enterprise is ready to embrace Applefication. Beyond the cost of buying and deploying new technologies, significant issues still need to be addressed. As I look toward the future, consider a few things that may hinder Apple’s growth in the enterprise.
First, Apple may well have nowhere to go but down. What’s more, the company might have a hard time meeting demand for its products, especially when natural disasters take place (read: Thailand). Beyond that, in a sense nothing has changed: organizations that insist upon superfluous complexity will certainly have it. Buying iPads doesn’t fix broken companies.
More generally, as others have pointed out, culture eats strategy for lunch. It isn’t easy to convince old-school IT employees and departments that simple is better—when their jobs depend upon complex. And not every organization has the budget for pricier (if superior) iProducts. Sometimes, good enough is exactly that, particularly in cash-strapped and low-margin industries.
And let’s not forget forthcoming product introductions from companies like Microsoft. Now that Apple has led the way with tablets and smart phones, expect forthcoming improvements from more traditional enterprise vendors.
Platforms Guarantee Nothing
Perhaps the most significant challenge to Apple is Apple itself–specifically, the tendency for successful companies to do the following:
- become complacent
- ignore The Innovator’s Dilemma
- misunderstand their ecosystems
The first two have been well studied and documented. With respect to the third, Ron Adner recently wrote an excellent piece in HBR. To me, the key passage is this:
In the rush to match the pieces, most of Apple’s rivals have missed the critical connections that draw the entire ecosystem together into a coherent whole.
What if Apple loses touch with its ecosystem? Likely? No. Possible? You’d better believe it. Look at RIM. Some claim that RIM has really done nothing wrong; it has merely been surpassed. Today, its smartphone market share has dropped to 12 percent (although it’s probably much higher within enterprises). More alarming, that number may plummet further.
Even if RIM develops sleek new products, the company’s apps are anything but cutting edge. Ecosystems are arguably just as important as the products they support.
As discussed in this series, Apple is firing on all cylinders these days. It clearly understands the power of its platform and ecosystem, the consumerization of IT, and the criticality of an optimal end-user experience.
The platform business model fundamentally differs from other, more internally based business models. A company can do everything “right” (read: strategy and execution) and still fall from grace because its ecosystem changes. To guarantee Apple’s continued success in the enterprise would be the acme of foolishness. In at least the short and mid terms, however, expect more large organizations to go Apple.
In Part IV of this series, I provided an example of an organization in dire need of some Applefication. In this part, I look at Apple’s ecosystem.
Think about what has happened over the last five years in the technology world. In a word, the developments have been amazing. Trends and events that we are only beginning to comprehend include:
- The consumerization of IT has changed the game. Period.
- iPhones are replacing Blackberries, even in many conservative organizations.
- Apple stores have redefined retail, bringing unprecedented levels of energy to malls.
- iTunes’ one-click purchasing saved the music industry–and iBooks is on track to do the same thing with college textbooks.
- 3-year olds use iPads.
- iPads are replacing traditional laptops for many employees.
- Private app stores are emerging.
We’ve seen the start of an important emerging trend: the Applefication of the enterprise. And if you think that Applefication is confined to knowledge and white-collar workers, think again. Even blue-collar workers are increasingly using iPads and iPhones in the workplace.
How Apple is Conquering the Enterprise
While more expensive than their alternatives, Apple products are worth a premium in the eyes of many consumers. Credit their ease of use, popularity, and elegant design. And it is this very popularity that should ensure the continued development and support of new and existing apps. Translation: Apple’s ecosystem is stronger than ever, something hardly lost on technology decision makers in large organizations.
This is critical. Imagine the horror of CIOs that bought HP TouchPads en masse in July of 2011, only to find out weeks later that HP was effectively killing the device.
Apple’s penetration of the enterprise stems from many factors. Exhibit A: Its ecosystem. The strength of Apple’s ecosystem means that enterprise apps will continue to be developed for its products–and probably at an increasing rate. Force.com and Jive software are but two examples.
Apple’s ecosystem includes–and, in fact, may center upon–the rapid deployment of apps. While apps don’t really work for complex ERP and CRM applications (yet), the App Store model is clearly a superior one. Launching apps requires far less IT involvement and cost relative to traditional deployments. While initially proven in the consumer space, companies like Genentech are adapting it to the enterprise world.
And the model just makes sense, especially among talented, in-demand employees–many of whom have left jobs because they were forced to use deficient technologies.
Finally, while not a major factor, Steve Jobs’ death shed light on his genius. Today, it’s just plain hip to be associated with Apple.
As brilliantly as Apple has executed, that alone doesn’t explain the whole story. No, we have to look outward. Apple can credit a number of other external factors for its increasing enterprise penetration, including:
- End user and IT frustration with existing applications and infrastructure.
- Too many chiefs. Many IT departments are fed up with attempting to navigate complex EULAs, OEM agreements, and support issues among an array of vendors such as Microsoft and PC manufacturers like Lenovo.
- Disappointment with ROI on past IT projects.
- A new breed of CIOs and IT heads. These folks are less conservative and more open to new ways of doing things.
- Microsoft has fumbled the ball a few times. Vista bombed (look at its adoption rate in big companies) and the company failed to embrace cloud computing early on.
- HP hemmed and hawed on its PC business.
Of course, with respect to tablets, until recently the iPad faced no legitimate alternative. While that has changed with the success of Amazon’s Kindle Fire, the iPad is clearly a superior—if more expensive—device.
In the next part of the series, I’ll take a look at the future.
In Part III of this series, I looked at how increasingly self-reliant employees are no longer tolerating many behind-the-times IT departments. In this part, I provide a more in-depth example of this type of organization.
It’s May of 2011 and I’m psyched. I have finally started working with my new client, a large healthcare organization based in the midwestern United States. Let’s call it Joel Hospital (a pseudonym). Joel has a need for a consultant with my particular skills and, for reasons unknown to me, its previous two consultants did not pan out.
Joel decision makers and I have danced with each other for a few drawn-out months now, exchanging emails, discussing the cost and scope of the project, finalizing the contract, and getting to know each other. I have filled out a swath of non-disclosure forms promising not to do anything with the confidential employee data to which I will have access.
The process is beyond laborious and I bite my tongue a few times because I know how slowly many healthcare organizations move. No one would ever mistake Joel Hospital for Facebook. I hope that my “difficult client radar” is off, although I am starting to understand why I am the third consultant Joel has called in.
I live in New Jersey at the time and, in order to minimize travel costs, Joel and I agree that much of the work can be performed remotely. A few weeks after signing the contract, I finally receive my log-in credentials via email and open the unusually large attachment. It clocks in at nearly 5 megabytes and is 15 pages in length. I shake my head and read the document, marveling at both its complexity and the number of things that I need to do to gain remote access to a single Joel desktop. In no particular order, I need to enter my user name, password, passcode (different from a password), dynamic four-digit personal identification number (PIN), and the answer to a secret question.
I pick up the phone and call the woman who sent me the document (call her Betty here). I ask if this is for real. I’ve logged in to dozens of external networks in my professional career and, by an order of magnitude, this is the most difficult one I’ve ever encountered. The process is even more cumbersome than those of my government clients. Betty doesn’t understand why I’m complaining because, she proudly admits, she wrote the document in question. I try to tread carefully, but I can tell that my mere call and questions annoy Betty.
Did I mention that I have to clear each of these hurdles just to get to the remote desktop? I’m miffed because that desktop is just the starting point for my work, the tip of what appears to be an insurmountable iceberg. For the five or so different front- and back-end applications that I’ll need to access to complete this project, I’ll need separate user names, passwords, and Lord knows what else for each one.
Betty stands firm. This is a simple process in her eyes. (I doubt that she has seen any other companies’ remote access policies.) I know that I’m not going to win this battle. So I labor on, following every instruction to the letter, hoping to see a pot of gold at the end of the rainbow. Instead, I hear a thud and see the daunting blue screen of death (BSOD) on my computer. I reboot it and again painstakingly follow the directions.
This is not good. I am not about to repeat the process. I know Einstein’s definition of insanity.
I call my friend John who is a certified computer wizard and long-time consultant. I explain my situation and he calmly tells me that the Joel’s network adapters are outdated. This has happened to him, and he assures me that it’s not my fault.
Relieved, I call my contacts at Joel and tell them about the network adapters. It’s a problem, but at least I think that I have found its solution. They are not impressed. They think that I’m being difficult and lazy.
We’re clearly not off to a great start, and the hard part should be doing the work required by this client, not logging into its network every day praying that my computer doesn’t crash. I make my case for a different way to connect to Joel’s network. It’s not 1999. Remotely connecting to a computer can be done in many simple, secure, and quick ways. I suggest a few of them and am told quite sternly by the head muckety-muck that “that’s not the way we do things around here.”
Suddenly, I want to find the previous consultants and take them out for a beer. I decide that it’s time to cut my losses. I resign from the project before it gets any worse. This can’t possibly end well and I value my sanity above all else. Joel personnel don’t argue with me.
In a nutshell, this little yarn represents what’s wrong with how far too many organizations use technology today. Times like these are why many employees want to go all Office Space on their computers. Think about it. I can use Google Maps to instantly view a street halfway across the globe, but I can’t log in to a remote desktop to do actual work.
What’s wrong with this picture?
There’s good news, though. It doesn’t have to be this way. Technology can enable productivity, cost savings, and exciting innovations. Organizations just have to break old patterns, listen to employees, and embrace simplicity. In short, they have to act more like Apple.
What say you?
In the next installment of this series, I’ll take it up a level and look at some broader technological and enterprise trends.
In Part II of this series, I looked at the consumerization of IT as well as the Steve Jobs effect. Both have driven the Applefication of the Enterprise. In Part III, I continue explaining this recent phenomenon.
Why Tomorrow is No Longer Good Enough
Years ago, employees would dutifully call IT help desks to report problems. Analysts would open tickets, document issues, attempt to find a resolution, and eventually close the case—often to the dissatisfaction of the employee.
The trend of increased employee self-reliance is unmistakable. As I write in The Age of the Platform: How Amazon, Apple, Facebook, and Google Have Redefined Business, employees have become much more tech-savvy. They have embraced self-service with open arms, often calling IT only when they have exhausted their own efforts. In a world of constant tweets, customized products, and hyper-accurate Google searches, it’s all about now. Many employees are used to doing their own troubleshooting—and the time required to explain a problem may exceed the time required to solve it. Tech-savvy employees refuse to wait for a call from IT. It’s no longer acceptable. They’ve become accustomed to doing it themselves.
At the same time, these tech-savvy, overworked, and impatient employees are less willing to tolerate inferior technology while on the clock. Thanks to the prevalence of smartphones, open source software, cloud computing, and social networks, there are more ways than ever for creative employees to fly under the radar and use technologies not sanctioned by understaffed corporate IT departments.
What’s more, the last ten years have given rise to the prosumer. The term has taken on multiple meanings—and not all of them are in sync. Duncan Riley defines the word as “a combination of producer and consumer that perfectly describe the millions of participants in the Web 2.0 revolution.” The prosumer is a professional–consumer hybrid, much more comfortable with surfing the web, finding creative solutions, creating content, and solving vexing problems.
This takes us back to the simplifying products and technologies of prevalent consumer companies like Apple and Google. Tech-savvy employees don’t want to wait for their own companies to get with the times and upgrade legacy apps. That is, they are anything but passive. They want to use the best stuff at work (read: Apple’s) because they already use these products at home. Employees can clamor for new toys and devices, but if the company doesn’t have the money to spend, those cries will go unheeded. Fortunately for Apple, IT budgets have stabilized after years of contracting, allowing organizations to purchase Apple’s more expensive wares. And this is just the tip of the iceberg, as evinced by the recent rise in Apple’s stock. We are seeing nothing less than the Applefication of the Enterprise.
What say you?
In the next installment of this series, I’ll be providing an example of one of the many organizations in dire need of Applefication.
Photo Credit: » Zitona «
In Part I of this series, I provided an introduction to this nascent trend.
Steve Jobs despised focus groups. He was famous for not listening to what customers wanted, but telling them what they wanted. Today, Applephiles are taking a cue from their iconic leader and are increasingly doing the same thing at work. Technology at work is often unnecessarily complicated and they know that there’s a better, simpler way. They’re mad as hell and they’re not gonna take it anymore.
The key question is not whether Apple products can be tweaked to meet the current and complex needs of large, sophisticated organizations and their IT departments. That’s so 1999. Rather, Apple products and evangelists are forcing executives to ask themselves why their technology needs are so complex in the first place. Why do they need to maintain these bloated applications and byzantine requirements? And how can everything be dramatically simplified?
And here’s the funny thing. It’s actually working. Jobs is starting to change the very DNA of IT departments from the grave. Case in point: In the last six months, Apple’s enterprise sales have exploded almost by accident. This explosion is even more remarkable when you consider that historically Apple has not chased corporate clients at all. In this vein, the company is the antithesis of Dell, IBM, and Microsoft.
We have known for a while now that Apple makes the best technology products. Many companies are finally taking notice, especially when the CIO gets it. Progressive CIOs know that Apple is “where it’s at”, even if their kids had to tell them as much. Many CIOs are making top-down decisions about what’s best for their companies—and increasingly that decision has Apple written all over it.
But the Applefication of the Enterprise cannot be attributed to a few hip CIOs getting Apple religion. Remember that large organizations move much slower than startups. To find out why Apple is succeeding in this new arena, we have to delve a little deeper. When we do, we see that there’s a major shift taking place in many organizations, something fundamental, organic, and employee-driven.
The Consumerization of IT
In the 1990s, the vast majority of people used the best available technology at work. Period. Relatively few people bought computers, programs, and devices whose power and functionality surpassed the ones they used in the office. But that was then; this is now.
Today, we have entered Bizarro World. The landscape could not be more different than even a decade ago. For millions of people, the quality, functionality, and power of the devices and technology they use at home today often exceeds their counterparts at work. What’s more, we are constantly using a wide array of devices and apps well after we leave the office. The distinction between “at home” and “at work” has blurred, if not become meaningless. One can just about work anywhere these days, spawning the terms virtual or distributed companies. And the very notion of a computer is changing before our eyes. Smartphones and tablets now compete with desktops and laptops as the primary means by which many people connect to the Internet and do much of their work.
Faced with technologically deficient workplace devices, technologies, and applications, many employees are refusing to compromise. Why should it take them five times as long to do something at work as it takes at home? Restrictions on the size of email attachments? I’ll just use DropBox or YouSendIt. IT is blocking YouTube via Websense? Then I’ll just watch it on my iPhone.
Employees are demanding better products and services than those provided by their own companies—and many CIOs are listening. Worker bees are complaining to their managers or just using their own devices and software to get things done. After all, it’s pretty hard to ban an employee from bringing her iPhone into work and using it.
What say you?
In the next installment of this series, I’ll be delving deeper into the consumer IT revolution as it pertains to Applefication of the Enterprise.
Photo Credit: » Zitona «
“Simplicity is the ultimate sophistication.”
–Leonardo da Vinci
Author’s note: In a series of posts over the next few months, I’ll be delving into a nascent trend: the Applefication of the Enterprise. Today’s introductory post lays a bit of the groundwork for the series.
In early February of 2012, Halliburton, one of the world’s largest oilfield service companies, became the latest enterprise to abandon RIM’s BlackBerry. Halliburton’s new smartphone of choice: Apple’s iPhone.
Even two years ago, this would have been earth-shattering news. Companies of this size just didn’t buy Apple products. These days, however, announcements like these have almost become commonplace. That is, Halliburton is hardly alone in adopting Apple’s iPhone throughout the company. In late 2011, Pfizer announced that it would purchase a rumored 37,000 iPads for its scientists and sales and manufacturing employees. In the same year, biotech giant Genentech announced that it had rolled out 30 company-specific apps in its own private app store.
Government Goes Apple Too
If you think that this trend is limited to private-sector companies rife with cash, think again. In mid-February of 2012, another government institution went Apple. As David Zax writes, “the National Oceanic and Atmospheric Administration (NOAA) is throwing their BlackBerrys overboard, opting for Apple products instead. Though NOAA already put some 3,000 BlackBerry devices in circulation among its 20,000 workers, it will only be supporting the devices until May 12. NOAA CIO Joe Klimavicz cited the cost of Research in Motion’s software as the chief reason for the switch.” The cash-strapped public sector is realizing that Apple products are not just cooler; they actually may be cost-efficient relative to existing applications and devices.
The reasons for these moves aren’t terribly difficult to understand. “With a relatively small investment, companies can re-create the whole information-on-the-fly scenario that was nearly impossible before”, says Pierfrancesco Manenti, an analyst at information technology outfit IDC. More large organizations will doubtless follow the mass exodus away from once omnipresent BlackBerries. PCs and laptops will give way to iPads. Apps will continue to supplant many complex software programs in new and exciting ways in enterprises across the globe.
The key point of these stories is not the demise of any one tool like the BlackBerry, a product whose rate of innovation clearly has trailed that of its competitors. Rather, the story illustrates a new mind-set and the transformative power of Apple and its products. They are no longer the sole purview of stylish consumers, small design firms, and niche start-ups. Increasingly, Apple products are accepted as real products in real companies that solve real problems.
Definition: The Applefication of the Enterprise
The Applefication of the Enterprise is not just about enterprises purchasing and deploying Apple products. Rather, it’s about what companies like Apple and Google represent: simplicity, ease of use, self-service, and rapid deployment. Not every organization will use Apple products—and Apple couldn’t produce enough iPads, iPhones, and MacBooks even if every last CIO signed up. Rather, Apple is forcing all organizations to reevaluate their existing technologies, applications, and infrastructure with the intent of making them simpler, more user-friendly, more Apple. In short, Apple is causing large organizations to think different.
What say you?
In the next installment of this series, I’ll be taking a look at some of the factors driving the Applefication of the Enterprise.
Back when MySpace, AOL, and Yahoo! ruled the world, people online were not always who they appeared to be. Yes, the Internet was still shaking out, but these erstwhile titans did not exactly take pains to authenticate their users. The dot-com era rewarded eyeballs, clicks, and page views–not authenticity.
A New Era
Fast forward ten years. Those three companies are shells of their former selves. Screen names like TennisFan_69 have given way to real names at companies that understand the importance of validating user identities. While forgeries are nearly impossible to prevent completely, current tech bellwethers like Twitter, Google (via Plus), LinkedIn, and Facebook make great efforts to ensure that people are who they claim to be. (By extension, sites that use tools like Facebook Connect benefit from these authentication steps.)
The point is that millions of people can effectively manage their own identities, their own data, much better than a centralized entity or a customer service department.
This is one end of the spectrum: the democratization of data. As Clive Thompson writes in Wired, we’ve seen this era of increased transparency play out before our very eyes over the last five years, although sites like eBay and Amazon have long enabled this type of data self-service. Thanks to Google, it’s harder than ever to pretend that you’re someone else. Ask Scott Adams–or at least one of his pseudonyms.
Let’s switch gears.
Contrast this “all hands on deck” approach to data management with what many small business owners face. Their data tends to be extremely accurate because so few hands are touching it. While far from perfect, at least errors tend to be made consistently. Rarely in my experience do 20 different people at a company enter hours, invoices, or purchase orders in 20 different ways. Mistakes can typically be rectified in a relatively short period once someone understands what was done.
In sum, whether a few hands or millions touch the data, the result tends to be the same: reasonably good data.
The problem for most organizations lies somewhere in between these two extremes. When 50 or 200 or 1,000 people touch the data, things often go awry (absent some type of data quality tool, culture of data governance, routine audits, and the like). Data is often incomplete, inaccurate, dated, and/or duplicated.
Employees in big companies rarely make errors in consistent ways–and the business rules of enterprise applications can only do so much. Yes, I can prevent someone from adding an employee with a duplicate Social Security number, but does the busy data entry clerk making minimum wage really care about data integrity?
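To illustrate both the promise and the limits of such business rules, here is a hedged sketch in Python. The records, names, and SSN format are purely hypothetical; the point is that an automated uniqueness check catches exact duplicates but is blind to inconsistent data entry:

```python
employees = []  # existing records, informally keyed by SSN

def add_employee(records, name, ssn):
    """A simple business rule: reject a new record whose SSN
    already exists verbatim in the system."""
    if any(r["ssn"] == ssn for r in records):
        raise ValueError(f"duplicate SSN rejected for {name}")
    records.append({"name": name, "ssn": ssn})

add_employee(employees, "Pat Doe", "123-45-6789")

try:
    # The same person re-entered with an identical SSN: caught.
    add_employee(employees, "Pat D.", "123-45-6789")
except ValueError as err:
    print(err)

# The same person re-entered with the dashes dropped: slips right
# through, because the rule only matches exact strings.
add_employee(employees, "Pat Doe", "123456789")
print(len(employees))  # two records now exist for one person
```

The rule works exactly as designed and still lets a duplicate in, which is precisely why inconsistent human error in the "middle" zone is so hard to automate away.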
Adding to the mess is the fact that too often organizations fail to appropriately train employees. On-the-job training is, at least in my experience, sadly the norm.
You may allow vendors, customers, and even employees to manage their own information–or at least some of it. Of course, you can restrict access to editable data to only employees who have been properly trained and understand the consequences of their actions–and inactions.
Perhaps most important, however, understand that “the middle” represents a danger zone, a potential netherworld in which your data faces serious risk of being compromised.
What say you?