I’ve always admired the collaborative community for information management professionals that MIKE2.0 is creating, so I figured my first post here should be about collaboration, especially since many information management initiatives cannot function properly without it. For example, as was discussed during Episode 02 of the Open MIKE Podcast, a collaborative approach is essential to successful information governance.
In this post, I want to focus on a few psychological concepts that can undermine collaborative efforts.
In his book You Are Not So Smart, David McRaney explained how “in 1974, psychologist Alan Ingham had people put on a blindfold and grab a rope. The rope was attached to a contraption that simulated the resistance of an opposing team. The subjects were told many other people were also holding the rope on their side, and he measured their effort. Then, he told them they would be pulling alone, and again he measured. They were alone both times, but when they thought they were in a group, they pulled 18 percent less strenuously on average.”
This is sometimes called the Ringelmann effect after French engineer Maximilien Ringelmann, who discovered in 1913 that if he had people get together in groups to pull on a strain gauge, their combined efforts would tally up to less than the sum of their individual strength measurements.
“Ingham and Ringelmann’s work,” McRaney concluded, “introduced social loafing to psychology: You put in less effort when in a group than you would if working alone on the same project.”
Collaboration is about leveraging a team to collectively tackle information management tasks, so that the team’s results exceed those of the best individual contributors. Although this is usually true, over time social loafing could start to diminish the effects of collaborative efforts.
Therefore, it’s important to make sure the team understands that they have a collective ownership and a shared responsibility for achieving their goals, but that individuals are accountable for specific roles.
In his book, Thinking, Fast and Slow, Daniel Kahneman explained that “many members of a collaborative team feel they have done more than their share and also feel that the others are not adequately grateful for their individual contributions.”
But if you asked each individual to assess their contribution as a percentage of the overall effort, the self-assessed contributions would add up to more than 100%. The reason for this is something known in psychology as availability bias, which makes people remember their own individual efforts and contributions much more clearly than those of others.
There will be times when some individuals feel like they are contributing more than their teammates, but sometimes this will only be a misperception brought on by availability bias. Other times, it will be true simply because no collaborative effort is ever perfectly balanced; sacrifices for the long-term greater good will sometimes require that some individuals put in more effort in the short term. As long as that’s the exception, not the rule, collaborative harmony can be maintained.
“You will occasionally do more than your share,” Kahneman concluded, “but it is useful to know that you are likely to have that feeling even when each member of the team feels the same way.”
Although collaboration isn’t always the best option (Phil Simon actually wrote an interesting three-part series making the case against collaboration: Part 1, Part 2, Part 3), with a better understanding of the psychology of collaboration, you can better manage your collaborative teams for ongoing success.
Few companies manage information better than Google. If there were an annual award for corporate information management (IM), I’m sure that Google would have won it–or at least placed in the top three–every year over the past decade.
Why take IM so seriously? Because, quite frankly, without it, Google becomes much, much less valuable. Sans information, how does Google really help us? It helps us find what we want; Google doesn’t directly give us what we want. In other words, we don’t spend much time on google.com. We use it to go to the sites that let us buy things.
It turns out that we ain’t seen nothing yet. Google has been fine-tuning Google Now (Google Alerts on steroids). From a recent TechCrunch article:
Google Now is a standard feature of Android Jelly Bean and up. It’s an easily accessible screen that shows you information about your daily commute (because it learns where you go every day and makes an educated guess as to where ‘home’ and ‘work’ are for you), appointments, local weather, upcoming flight and hotel reservations (assuming you give it access to scan your Gmail account) and how your favorite team did last night (it learns that from your search behavior). It also notices when you are not at home and shows you how long it’ll take you to get back to your house, or, if you are travelling, presents you with a list of nearby attractions you may be interested in, the value of the local currency, the time back home and easy access to Google Translate.
In a post-PC world, it’s not hard to understand the vast potential value of a personalized technological companion who can help you navigate an increasingly busy and complex world. With what Google knows about you via email, Web-surfing habits, social connections via Plus, and the like, Google Now may in fact be a game-changer.
Implications for Big Data
But you can only do so much on your Android device. What if you could see things? What if wearable technology and augmented reality could make your life even easier (or creepy, depending on your point of view)? Enter Google’s Project Glass, a pet project of Sergey Brin.
If you think that data is big now, get ready for Really Big Data. What if you could just think about recording a walk in Paris and publish it to YouTube in the process? What if you could review a product on Yelp by talking to yourself at the store? Perhaps speech-to-text technology would then publish that review automatically? Maybe your car will drive you to your next appointment on your Google calendar.
The implications are nearly limitless. As technology continues to evolve into heretofore “protected” areas, more and more data will be generated. Companies like Amazon, Apple, Facebook, Twitter, and Google have the compute power and storage capacity to actually do something with this data. Machine learning, text analytics, natural language processing, and other techniques allow them to extract real meaning from it.
The enabling technologies behind Big Data are getting better every day. We’re just getting started. Your organization ought to be preparing for a data-driven world right now. If not, it may very well fall and not get up.
A little over two years ago, I began what was once the unthinkable: I became a Mac guy again. After more than a decade of exclusive PC use, I became fed up with Microsoft’s products and terms of service. I made the jump.
However, buying a Mac and completely weaning myself from my PC are not one and the same. In fact, as expected, it has been a transition more than a clean break. That is, I didn’t follow Jerry Seinfeld’s Band-Aid advice.
A few legacy apps like Microsoft Access and my admittedly long-in-the-tooth accounting system forced me to straddle the fence for a few years. However, as Windows XP nears its decommission date, I am going into 2013 with the intent of being Microsoft-free.
To do this, I needed to purchase a new accounting program. By way of background, for the last ten years I had used a “mature” accounting system called MYOB. It wasn’t the sexiest application, but it got the job done.
As I exported the data from MYOB to Quickbooks (for the Mac), I noticed that my data management habits weren’t exactly perfect over the past decade. (Nothing major, but a few things annoyed me in my quest for data perfection.) In a few cases, I had duplicate vendor records. Some of my customer master information was incomplete.
What to do? I spent some time in Excel doing some “winter data cleaning.” I considered the following questions:
What better time to cleanse this data than now? (I’ve said many times that new system implementations represent opportune times to clean things up.)
Why not purge records for vendors and customers with which I have had no contact in the last five years? (For instance, I no longer pay the same electric and cable companies that I did while living in New York and New Jersey.)
Why not start life with Quickbooks as cleanly as possible?
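Checks like these don’t have to stay manual spreadsheet work. As a minimal sketch–with field names that are hypothetical, not taken from my actual MYOB export–here is how one might flag duplicate vendor records and incomplete customer master records in Python:

```python
from collections import Counter

# Hypothetical exported rows (in practice, read from a CSV export of the old system)
vendors = [
    {"name": "Acme Supplies", "city": "Las Vegas"},
    {"name": "ACME Supplies", "city": "Las Vegas"},  # duplicate, different casing
    {"name": "Desert Power Co.", "city": "Henderson"},
]

# Normalize names before counting so near-duplicates collapse together
counts = Counter(v["name"].strip().lower() for v in vendors)
duplicates = [name for name, n in counts.items() if n > 1]
print(duplicates)  # vendor names appearing more than once

# Flag customer records missing required master-data fields
customers = [
    {"name": "Jane Doe", "email": "jane@example.com"},
    {"name": "John Roe", "email": ""},  # incomplete record
]
incomplete = [c["name"] for c in customers if not c["email"]]
print(incomplete)  # customers needing attention before migration
```

The point isn’t the tooling; it’s that a few lines of scripted validation catch exactly the duplicates and gaps that otherwise surface mid-migration.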
I had no one else to blame. “Simon, Inc.” is a very small shop and I do all of my own bookkeeping. Still, the way that I do my books has slightly changed over the last decade.
Simon Says: Be Your Own Chief Data Officer
I’ve written before on this site about the role of the chief data officer (CDO). It was high time that I took my own advice. While this small business example might lack the nuance of a large organization, I’d argue that the same principle applies. It’s my data and I alone take responsibility for it. Why not make it as clean as possible before migrating to a new system?
In fact, I’m going to make this an annual occurrence. Cleanse what I need, purge what I don’t, and review it all.
Many of us walk around with the knowledge that we generally understand how the world works. When people ask us for our professional opinions, we’re often more than happy to oblige. Some of us are even flattered when someone wants to know what we should do when faced with uncertainty.
But do we know as much as we think we do? I have my doubts, especially after finishing Everything Is Obvious: How Common Sense Fails Us by Duncan J. Watts. It’s a thought-provoking book because it challenges readers to ask themselves, “What do we really know?” The answer, unfortunately, is not as much as we think.
Simple vs. Complex Systems
As Watts points out, part of the problem stems from the distinction between simple and complex systems. Let’s take a simple situation: the game of blackjack. I’ve been known to play a few hands, and only a fool would hit on 18 with the dealer showing a six. It’s just bad strategy. In such a simple environment, the best decision is clear.
The problem with simple systems is that they aren’t terribly representative. In complex systems like, say, the economy, myriad forces are at play, the vast majority of which are not under the control of any one person, group, department, or organization. Even the US government could not completely solve the financial crisis with a nearly $800 billion stimulus. Bottom line: there’s only so much that any of us can do in a complex system.
And, as I turned the final pages on what is easily the best book I’ve read this year, I started to think about that big question in the context of Big Data. If we truly embrace Big Data, then we will have to question many long-held assumptions about how many things work: our jobs, our departments, our industries, and our environments. Looking at data with open eyes means that we may not like what we see, nor what that data will tell us. And that makes many of us uncomfortable. How many of us want to constantly question what we think we know?
To me, Big Data is not all about technology. Far from it. I’d argue that there’s a human element in all new technologies, and Big Data is no exception to this rule. There’s an organizational readiness component to it, as well as a personal one. Will people and organizations unaccustomed to consulting the data suddenly change their behavior? Will they be open-minded? Or will they act as if they know how things have worked, work now, and will work in the future?
It’s a big question. Big Data may prove that you don’t know as much as you think you do. What will you do then?
I recently had some palm trees put into the backyard of my Nevada home. It was a downright cool experience that required industrial cranes to lift the two-ton trees above my home.
There was no damage done to my home or the driveway and nobody was injured. Everything went well. Well, almost everything. It turns out that the truck carrying these trees was delayed.
Did the drivers have a hard time loading these monstrosities? No. Was the enormous truck able to snake its way into my community? Yes. So, what was the problem?
Good old human error. When I bought the trees, my friend Jeff accompanied me. Jeff knows a thing or two about landscaping and I’m anything but a palm tree expert. I paid for the trees and gave the woman at the counter my proper address. I assumed that all was good to go.
Fast forward to tree delivery day. After a few hurried calls and general wonderment about where these things were, we identified the culprit. The saleswoman wrote down Jeff’s address on the deliver-to line, not mine. She put my address in the ‘notes’ section. For their part, the delivery guys didn’t read the notes and wound up driving 60 miles out of the way.
I’ve seen many parallels between palm trees and enterprise data in my career. I’ve had users question the accuracy of my reports. I might hear things like “There’s no way that we had that many promotions last month! Your report is wrong!”
While I’m not perfect, I would often tell the skeptical user that we should check the data in the source system. More often than not, my report was accurate but the data pulled into that report was not. Thanks to audit tables and metadata, I could typically pinpoint the time, date, and creator of the errant record.
I would then work backwards. That is, after we knew that a user made this mistake, I would ask the natural next questions:
What other mistakes did this user make?
What else do we have to clean up?
Is there a larger departmental or organizational training issue?
Couldn’t we write a business rule or audit report to prevent the recurrence of this problem?
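That last question can be answered in the affirmative with very little code. A hedged sketch of such a rule–the record layout, threshold, and audit fields are assumptions for illustration, not my actual systems:

```python
from datetime import date

# Hypothetical records carrying audit metadata (created_by, created_on)
promotions = [
    {"employee": "A. Smith", "created_by": "jdoe",  "created_on": date(2012, 11, 2)},
    {"employee": "B. Jones", "created_by": "jdoe",  "created_on": date(2012, 11, 2)},
    {"employee": "C. Lee",   "created_by": "mkhan", "created_on": date(2012, 11, 5)},
]

MAX_EXPECTED = 2  # sanity threshold for one month; tune per organization

def audit_report(records, threshold):
    """Flag the month's load if it exceeds the expected count, and tally
    records by creator so an errant entry can be traced back to a user."""
    by_creator = {}
    for r in records:
        by_creator[r["created_by"]] = by_creator.get(r["created_by"], 0) + 1
    flagged = len(records) > threshold
    return flagged, by_creator

flagged, by_creator = audit_report(promotions, MAX_EXPECTED)
print(flagged)     # True: more promotions this month than the rule expects
print(by_creator)  # per-user counts, the starting point for working backwards
```

A report like this runs before anyone questions the numbers, and the per-creator tally is precisely what lets you work backwards to the source of the error.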
Everyone makes mistakes, and I’m certainly no exception. The larger point here is that data matters, especially the accurate kind. One of my favorite expressions is PICNIC–aka, problem in chair, not in computer. We can do simply amazing things with Big Data, but I’ll always insist that Small Data and data quality are just as important.
With the rise of Apple’s iCloud and file-sharing solutions such as Dropbox and Sharepoint, it’s no surprise that SaaS and cloud computing solutions are increasing in popularity. From reduced costs in hosting and infrastructure to increased mobility, backups and accessibility, the benefits are profound. Need proof? A recent study by IDC estimates that spending on public IT cloud services will expand at a compound annual growth rate of 27.6%, from $21.5 billion in 2010 to $72.9 billion in 2015.
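As a quick sanity check on IDC’s figures, the compound annual growth rate follows directly from the start and end values:

```python
# CAGR = (end / start) ** (1 / years) - 1, using IDC's published figures
start, end, years = 21.5, 72.9, 5  # $ billions, 2010 -> 2015
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # close to IDC's quoted 27.6%
```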
While the popularity and acceptance of cloud computing has certainly taken off, new questions are being asked regarding the security of third-party data hosting. Has the centralization of IT into a few cloud computing platforms made it easier for the “bad guys” to focus their efforts? With valuable information from all departments ranging from marketing to accounting being openly shared with another organization, are your customer preferences, past orders, mailing lists, HR and financials at an increased risk for cyber threat?
Despite this concern, an alarming study by the Ponemon Institute, which surveyed over 900 IT executives across the world, found that about half of worldwide IT organizations said that no one in their organization evaluates cloud computing providers for security. Worse yet, another half said they were pretty sure that no one in their organizations knew about every cloud computing service that end users in their company were storing data on.
Can you really rely on the cloud to keep your enterprise information safe? What information are you sharing and what steps has your organization taken to evaluate providers for security?
In part four of this series, I discussed lust from an IM standpoint. Now it’s time for pride, defined as
an inwardly directed emotion that carries two common meanings. With a negative connotation, pride refers to an inflated sense of one’s personal status or accomplishments, often used synonymously with hubris. With a positive connotation, pride refers to a satisfied sense of attachment toward one’s own or another’s choices and actions, or toward a whole group of people, and is a product of praise, independent self-reflection, or a fulfilled feeling of belonging.
Let’s apply the concept of pride to this blog.
The Downside of Pride
All too often in the corporate and IM worlds, pride rears its ugly head–often with disastrous results. For instance, at one organization at which I worked, a senior IT director and 20-year veteran (Al) was quite proud of the systems he and his team had built. “His” systems handled the organization’s payment of employee bonuses and the distribution of stock options.
To be sure, this wasn’t “rocket surgery”, but doing this across more than 60 countries wasn’t as easy as it might sound. Al was so proud of his systems that he fought efforts to deploy an enterprise-wide solution for years, erroneously telling his colleagues that COTS applications couldn’t do what his programs could.
For years, Al was able to postpone the organization’s implementation of PeopleSoft. When he couldn’t delay it any longer, Al fought successfully to have the ERP heavily customized–bastardized, really–so that it would “integrate” (and I use that term very loosely) with his systems. The end result: an absolute mess of data coupled with a spaghetti architecture.
A Different Example
Of course, pride need not hamper organizations. Contrast Al with a manager I met while consulting at a VOIP company. Steve was in fact proud of the customer applications that he had built. Without question, they had served a purpose as his company more than tripled in size. But as business continued to grow, it was becoming more and more apparent that, on many levels, his creations were reaching their limits. I was nervous speaking with Steve: I knew that we needed to have a difficult conversation.
To his credit, Steve didn’t let pride get in his way. Steve knew that those applications could only do so much. In his own words, he knew that he’d at one point soon need to throw the baby out with the bath water. Whether he built new systems or implemented existing ones in the future, Steve knew that his company would be better off than it was now.
So, what separated Steve and Al? After all, both were proud of what they had accomplished. First, Steve was much younger than Al. Oftentimes people near the end of their careers are concerned about their legacies; they want to leave their mark on their organizations. Beyond age, though, there’s something in the DNA of people. Some folks can be simultaneously proud of what they’ve done–and still realize the limitations of their accomplishments.
In today’s data-driven society, it’s not uncommon for businesses to rely on multiple information systems to complete their day-to-day operations. As the Marketing Director for a small consulting firm, I can think of at least five that our department uses regularly to perform basic tasks:
Customer Relationship Management System (CRM)
Content Management System (CMS)
Email Marketing Platform
Each of these systems (or subsystems, if you will) plays a role in our overall Marketing Information System, yet each has its own user interface with unique data fields and import/export options. Because I’ve been with the company for years, I am versed in the “what goes where” of adding and manipulating data. A new hire, however, would not be. And as our company continues to expand, we’re faced with how best to communicate how this “system of systems” works and what data needs to go where so that reports run correctly.
In your experience, what should be done to make it more clear to end-users “what information goes where” across overlapping information systems? How can we avoid end user entry errors and ease the learning curve for new hires and temporary staff?
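One lightweight option is a shared data dictionary that records, for each field, its system of record. A minimal sketch in Python–the system names and field names here are invented for illustration, not our actual setup:

```python
# A data dictionary mapping each field to its system of record,
# plus any systems that hold a synchronized copy.
DATA_DICTIONARY = {
    "contact_email":  {"system_of_record": "CRM", "also_in": ["Email Marketing Platform"]},
    "page_copy":      {"system_of_record": "CMS", "also_in": []},
    "campaign_opens": {"system_of_record": "Email Marketing Platform", "also_in": []},
}

def where_to_edit(field):
    """Tell a new hire which system a field should be changed in."""
    entry = DATA_DICTIONARY.get(field)
    return entry["system_of_record"] if entry else "unknown -- ask before editing"

print(where_to_edit("contact_email"))  # CRM
print(where_to_edit("fax_number"))     # unknown -- ask before editing
```

Even maintained as a plain spreadsheet rather than code, a dictionary like this gives new hires and temporary staff one authoritative answer to “what goes where,” instead of tribal knowledge.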
In my last post, I talked about greed as it relates to IM projects. Long story short, for different reasons, people actively refuse to share information, train employees, or generally cooperate with others. Now it’s time for sloth, defined as
…spiritual or emotional apathy, neglecting what God has spoken, and being physically and emotionally inactive. Sloth can also indicate a wasting due to lack of use, concerning a person, place, thing, skill, or intangible ideal that would require maintenance, refinement, or support to continue to exist.
To be sure, on information management (IM) projects, the ultimate effects of sloth often resemble those of greed–i.e., work just doesn’t get done in a timely manner, if at all. Alternatively, work is just sloppy. However, the motivations behind sloth and greed are typically quite different.
Greed carries with it a certain defiance, even anger. For instance, consider Barry, an employee who isn’t happy that his job is changing. No one asked him what he thought. Maybe he has to learn a new skill or application. Either passively or actively, Barry expresses this anger in the workplace. Take away the change in Barry’s job and he would not have been problematic.
By way of contrast, sloth lacks the same type of precursor. When sloth manifests itself, an employee doesn’t necessarily feel aggrieved. Nothing is changing with Lillian’s job and she’s actually pretty happy. Maybe her boss asked her to look into Big Data. However, for whatever reason, she just doesn’t feel like it. She’d rather play Angry Birds while no one is looking.
Now, sloth should not be mistaken for an employee with conflicting and diverging priorities. For instance, on my ERP projects in my career, I would need to meet with the Director of Finance or the Payroll Manager for different reasons. The organization was deploying a new CRM or ERP system and my job involved activating that new system. (Of course, I couldn’t do it alone.) Unfortunately, I would often have trouble scheduling time with individual clients because they often had to deal with emergencies. By definition, these issues trumped any “future plans” that I had to discuss with them. Consequently, my meetings were sometimes canceled.
This isn’t sloth; this is just reality. A problem with testing or training in a new system always loses to an immediate organizational crisis. Consultants need to get used to this. It’s an occupational hazard.
Sloth is often a function of knowledge, curiosity, and personality. Consider the following problem: similar but not identical customer or employee data from two different spreadsheets has to be married–say, 2,000 records.
Sure, there are people who believe that this has to be a manual exercise. Because of this, they just don’t feel like doing this type of monotonous work. But plenty of people are naturally curious; they know that there just has to be a better way. Adventurous and inquisitive folks are rarely lazy. They either already know about Excel’s VLOOKUP function, or they will search the web or ask others whether there’s a better way to marry data. JOIN statements come to mind.
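To make that concrete, here is a hedged sketch of marrying two lists by a shared key–the field names and records are hypothetical. It mirrors what VLOOKUP does cell by cell, or what a SQL LEFT JOIN does in one statement:

```python
# Two "spreadsheets" keyed on employee ID; field names are invented for illustration
sheet_a = [
    {"id": 101, "name": "Lillian Gray"},
    {"id": 102, "name": "Barry Stone"},
]
sheet_b = [
    {"id": 101, "department": "Finance"},
    {"id": 103, "department": "Payroll"},  # no match in sheet_a
]

# Build a lookup table once (the equivalent of VLOOKUP's table_array), then join
by_id = {row["id"]: row for row in sheet_b}
merged = [
    {**row, **by_id.get(row["id"], {})}  # left join: keep A, add B's fields where found
    for row in sheet_a
]
print(merged)
```

The same pattern scales from 2,000 records to millions; the monotonous part disappears once the lookup is built.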
Understanding sloth is the first step in preventing or minimizing it. Ignore it at your own peril.
In my last post, I discussed the impact of wrath on IM projects. Today’s topic is the second of the deadly sins: greed.
Note that greed and sloth (to be discussed in a future post) are very different sins.
Now, let’s start off by getting our terms straight. By greed, I’m talking about the need for certain employees, groups, and departments to hoard data that ought to be shared throughout the organization. These folks are keeping for themselves what others want and/or need. For instance, consider Steve, a mid-level employee at XYZ who keeps key sales or customer data in a spreadsheet or a standalone database.
Or consider ABC, a company that implemented a new system that, for different reasons, was never populated with legacy data. Barbara in Payroll holds key payroll information and will not willingly provide it to Mark in Accounting.
To be sure, organizational greed is hardly confined to data. I’ve seen many employees over the years refuse to train other employees or lift a finger to help a new hire or perceived enemy. Maybe they refuse to meet with consultants hired by senior management to re-engineer a process.
Understanding the Nature of Greed
I could go on with examples but you get my drift. Almost always, greed emanates from some fundamental insecurity within the offending employee. What’s more, at the risk of getting myself in a whole heap of trouble, I’ve found that more senior employees are more likely to be greedy. Now, this is a broad generalization and certainly does not apply across the board. I’ve seen exceptions to this general rule: young employees who wouldn’t share information and near-retirement-aged folks more than happy to show others what they know.
As employees become less secure about their jobs and themselves, they naturally start to think about the future–their futures. It’s just human nature. Many people understandably don’t want to be looking for jobs today. (This feeling increases as we age, what with many familial and personal responsibilities.) We realize that the grass is not always greener. For some of us, this manifests itself in a tendency to attempt to protect our jobs, departments, budgets, fiefdoms, and headcounts–at least until the perceived threat diminishes.
But there’s a critical and countervailing force at play for the greedy: Information wants to be free. As open-source software, open APIs, and open data sources continue to sprout, people are becoming less and less tolerant of employee bottlenecks. Those who refuse to play ball may be able to temporarily stall large-scale information management projects, but eventually, by hook or by crook, those dams always break.
I wish that I had a simple solution for resolving employee-related greed issues. I don’t. Many tomes have been written about managing difficult employees. At a high level, organizations can use two well-worn tools: the carrot and the stick. Consider rewarding employees who share information and knowledge while concurrently punishing those who don’t.