Posts Tagged ‘Collaboration’
Collaboration is often cited as a key success factor in many enterprise information management initiatives, such as metadata management, data quality improvement, master data management, and information governance. Yet it’s often difficult to engage individual contributors in these efforts because everyone is busy and time is a zero-sum game. However, a successful collaboration needn’t require a major time commitment from all contributors.
While a small core group of people must be assigned as full-time contributors to enterprise information management initiatives, success hinges on a large extended group of people making what Clive Thompson calls micro-contributions. In his book Smarter Than You Think: How Technology is Changing Our Minds for the Better, he explained that “though each micro-contribution is a small grain of sand, when you get thousands or millions you quickly build a beach. Micro-contributions also diversify the knowledge pool. If anyone who’s interested can briefly help out, almost everyone does, and soon the project is tapping into broad expertise.”
Wikipedia is a great example since anyone can click on the edit tab of an article and become a contributor. “The most common edit on Wikipedia,” Thompson explained, “is someone changing a word or phrase: a teensy contribution, truly a grain of sand. Yet Wikipedia also relies on a small core of heavily involved contributors. Indeed, if you look at the number of really active contributors, the ones who make more than a hundred edits a month, there are not quite thirty-five hundred. If you drill down to the really committed folks—the administrators who deal with vandalism, among other things—there are only six or seven hundred active ones. Wikipedia contributions form a classic long-tail distribution, with a small passionate bunch at one end, followed by a line of grain-of-sand contributors that fades off over the horizon. These hardcore and lightweight contributors form a symbiotic whole. Without the micro-contributors, Wikipedia wouldn’t have grown as quickly, and it would have a much more narrow knowledge base.”
MIKE2.0 is another great example since it’s a collaborative community of information management professionals contributing their knowledge and experience. While MIKE2.0 has a small group of core contributors, micro-contributions improve the breadth and depth of its open source delivery framework for enterprise information management.
The business, data, and technical knowledge about the end-to-end process of how information is developed and used within your organization is not known by any one individual. It is spread throughout your enterprise. A collaborative effort is needed to make sure that important details are not missed—details that determine the success or failure of your enterprise information management initiative. Therefore, be sure to tap into the distributed knowledge of your enterprise by enabling and encouraging micro-contributions. Micro-contributions form a collaboration macro. Just as a computer macro comprises a set of instructions that collectively perform a particular task, think of collaboration as a macro comprising a set of micro-contributions that collectively manage your enterprise information.
On the recent Stuff to Blow Your Mind podcast episode Outsourcing Memory, hosts Julie Douglas and Robert Lamb discussed how, from remembering phone numbers to relying on spellcheckers, we’re offloading our cognitive processes to the cloud.
“Have you ever tried to recall an actual phone number stored in your cellphone, say of a close friend or relative, and been unable to do so?” Douglas asked. She remarked how that question would have been ridiculous ten years ago, but nowadays most of us would have to admit that the answer is yes. Remembering phone numbers is just one example of how we are outsourcing our memory. Another is spelling. “Sometimes I find myself intentionally misspelling a word to make sure the application I am using is running a spellchecker,” Lamb remarked. Once confirmed, he writes without worrying about misspellings since the spellchecker will catch them. I have to admit that I do the same thing. In fact, while writing this paragraph I misspelled several words without worry since they were automatically caught by those all-too-familiar red-dotted underlines. (Don’t forget, however, that spellcheckers don’t check for contextual accuracy.)
Transactive Memory and Collaborative Remembering
Douglas referenced the psychological concept of transactive memory, where groups collectively store and retrieve knowledge. This provides members with more and better knowledge than any individual could build on their own. Lamb referenced cognitive experimental research on collaborative remembering. This allows a group to recall information that its individual members had forgotten.
The memory management model of what we now call the cloud is transactive memory and collaborative remembering on a massive scale. It has pervaded most aspects of our personal and professional lives. Douglas and Lamb contemplated both its positive and negative aspects. Many of the latter resonated with points I made in my previous post about Automation and the Danger of Lost Knowledge.
Free Your Mind
In a sense, outsourcing our memory to the cloud frees up our minds. It is reminiscent of Albert Einstein remarking that he didn’t need to remember basic mathematical equations since he could just look them up in a book when he needed them. Nowadays he would just look them up on Google or Wikipedia (or MIKE2.0 if, for example, he needed a formula for calculating the economic value of information). Not bothering to remember basic mathematical equations freed up Einstein’s mind for his thought experiments, allowing him to contemplate groundbreaking ideas like the theory of relativity.
Forgetting How to Remember
I can’t help but wonder what our memory will be like ten years from now after we have outsourced even more of it to the cloud. Today, we don’t have to remember phone numbers or how to spell. Ten years from now, we might not have to remember names or how to count.
Wearable technology, like Google Glass or Narrative Clip, will allow us to have an artificial photographic memory. Lifelogging will allow us to record our own digital autobiography. “We have all forgot more than we remember,” Thomas Fuller wrote in the 18th century. If before the end of the 21st century we don’t have to remember anything, perhaps we will start forgetting how to remember.
I guess we will just have to hope that a few trustworthy people remember how to keep the cloud working.
While security and privacy issues prevent sensitive data from being shared (e.g., customer data containing personal financial information or patient data containing personal health information), do you have access to data that would be more valuable if you shared it with the rest of your organization—or perhaps the rest of the world?
We are all familiar with the opposite of data sharing within an organization—data silos. Somewhat ironically, many data silos start with data that was designed to be shared with the entire organization (e.g., from an enterprise data warehouse), but was then replicated and customized in order to satisfy the particular needs of a tactical project or strategic initiative. This customized data often becomes obsolete after the conclusion (or abandonment) of its project or initiative.
Data silos are usually denounced as evil, but the real question is whether the data hoarded within a silo is sharable. Is it something usable by the rest of the organization, which may be redundantly storing and maintaining its own private copies of the same data, or is it something only one business unit uses (or is allowed to access, in the case of sensitive data)?
Most people decry data silos as the bane of successful enterprise data management—until you expand the scope of data beyond the walls of the organization, where the enterprise’s single version of the truth becomes a cherished data asset (i.e., an organizational super silo) intentionally siloed from the versions of the truth maintained within other organizations, especially competitors.
We need to stop needlessly replicating and customizing data—and start reusing and sharing data.
Historically, replication and customization had two primary causes:
- Limitations in technology (storage, access speed, processing speed, and a truly sharable infrastructure like the Internet) meant that the only option was to create and maintain an internal copy of all data.
- Proprietary formats and customized (and also proprietary) versions of common data were viewed as a competitive differentiator—even before the recent dawn of the realization that data is a corporate asset.
Hoarding data in a proprietary format under the view that “our private knowledge is our power” must be replaced with sharing data in an open format under the view that “our shared knowledge empowers us all.”
This is an easier mantra to recite than it is to realize, not only within an organization or industry, but even more so across organizations and industries. However, one of the major paradigm shifts of 21st century data management is making more data publicly available, following open standards (such as MIKE2.0) and using unambiguous definitions so data can be easily understood and reused.
Of course, data privacy still requires that sensitive data not be shared without consent, and competitive differentiation still requires that intellectual property not be shared outside the organization. But this still leaves a vast amount of data which, if shared, could benefit our increasingly hyper-connected world, where most of the boundaries that used to separate us are becoming more virtual every day. Some examples of this appear in the recent blog post shared by Henrik Liliendahl Sørensen about Winning by Sharing Data.
In Why Does E=mc2? (And Why Should We Care?), Brian Cox and Jeff Forshaw explained “the energy released in chemical reactions has been the primary source of power for our civilization since prehistoric times. The amount of energy that can be liberated for a given amount of coal, oil, or hydrogen is at the most fundamental level determined by the strength of the electromagnetic force, since it’s this force that determines the strength of the bonds between atoms and molecules that are broken and reformed in chemical reactions. However, there’s another force of nature that offers the potential to deliver vastly more energy for a given amount of fuel, simply because it’s much stronger.”
That other force of nature is nuclear fusion, which refers to any process that releases energy as a result of fusing together two or more nuclei. “Deep inside the atom lies the nucleus—a bunch of protons and neutrons stuck together by the glue of the strong nuclear force. Being glued together, it takes effort to pull a nucleus apart and its mass is therefore smaller, not bigger, than the sum of the mass of its individual proton and neutron parts. In contrast to the energy released in a chemical reaction, which is a result of electromagnetic force, the strong nuclear force generates a huge binding energy. The energy released in a nuclear reaction is typically a million times the energy released in a chemical reaction.”
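Cox and Forshaw’s point that a loss of mass manifests itself as energy can be made concrete with a quick back-of-the-envelope calculation using E = mc². This is only an illustrative sketch; the 0.7% mass-defect figure for hydrogen fusion is an approximate textbook value, not a number taken from their book:

```python
# Back-of-the-envelope: energy released when a small mass defect
# is converted to energy via E = m * c^2.
C = 299_792_458  # speed of light in m/s (exact, by definition)

def energy_from_mass(mass_kg: float) -> float:
    """Energy (in joules) equivalent to a given mass, per E = mc^2."""
    return mass_kg * C ** 2

# Fusing ~1 kg of hydrogen into helium converts roughly 0.7% of the
# mass into energy (an approximate, illustrative figure).
mass_defect = 1.0 * 0.007
print(f"{energy_from_mass(mass_defect):.2e} J")  # on the order of 6e14 joules
```

For comparison, burning a kilogram of coal (a chemical reaction) releases on the order of 10⁷ joules, which is consistent with the “million times” ratio quoted above.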
We often ignore the psychology of collaboration when we say that a collaborative team, working on initiatives such as information governance, is bigger than the sum of its individual contributors.
“The reason that fusion doesn’t happen all the time in our everyday experience,” Cox and Forshaw explained, “is that, because the strong force operates only over short distances, it only kicks in when the constituents are very close together. But it is not easy to push protons together to that distance because of their electromagnetic repulsion.”
Quite often the reason successful collaboration doesn’t happen is that the algebra of collaboration also requires that the collaborators subtract something from the equation—their egos, which generate a strong ego-magnetic repulsion that makes it far from easy to bind the collaborative team together.
Cox and Forshaw explained it’s because of the equivalence of mass and energy that a loss of mass manifests itself as energy. If we jettison the mass of our egos when forming the bonds of collaboration, then we is smaller than the sum of its me parts, and that loss of me-mass will manifest itself as the we-energy we need to bind our collaborative teams together.
In my previous post, I discussed the psychology of collaboration, focusing on a few psychological concepts (social loafing and availability bias) that undermine, or at least greatly diminish the effects of, collaborative efforts, and explaining why a better understanding of these psychological concepts can help you better manage your collaborative teams for ongoing success.
In this post, let’s examine the tension that exists between individual contributors and collaboration.
Individual Contributors are Not Heroes
Individual contributors often don’t like to collaborate because they want to be in control; they want to be the hero. You need to make it clear to them that deliberately choosing not to collaborate is not heroic. That said, there will always be some tasks best handled by individual contributors, so put them to good use.
Therefore, what individual contributors are assigned to work on, and the recognition and rewards they will receive, must be well communicated, both to them and to the team that is directly collaborating. The worst problem for collaboration is when an individual contributor either pretends to be a team player or deludes themselves into believing they are one.
Individual contributors are focused on individual success, which is about competition, not collaboration. Of course, competition isn’t always a bad thing; it’s only bad when individual success comes at a great price to others. Collaboration means that sometimes you have to do something you don’t want to do, or that you don’t agree with. Sometimes, for the team to win, you have to let someone else be the hero.
As Arthur Ashe explained, “True heroism is remarkably sober, very undramatic. It is not the urge to surpass all others at whatever cost, but the urge to serve others at whatever cost.”
Going from “I’m great” to “We’re great”
Ultimately, for an organization to embrace a collaborative culture, there needs to be a paradigm shift that I have previously blogged about as turning the M upside down — turning Me into We.
In their great book Tribal Leadership: Leveraging Natural Groups to Build a Thriving Organization, Dave Logan, John King, and Halee Fischer-Wright describe this paradigm shift as moving from a Stage Three (“I’m great”) to a Stage Four (“We’re great”) organization.
“People at Stage Three have to win, and for them winning is personal. They’ll outwork and outthink their competitors on an individual basis. The mood that results is a collection of lone warriors, wanting help and support and being continually disappointed that others don’t have their ambition or skill. Because they have to do the tough work (remembering that others just aren’t as savvy), their complaint is that they don’t have enough time or competent support.”
By contrast, people at Stage Four understand ubuntu, a word from the Bantu languages of southern Africa that can be translated into English as “I am what I am because of who we all are.”
According to Desmond Tutu, “ubuntu is the essence of being human. Ubuntu speaks particularly about the fact that you can’t exist as a human being in isolation. It speaks about our interconnectedness. We think of ourselves far too frequently as just individuals, separated from one another, whereas you are connected, and what you do affects the whole world.”
Collaboration Causes All Success
We think of ourselves far too frequently as individual contributors, separated from one another, whereas we are connected, and what we do affects the whole organization. This is why collaboration, whether it’s direct or indirect, is what causes all of the success enjoyed by the organization.
A lot of collaboration is indirect, meaning you’re often supported by the efforts of people you don’t interact with on a daily basis, including people who (or whose work) you didn’t even know existed.
Nonetheless, since the organization is deeply interconnected, even what appears to be a success caused by an individual contribution was, in fact, a success only possible because of collaboration.
I’ve always admired the collaborative community for information management professionals that MIKE2.0 is creating, so I figured my first post here should be about collaboration, especially since many information management initiatives cannot function properly without it. For example, as was discussed during Episode 02 of the Open MIKE Podcast, a collaborative approach is essential to successful information governance.
In this post, I want to focus on a few psychological concepts that can undermine collaborative efforts.
In his book You Are Not So Smart, David McRaney explained how “in 1974, psychologist Alan Ingham had people put on a blindfold and grab a rope. The rope was attached to a contraption that simulated the resistance of an opposing team. The subjects were told many other people were also holding the rope on their side, and he measured their effort. Then, he told them they would be pulling alone, and again he measured. They were alone both times, but when they thought they were in a group, they pulled 18 percent less strenuously on average.”
This is sometimes called the Ringelmann effect after French engineer Maximilien Ringelmann, who discovered in 1913 that if he had people get together in groups to pull on a strain gauge, their combined efforts would tally up to less than the sum of their individual strength measurements.
“Ingham and Ringelmann’s work,” McRaney concluded, “introduced social loafing to psychology: You put in less effort when in a group than you would if working alone on the same project.”
Collaboration is about leveraging a team to collectively tackle information management tasks, so that the team’s results exceed those of the best individual contributors. Although this is usually true, over time social loafing could start to diminish the effects of collaborative efforts.
Therefore, it’s important to make sure the team understands that they have a collective ownership and a shared responsibility for achieving their goals, but that individuals are accountable for specific roles.
In his book Thinking, Fast and Slow, Daniel Kahneman explained that “many members of a collaborative team feel they have done more than their share and also feel that the others are not adequately grateful for their individual contributions.”
But if you asked each individual to assess their contribution as a percentage of the overall effort, the self-assessed contributions would add up to more than 100%. The reason for this is something known in psychology as availability bias, which makes people remember their own individual efforts and contributions much more clearly than those of others.
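The arithmetic behind Kahneman’s observation is easy to see in miniature. Suppose each member of a four-person team honestly estimates their own share of the total effort; the names and percentages below are hypothetical:

```python
# Hypothetical self-assessed contributions from a four-person team.
# Availability bias inflates each estimate: people recall their own
# work far more vividly than their teammates' work.
self_assessed = {"Alice": 40, "Bob": 35, "Carol": 30, "Dave": 25}

total = sum(self_assessed.values())
print(total)  # 130 -- the estimates sum to well over 100%
```

No one here is lying; each estimate is simply anchored on the contributor’s own vivid memories, which is exactly why the totals overshoot.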
There will be times when some individuals feel like they are contributing more than their teammates. Sometimes this will only be a misperception brought on by availability bias. Other times it will be true, simply because no collaborative effort is ever perfectly balanced, meaning that sacrifices for the long-term greater good will occasionally require that some individuals put in more effort in the short term. As long as that’s the exception, not the rule, collaborative harmony can be maintained.
“You will occasionally do more than your share,” Kahneman concluded, “but it is useful to know that you are likely to have that feeling even when each member of the team feels the same way.”
Although collaboration isn’t always the best option (Phil Simon actually wrote an interesting three-part series making the case against collaboration: Part 1, Part 2, Part 3), with a better understanding of the psychology of collaboration, you can better manage your collaborative teams for ongoing success.
If you’re a Chief Information Officer (CIO), there are three things that your organization expects of you: 1) keep everything running; 2) add new capabilities; and 3) do it all as cheaply as possible. The metrics that CIOs typically use to measure these things include the number of outages, the number of projects delivered, and budget variances. The problem with this approach is that it fails to take account of complexity.
When I talk with senior executives, regardless of their role, the conversation inevitably turns to the frustration that they feel about the cost and complexity of doing even seemingly simple things such as preparing a marketing campaign, adding a self-service capability, or combining two services into one. No matter which way you look at it, it costs more to add or change even simple things in organizations due to the increasing complexity that a generation of projects has left behind as its legacy. It should come as no surprise that innovation seems to come from greenfield startups, many of which have been funded by established companies whose own legacy stymies experimentation and agility.
This doesn’t have to be the case. If a CIO accepts the assertion that complexity caused by the legacy of previous projects is the enemy of agility, then they should ask whether they are explicitly measuring the complexity that current and future projects are adding to the enterprise. CIOs need to challenge themselves to engineer their business in such a way that it is both flexible and agile with minimal complexity compared to a startup.
The reason that I spend so much time writing about information, rather than processes or business functions, is that the modern enterprise is driven by information. So much so that metrics tracking the management and use of information are very effective predictors of issues with the development of new processes and obstacles to the delivery of new functionality. There appear to be no accurate measures of the agility of enterprise technology that focus on processes or functions alone, without information. There are, however, measures of information that can safely ignore processes and functions, because well-organized information assets enable new processes and functions to be created with ease.
The CIO who wants to ensure their organization has the capability to easily implement new functions in the future should look to measure how much information the organization can handle without introducing disproportionate complexity. The key is in structuring information assets in such a way as to ensure that complexity is compartmentalized within tightly controlled units with well-understood boundaries and properly defined interfaces to other information assets. These interfaces act as dampeners of complexity or turbulence, allowing problems or changes to be constrained and their wider impact minimized.
Creating such silos may seem to go against the conventional wisdom of having an integrated atomic layer and an enterprise approach to data. Nothing could be further from the truth: this approach simply ensures that the largest possible quantity of information is made available to all stakeholders with the minimum possible complexity.
The actual measures themselves are described in my book, Information-Driven Business, as the “small worlds data measure” for complexity and “information entropy” for the quantity. Applying the measures is surprisingly easy; the question each CIO then needs to answer is how to describe these measures in a way that engages their business stakeholders. If technology leaders hope to avoid difficult topics with their executive counterparts, this will be impossible. But if they are willing to share their inside knowledge of modern information technology, the “us and them” culture can start to be broken down.
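The book’s exact formulations aren’t reproduced in this post, but “information entropy” in this context is grounded in Shannon entropy, which is straightforward to compute for any column of data. A minimal sketch (the function name is mine, and this should not be taken as the book’s precise measure):

```python
from collections import Counter
from math import log2

def shannon_entropy(values):
    """Shannon entropy (in bits) of a column of values: H = -sum(p * log2(p)).
    Higher entropy means the column carries more information per record."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A nearly constant column carries very little information...
print(round(shannon_entropy(["AU"] * 99 + ["NZ"]), 2))   # 0.08
# ...while a high-cardinality column carries far more.
print(round(shannon_entropy(list(range(100))), 2))       # 6.64
```

Summing such per-column figures gives a rough sense of how much information an organization is actually handling, independent of the processes and functions built on top of it.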
A few months ago, I wrote a three-part series on the case against collaboration. In it, I detailed times in which working with others was ill-advised. Of course, I’m a happy collaborator when it makes sense. This is an important distinction. From my perspective, many organizations lack an effective tool to encourage effective collaboration. As a result, the old standby (email) becomes the sole and de facto collaboration tool.
And this is a big mistake. In this post, I am going to explain why.
Email and Collaboration
Email may in fact be the killer app from Enterprise 1.0. Think about how we used to communicate in professional settings. Intraoffice memos, posts on community bulletin boards, and letters represented the primary written means by which most people exchanged information. This wasn’t exactly ideal, but we really had no alternative at the time.
And then came email. It was:
…widely accepted by the business community as the first broad electronic communication medium and was the first ‘e-revolution’ in business communication. Email is very simple to understand and, like postal mail, email solves two basic problems of communication: logistics and synchronization.
In other words, email allowed us to take a giant leap forward, enabling near real-time communication. The speed and efficiency of the medium were unparalleled. Many people (myself included) probably relied on email too much.
Well, no longer is email the sole means to communicate. And it’s about time that many people and organizations came around to this fact.
Problems with Relying Exclusively on Email
There are obvious issues with email not capturing the tone of the sender. How many times has a message been misinterpreted over email? There are also more technical issues, with servers occasionally taking a while to process individual messages, leading to the often-asked question, “Didn’t you get my email?”
Perhaps my biggest pet peeve: email just is not efficient.
On a recent client engagement, the organization had no collaborative tool. I would routinely receive files as attachments, often with vague instructions about what I needed to do with them. Of course, on data management and application development projects, vagueness is just plain evil. On occasion, I would misinterpret instructions based on inaccurate assumptions. Or perhaps I would receive the message correctly and begin a major change. Half an hour later, I would receive an email with a new file, formatted in a slightly different way. In a few cases, I had to junk my previous modifications.
Contrast that with what I–and many others–have: a basic screen-sharing app. There are many, but I use GoToMyPC. I can show my client in real time how I am doing something. I can ask questions and receive immediate feedback, minimizing cost and development time. My client can ask questions and answer in turn. What’s more, I can expedite knowledge transfer and root out issues more quickly than by going back and forth via email.
Will e-mail ever die? I don’t see it happening, at least anytime soon. However, I can certainly see it being used less as different communication methods mature, evolve, and are introduced. Rather than rely exclusively on what we know best, intelligent folks and organizations will continue to push the envelope–improving collaboration in the process.
What say you?
It’s funny how old habits die hard. Consider the following:
- I can’t seem to wean some of my tech-savvy friends off email when they should just know better.
- My dad still checks his stock quotes on television.
- Many are using old versions of Microsoft Excel or Project to manage their enterprise-wide system deployments, upgrades, and other information management (IM) endeavors.
To be sure, each of these tools was powerful “back in the day” and, truth be told, they still can get the job done. Sort of. With regard to MS Project and Excel, even “older” versions contain essential functionality when it comes to tracking resources, producing reports, etc.
Old School Tools
But there’s always been something that’s frustrated me about these applications. (No, not that Microsoft makes them. I’m a big MS guy. We can discuss the reasons over beers sometime.) Rather, these tools just didn’t allow for easy collaboration, something part and parcel of the MIKE2.0 framework. Typically, one person held the “master” file (.XLS, MPP) and either kept it close to his/her vest or sent it around to team members for updates.
The following process wasn’t terribly uncommon:
- Team member sits down and manually updates tasks on massive project plans
- Team member sends said updates to some type of PM or administrator
- Invariably, team member needs to explain that updates are no longer in sync
- Repeat above process every week or so
It just seems so 1990s…
Now, Office 2010 with its web-based front end may address the linear nature of updates to each tool. To be honest, however, I’m too lazy to look it up right now–and Project isn’t part of the main Office suite. There’s no way that I’m going to be an early adopter for the 2010 version. I’m still shell-shocked over the drastic GUI change from Office 2003 to 2007.
Also, note that this post only uses a particular product (Comapping) as an example of new PM tools and a new mindset. I could just as easily have picked another tool.
Project Management 2.0
There are many new web-based apps out there to manage large projects. From what I can tell, they do a much better job than the relics of years past. I’ve kicked the tires on a few and been fairly impressed. I recently spoke with John Kyle at APE Software and he mentioned that his organization uses Comapping. I started playing around with it and created this completely incomplete “plan.”
The visual and collaborative nature of the tool just blew me away. Yes, you can share your screen with others and “co-create” a plan or make changes in real time. There are other really neat features that, from my perspective, would facilitate collaboration, communication, and effective project management. These are all admirable goals.
Tools alone don’t ensure that a project will come in under budget and at or ahead of schedule. Many things can still derail projects of all sizes, scopes, and sorts. No one’s disputing that. But doesn’t it stand to reason that a better PM tool (whichever you choose) will allow for collaboration? To me, Agile projects just don’t fit neatly into the Gantt Chart type of mentality.
Think of this in terms of a golf analogy. You give me the world’s best clubs and I’m not breaking 80. I’m just not that good. However, let’s say that:
- You show me how to swing properly
- I practice
- I have a great set of clubs
Isn’t success more likely?
What do you think?
In my last two posts, I described some of the reasons that not all projects and endeavors lend themselves to effective collaboration. Whether they deal with people or with innovation, some projects or products are best conducted with the fewest people possible–at least initially. In this post, I’ll discuss some of the cultural and geographic challenges associated with collaboration on IT projects.
I have worked on international IT projects and have personally seen some of the issues described in this post. I have also watched with great interest the rise of wikis, VOIP, instant messaging (IM), screen sharing, greatly enhanced video conferencing technologies such as Telepresence from Cisco, and other tools. Collectively, they have made collaboration easier. Never mistake easier for easy, though.
Despite the recent technological advances and improvements mentioned above, international projects still face formidable challenges. They include differences vis-à-vis:
- time zones
- languages
- cultures
Each one of these differences could be a deal breaker on a specific project. If you talk to a true expert on the subject like my friend Jason Horowitz, you’ll realize that there are simply limits to what technology can effectively overcome. Consider the following:
- While wikis may be better than long, drawn-out email chains for exchanging ideas, neither captures the energy of an in-person brainstorming session.
- Being able to stop by a colleague’s cubicle to run through a scenario isn’t the same as sending her an instant message.
- Even those who speak the same language may not be able to pick up on non-verbal cues. Consider that upwards of 90 percent of all communication is non-verbal.
- While banal, sometimes the very act of scheduling a conference call can be a struggle with workers strewn all across the globe.
I could continue, but the fact remains: there are still major hurdles to overcome on multinational, multi-party information management (IM) projects. These hurdles can become insurmountable when factoring in two additional obstacles.
Incomplete Specifications and the Limitations of Outsourcing
Those who cite the benefits of outsourcing are quick to talk about money. Yes, organizations can ostensibly save considerable funds by utilizing high-tech workers in countries such as Brazil, Russia, Vietnam, and India. In some cases, compared to workers in “high wage” countries such as the US, the savings can be upwards of 70 percent. But what about the drawbacks?
There can be many, as most seasoned IT practitioners know. But perhaps the most vexing on IT-related projects is the tendency for outsourced workers to “write to spec.” This is tech-speak for writing code, reports, or software exactly as detailed in the specification. This problem wouldn’t be as inimical if specs were bullet-proof. At least in my experience, this is rarely the case. Precious time is wasted as developers and “business folks” go round and round debating the ins and outs of individual work products. In many cases, the outsourced worker does not have sufficient “business” or product knowledge to make the logical leaps of faith required on a deficient spec. For their part, those requesting the work typically cannot communicate in a language that the techies can fully understand.
And let’s not forget the elephant in the room:
Sometimes proper specs are completely missing altogether.
I’ve worked for months at client sites on development projects without receiving a single proper spec. I’ve been able to muddle my way through, but I’d be lying if I claimed to be a mind reader. My batting average was certainly not 1.000.
For a more robust description of the issues related to outsourcing, click here.
Look, no one is saying that international projects and collaboration are mutually exclusive. They’re not. Nor am I implying that organizations should never outsource. My only point is that all of the wikis, teleconferences, and meetings in the world cannot necessarily overcome the challenges associated with different time zones, languages, and cultures. Know this going into your next IM project.
What do you think?