Archive for the ‘Business Intelligence’ Category
The 1993 documentary The War Room tells the story of the 1992 US presidential campaign from a behind-the-scenes perspective. The film shows first-hand how Bill Clinton’s campaign team responded to different crises, including allegations of marital infidelity. While a bit dated today, it’s nonetheless a fascinating look into “rapid response” politics just as technology was starting to change traditional political media.
Today, we’re starting to see organizations set up their own data war rooms for essentially the same reasons: to respond to different crises and opportunities. Information Week editor Chris Murphy writes about one such company in “Why P&G CIO Is Quadrupling Analytics Expertise”:
[Procter & Gamble CIO Filippo] Passerini is investing in analytics expertise because the model for using data to run a company is changing. The old IT model was to figure out which reports people wanted, capture the data, and deliver it to the key people weeks or days after the fact. “That model is an obsolete model,” he says.
Murphy hits the nail on the head in this article. Now, let’s delve a bit deeper into the need for a new model.
The Need for a New Model
There are at least three factors driving the need for a new information management (IM) model in many organizations. First, let’s look at IT track records. How many organizations invested heavily in the late 1990s and early 2000s in expensive, on-premise ERP, CRM, and BI applications–only to have these investments ultimately disappoint the vast majority of stakeholders? Today, on-premise isn’t the only option; Big Data and cloud computing are gaining traction in many organizations.
Next up: time to respond. Beyond the poor track record of many traditional IT investments, we live in different times relative to even ten years ago. Things happen so much faster now. Why? The usual suspects: the explosion of mobility, broadband, tablets, and social media. Ten years ago, the old, reactive, requirement-driven IM model might have made sense. Today, however, that model is becoming increasingly difficult to justify. For instance, a social media mention might cause a run on products. By the time proper requirements have been gathered, the crisis has probably worsened. An opportunity has probably been squandered.
Third, data analysis and manipulation tools have become much more user-friendly. Long gone are the days in which people needed a computer science or programming background to play with data. Of course, data modeling, data warehousing, and other heavy lifting necessitate more technical skills and backgrounds. But the business layperson, equipped with the right tools and a modicum of training, can easily investigate and drill down on issues related to employees, consumers, sales, and the like.
Against this new backdrop, which of the following makes more sense?
- IT analysts spending the next six weeks or months interacting with users and building reports?
- Skilled users creating their own reports, creating and interpreting their own analytics, and making business decisions with minimal IT involvement (aka self-service)?
Building a data war room is no elixir. You still have to employ people with the skills to manage your organization’s data–and hold people accountable for their decisions. Further, rapid response means making decisions without all of the pertinent information. If your organization crucifies those who make logical leaps of faith (but ultimately turn out to be “wrong” in their interpretation of the data), it’s unlikely that this new model will take hold.
What say you?
It’s that time of year! The holidays are finally over and the joys of housekeeping and number crunching are upon us. How did your business end up in 2011? Above or below your expectations?
For us marketing folks, January is typically the time we set aside to audit our marketing efforts from the previous year in preparation for future strategies. And the one big question that should be on everyone’s mind is: how much did those leads really cost me?
The good news is, the numbers should be pretty painless to figure out. With all the wonderful analytic and marketing intelligence provided by companies such as Google, Coremetrics, Optify, Marketo, Raven, etc., there’s no shortage of data to analyze in order to come up with that magic number that tells you, “was this effort really worth it?” And with the rise of online and social media marketing in the past few years, the cost to acquire good leads has dropped dramatically, to the point that any company, regardless of size or budget, can easily compete in the digital marketplace.
So what’s your magic number? To determine it, you first need to define what you consider a lead to be. Is it someone who fills out a form on your website? Is it someone who clicks on an email campaign? How do you know when your leads are qualified, unqualified, hot, or cold? The answers to these questions will differ for every business but should be settled before calculating the cost of lead acquisition. Once you’ve figured out how you define a lead, you’ll want to find the total number of leads you acquired and break them down by marketing channel activity.
Next, you’ll want to determine the cost of your marketing channel activities. In the world of online marketing, activities will generally fall into one of five campaign categories: Email, Social, SEO, Paid Search, or Events/Webinars. The cost of activities for each channel will generally be a combination of time plus resources.
Lastly, to determine the average cost to acquire a lead, you’ll want to divide the total cost of marketing channel activities by the total number of leads acquired from that channel. For those visual folks out there, the equation looks similar to this:
Average cost to acquire a lead = (cost of time + resources) / total leads acquired
Calculate that and tada! That is your magic number (aka the average cost of lead acquisition for that channel). If you want to know the cost of lead acquisition for all channels combined or your marketing department in general, add the total cost for each channel and divide by the total number of leads you received.
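If it helps to see that arithmetic run end to end, here’s a minimal sketch in Python. The channels, hours, hourly rate, and lead counts are all hypothetical numbers used purely for illustration.

```python
# Cost per lead, per channel and blended. All figures are made up.
channels = {
    # channel: (hours spent, out-of-pocket cost in $, leads acquired)
    "email":       (40, 500,  120),
    "social":      (60, 200,   80),
    "paid_search": (10, 3000, 150),
}

HOURLY_RATE = 50  # assumed cost of an hour of marketing time, in $

total_cost = 0
total_leads = 0
for channel, (hours, resources, leads) in channels.items():
    cost = hours * HOURLY_RATE + resources   # time + resources
    print(f"{channel}: ${cost / leads:.2f} per lead")
    total_cost += cost
    total_leads += leads

# The department-wide magic number: all channels combined
print(f"overall: ${total_cost / total_leads:.2f} per lead")
```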
Knowing how much value you derived from your marketing efforts is paramount in developing your marketing plan. It will enable you to determine which activities generate the highest return on your effort (and help you boost the bottom line). Sure, it takes some time to figure out, but hey: they don’t call it crunching because it’s easy!
“A good plan is like a road map: it shows the final destination and usually the best way to get there.”
–H. Stanley Judd
I recently had the opportunity to attend IBM’s Information On Demand Conference in Las Vegas, NV. At breakfast one day, Crysta Anderson of IBM, two attendees, and I talked about some BI implementations gone awry. In this post, I’ll discuss a few of the key problems that have plagued organizations that spend millions of dollars on BI products–and still don’t see the anticipated results.
A Familiar Scenario?
Recognize this? An organization (call it YYZ here) is mired in a sea of individual reports, overwhelmed end users, bad data, and the like. The CIO hears about a competitor successfully deploying a BI tool, generating meaningful KPIs and insights into the business. After lengthy vendor overview and procurement processes, YYZ makes the leap and rolls out its new BI toys. Two years later, YYZ has yet to see any of the expected benefits.
Many of you have seen this movie before.
So, you’re the CIO of YYZ and your own future doesn’t look terribly bright. What to do? Blame vendors if you want. Consultants also make for good whipping boys. Maybe some regulations stood in your way. And there’s always that recalcitrant department or individual employees who didn’t get with the program. But I have one simple question for you: Where did you start?
As we discussed at that table at the IOD conference, many organizations get off on the wrong foot. They begin their BI implementations from the bottom up. In other words, they take a look at the myriad reports, scripts, queries, and standalone databases used–and not used–and then attempt to recreate all or most of them in the new BI tool. Thousands of hours, hundreds of thousands of dollars, and months of employees’ time are spent replicating the old in the new.
The Wrong Questions
There are two major problems with the bottom-up approach. The first and most obvious: not all reports are needed. The core–and faulty–assumption is that we will need tomorrow what we needed yesterday. As a consultant, I have personally seen many people request reports in the “new” system that they would never run. When I asked about the purpose of a report or its ultimate destination, I would far too often hear crickets.
Beyond superfluous reports, there’s a deeper problem with bottom-up BI: it runs counter to the whole notion of BI in the first place.
Let me explain. Cubes, OLAP, data mining, and other sophisticated analytical techniques and tools all start at the top. That is, the conversation typically begins with high-level questions such as:
- Who are our best customers?
- Where are our profits coming from?
- Which products are selling best in each area?
- Which salespeople seem to be struggling?
An organization cannot start a BI implementation in the weeds and expect to be successful. It needs to start with the right questions, define its terms, and clean up its data. Only then can it begin to see the fruits from the BI tree.
Fight the urge to “just do it” when deploying BI tools. Failing to reassess core business processes, KPIs and metrics, and data quality issues is a recipe for disaster. Consultants and vendors who believe that you can just figure these things out mid-stream should be ignored.
What say you?
I was watching Bloomberg West the other day when John Battelle appeared on my screen. For those of you who don’t know, Battelle wears a number of impressive hats. When not writing, he chairs Federated Media Publishing. He is also a visiting professor of journalism at the University of California, Berkeley. In short, he knows what he’s talking about.
Battelle was discussing the evolution of all things technology and, in particular, the movement away from the PC to mobile devices. He also mentioned something called The Data Frame, the theme of the forthcoming Web 2.0 Summit. Battelle explains what he means:
For 2011, our theme is “The Data Frame” – focusing on the impact of data in today’s networked economy. We live in a world clothed in data, and as we interact with it, we create more – data is not only the web’s core resource, it is at once both renewable and boundless. [Emphasis mine.]
Consumers now create and consume extraordinary amounts of data. Hundreds of millions of mobile phones weave infinite tapestries of data, in real time. Each purchase, search, status update, and check-in layers our world with more of it. How our industries respond to this opportunity will define not only success and failure in the networked economy, but also the future texture of our culture. And as we’re already seeing, these interactions raise complicated questions of consumer privacy, corporate trust, and our governments’ approach to balancing the two.
Is Battelle ultimately right? I tend to think so. But, beyond that, as I listened to Battelle and researched his notion of The Data Frame, one thing struck me:
Most organizations are under- or unprepared for it.
Now, there are two parts to this fundamental lack of preparation, both of which I’ll discuss in this post.
Particularly in large, conservative organizations, far too many people don’t think of things in terms of data and information–and this is the most significant problem. Decision makers fail to realize that everything is data. Decisions are often made by gut feel, despite the fact that decision analysis tools have existed for years to assist people in making superior choices. How many people do you know with access to sophisticated BI applications who continue to rely upon Microsoft Excel?
Lamentably, many organizations have yet to get their arms around the web and its implications. Legacy systems still abound and rare is the organization that has completely embraced Enterprise 2.0 and its components, including–and arguably most important–cloud computing.
The bottom line is that not enough people think in terms of data, a limitation that invariably influences the choice of which technologies are–and are not–deployed within organizations. While many in old-school enterprises debate what to do and how to do it, the chasm between them and companies that do get it (read: Amazon, Apple, Facebook, and Google) widens. The latter companies are so valuable and admired today because they are building and deploying sticky, integrated, and data-gathering planks and platforms. They’re not just trying to “get” the web. They did that a long time ago.
The Explosion of Mobility
The web has been here in full force for nearly two decades, but enterprise mobility is a much more recent arrival. While a few companies have experimented with internal Apple-like App Stores, these are the exceptions that prove the rule. For this reason, consumers and consumer-based companies–not enterprise IT departments–are leading the current technology revolution. This is in stark contrast to what I call Enterprise 1.0 in The Next Wave of Technologies. In the 1990s, people walked into the office to use the most powerful technology available to them. These days, however, the opposite is often true: many people have more powerful devices on their hips than on their desktops.
Once again, it’s all about the people. It is incumbent upon the powers-that-be to fundamentally alter their mindsets. Data need not be an “icky” problem to manage. On the contrary, it represents myriad opportunities to recognize and harvest. Once change- and risk-averse executives realize this, they can implement the apps, data models, and the like necessary to survive in our dynamic world.
What say you?
I know a guy who works for a large retail organization (call it ABC here) in an information management capacity. Let’s call him James in this post. Over the last 12 years, James has ascended to a position of relative prominence at ABC. (He’s one of maybe ten assistant vice presidents.) He manages five people directly and is responsible for a number of contractors overseas. He can still build cubes, roll up his sleeves to solve vexing data-oriented issues, and talk the talk.
Beyond technical skills, James plays by his company’s rules and never rocks the boat. He listens very intently when his internal clients talk to him about their needs, frustrations, and suggestions. James is incredibly diplomatic and has rarely offended anyone, even during organizational crises. The word diligent is entirely appropriate to describe him. He’s a real asset to his company but he has to make a pretty big adjustment if he wants to make it to the next level.
Take a guess. I’ll wait.
James is too neutral.
The Problem with Neutrality
In any large organization, one is unlikely to be successful–or remain employed, for that matter–by being a complete maverick. (Small companies are often different, but let’s focus on larger ones in this post.) Big company management is rooted in a military model in which the following are valued:
- Following orders
- Playing by the rules
- Not questioning authority
From an information management perspective, this means writing reports requested by users, fixing data issues, gathering requirements, and the like. Note that none of these activities remotely resembles leadership.
There’s obviously a larger point here about workplace rights and behavior. I can’t speak about other countries, but at least in the United States, the Constitution does not exist in the workplace–subject to a few exceptions surrounding discrimination, whistle-blowing, and the like. For instance, the government grants you the right to say just about whatever you want, but your manager can fire you for doing so. Brass tacks: speak your mind if you like, just be ready to pay the price.
But rank has always had its privileges. The standard is not the same across the board, and everyone is not equal. An entry-level AP clerk finds himself on a tighter leash than a CXO. Mid-level managers aren’t expected to set organizational directions and strategies. As you move up the corporate ladder, it is very difficult–if not impossible–to be neutral if you want to keep ascending. You’re going to have to make some tough judgment calls, even and especially when the data tell you different and even conflicting things.
For instance, let’s say that ABC is conflicted about how to build a database. Some people are reluctant to depart from traditional methods; they want to stay the course with row-based databases. Then there are others willing to embrace the unknown–i.e., move to a columnar database.
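To make the stakes of that choice concrete, here’s a toy illustration in Python (not a real database, just the two layouts side by side). The records and fields are hypothetical.

```python
# The same three records stored row-wise and column-wise.
rows = [
    {"id": 1, "region": "east", "sales": 100},
    {"id": 2, "region": "west", "sales": 250},
    {"id": 3, "region": "east", "sales": 175},
]
columns = {
    "id":     [1, 2, 3],
    "region": ["east", "west", "east"],
    "sales":  [100, 250, 175],
}

# An analytic query such as "total sales" touches every record in the
# row-wise layout but only one contiguous list in the columnar one,
# which is roughly why columnar stores shine for reporting workloads.
print(sum(r["sales"] for r in rows))  # 525
print(sum(columns["sales"]))          # 525
```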
Consider the following:
- Is this a big decision? Yes.
- Is this bell hard to unring? Absolutely.
- Should James carefully consider each viewpoint? Of course.
- Should James ultimately have an opinion and a recommendation?
How can he not?
That’s not to say that everyone ought to disagree with everyone else just to flex muscle. Nor do I claim that the ability to reach compromise should be minimized. On the contrary, it’s incredibly valuable to see both sides of an issue and broker a truce.
But there’s only so much you can do when you’re neutral.
What say you?
Many organizations continue to struggle implementing BI tools. We all know the usual suspects:
- bad data
- the tendency of organizations to over-customize, causing problems down the road
- the difficulty of getting buy-in on standard definitions
- other IT priorities
- vendor challenges
- end user unwillingness to let go of old standbys
As a result, relatively few executives and decision makers in large organizations have access to simple yet effective dashboards. Lacking accurate and timely data–and a means to comprehend it–they often make suboptimal decisions. Call these errors of commission. What’s more, they’ll often fail to make any decision when they should. Call these errors of omission.
Ironically, small business owners like me can quickly visualize their data and drill down as needed.
Today, it doesn’t take a great deal of time or money to be able to interact with and visualize your data. Many sites provide dashboards by default: Google (via Analytics), Kickstarter, WordPress, etc. Being small can be a major advantage here, especially since pre-built dashboards come with many products and web-based services these days.
Going the Pre-Built Route
There’s no panacea for the aforementioned data and organizational problems. However, today businesses of all types are taking a shortcut by purchasing and deploying Pre-Built Analytic Applications (PAAs). PAAs can allow end users to better understand their data. In theory, they can then make superior decisions.
PAAs represent a legitimate alternative to custom-built BI tools. These ready-made, off-the-shelf applications come pre-loaded with industry-specific metrics and definitions. Once linked to an organization’s systems, PAAs can provide meaningful insight into micro and macro trends.
PAAs are increasingly being deployed in industries such as banking, insurance, health care, and retail, across a number of different lines of business. Implementations have run the gamut, including customer service, human resources, finance, and sales.
Let’s look at HR for a moment. PAAs exist that allow organizations to do the following:
- Quickly identify their best employees–as well as those who are not meeting expectations.
- Track progress of employees as they are trained and advanced within the organization.
- Allow managers to monitor how employees are performing against a vast array of key metrics.
- Spot disturbing trends in absenteeism, sickness, and employee turnover.
- Identify emerging payroll-related issues, such as excessive overtime in a department, store, or division.
This isn’t rocket science. It’s easier to spot outliers on a graph or chart than on a plain-text report or a spreadsheet.
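Even before you chart anything, simple arithmetic can surface the same outliers a dashboard would highlight. Here’s a minimal sketch of the overtime example; the departments, hours, and the crude statistical threshold are all hypothetical, and a real PAA would do far more.

```python
import statistics

# Hypothetical weekly overtime hours by department
overtime = {
    "bakery": 12, "deli": 9, "produce": 11,
    "checkout": 10, "warehouse": 41,  # the outlier a chart makes obvious
}

mean = statistics.mean(overtime.values())
stdev = statistics.stdev(overtime.values())

# Crude rule of thumb: flag anything well above the average
for dept, hours in overtime.items():
    if hours > mean + 1.5 * stdev:
        print(f"flag {dept}: {hours}h overtime vs. an average of {mean:.1f}h")
```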
Simon Says: Visualization and Interactivity are Key
It doesn’t matter if your organization uses a PAA, a best-of-breed and customized BI tool such as IBM Cognos, or an open source solution such as Pentaho. Make sure that your users have dashboard-like functionality.
The ability to visualize data is becoming increasingly useful and important. After all, how realistic is it to expect your employees to make accurate decisions from a morass of spreadsheets, databases, and reports–especially when those items are static and non-interactive? Even free and low-cost tools such as Tableau can provide enormously valuable insights into your company’s data. Don’t believe me? Check out this interactive dashboard for bankers.
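If you doubt how low the barrier has become, consider how little code it takes to turn raw numbers into a chart. This minimal sketch uses Python’s matplotlib as a stand-in for whatever charting tool you prefer; the data are made up.

```python
import matplotlib.pyplot as plt

# Hypothetical monthly leads by channel
channels = ["email", "social", "SEO", "paid search"]
leads = [120, 80, 200, 150]

plt.bar(channels, leads)
plt.title("Leads by channel")
plt.ylabel("Leads")
plt.show()
```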
It doesn’t make sense that small business owners like me can more easily visualize and interact with their data than organizations with hundreds of times as many resources. Look at different ways to change this.
What say you?
It happens all too often: a sales and marketing team gathers for its weekly status meeting, and each person is armed with the same report showing different results. A debate ensues as to whose report has the accurate numbers. Attention is diverted, time is wasted, and productivity is lost.
How strongly can you trust your information intelligence? What should you do when your system becomes more of a liability than an asset?
There is one basic truth I know about information systems: when it comes to number crunching, the reports you build are only as intelligent as the data you input and the filters you create to organize it. When installing a CRM system, I try to do the following at a minimum (a rough sketch in code follows the list):
1) Designate a system owner and set user permission levels for accessing and editing data.
2) Determine what intelligence needs to be reported to management.
3) Determine which data fields are needed to produce that intelligence.
4) Create your data fields in the proper, reportable format (e.g., if it’s a number, create a numeric field, not a text field).
5) Define your report filters based on the fields needed to report the intelligence.
6) Build standard report templates that everyone has access to but cannot change without consent from the system owner.
7) Communicate to staff using the system the importance of entering good quality data. Set required fields as needed.
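Here’s the promised sketch, covering steps 3 through 6 in miniature. It uses Python’s built-in sqlite3 as a stand-in for a CRM’s data layer; the field names, records, and the pipeline-by-source report are all hypothetical.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE leads (
        id        INTEGER PRIMARY KEY,
        source    TEXT NOT NULL,   -- required field (step 7)
        deal_size REAL NOT NULL,   -- numeric, not text (step 4)
        created   TEXT NOT NULL    -- ISO date, e.g. '2012-01-05'
    )
""")
db.executemany(
    "INSERT INTO leads (source, deal_size, created) VALUES (?, ?, ?)",
    [("web form", 5000.0, "2012-01-05"),
     ("email",    1200.0, "2012-01-09"),
     ("web form",  800.0, "2012-01-11")],
)

# The standard report everyone runs but no one edits (step 6)
for source, total in db.execute(
        "SELECT source, SUM(deal_size) FROM leads GROUP BY source"):
    print(source, total)
```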
What are your thoughts? How do you ensure your information intelligence is trustworthy?
I’m working on a project that involves using various marketing metrics as a baseline to structure and fund future team projects. I’ve gathered my baseline marketing data from Google Analytics, determined my target percentiles, factored in engagement metrics, and structured an incentive plan based on time and resources that I’m certain will work for both the client and the project team. I feel confident this is a sound and measurable methodology on which to base our project and incentive goals. But is it?
This same scenario happens all too often in project planning. We over-think, over-quantify, and over-organize our plans and forget to factor in the most important component: flexibility. Think about it: if a team received a project plan in which success was based solely on mathematical measurements such as “hits received per day” or “comments generated per week,” its members would focus only on the numbers and not on the larger picture. Where does creativity get calculated? Or innovation? What if, halfway through the project, a team member finds a better metric but, because it’s not included in the initial project plan, chooses not to measure it? Likewise, if the project plan is so specific that it holds only certain people accountable for certain goals, how likely is it that our team members will be willing to work together rather than individually? Are we eliminating the opportunity to reap the benefits of collaboration and brainstorming? Are we creating silos in an organization operating in an industry that has just managed to knock them down?
As an analytical person by nature, I love data. I love numbers. They help me gauge most of my successes at work and at home. But the more experienced I become in the workplace, the more I realize that numbers are not the “golden ticket” to measuring all results, especially when teams are involved. It can’t just be about hitting predetermined goals, because chances are you’ve forgotten a variable or your baseline has changed. More importantly, despite the fact that we live in an increasingly connected and networked world, people will work themselves back into a cubicle if you create a plan that allows them to. Factoring in time for teamwork, flexibility, creativity, and collaboration may affect your timeframe, but it will also improve a team’s ability to create innovative products and services. And at the end of the day, those are the ones that sell.
My friend Robert Hilliard on this site recently wrote about the oft-discussed issue of information management. He writes:
I argue that although the creation of new data in absolute terms (as opposed to the retention of existing data) means the innovation is genuinely new, it does not become disruptive to existing business unless it actually enhances the connections to current data. Creating new data on its own doesn’t add much value to an existing business, but creating more links definitely does.
Creating new data–or cuts of existing data–is often of questionable value. I always ask myself: Is this truly needed given what the organization already has? Of course, therein lies the problem.
Consider the following questions for many large organizations:
- How many reports merely reflect essentially the same data as other, existing but unknown ones?
- How many people or departments inside a big organization insist upon their own BI tools because they don’t play well with other people or departments?
- Does any organization have an accurate and comprehensive list of all the tools in use throughout the enterprise?
Color me pessimistic, but at large companies I’d argue that the answer to each of these questions is typically “We don’t know.” What’s more, at these organizations, there’s rarely a great deal of data governance. As a result, a state of anarchy causes myriad problems–typically leading to requests for new “stuff” when the old stuff will suffice.
The Sort-Of Simple Solution
Is there a simple solution to this problem? Not really, but let me tell you about a radical method that several CIOs have used to determine exactly which reports, BI tools, and data sources are needed: Turning everything off.
Let me explain.
A CIO sends an email to all managers explaining that there are too many applications, reports, data sources, etc. Perhaps this email goes out in February and states that, unless s/he hears otherwise, all “stuff” will be turned off at the end of the year. Perhaps several other reminders are sent as well, and the topic is posted internally and referenced at meetings.
No one is messing with payroll applications. Major ERP and CRM applications need to remain active, but everything else is fair game. Also note that silence equals consent: non-responses are tacit approvals. Finally, even in sacred cows such as ERP and CRM, there typically are reports that are simply never used anymore.
After the Email
Diligent managers will respond immediately. They will list the reports, applications, and data sources that cannot be deactivated. As is often the case, however, many of the emails and announcements will be ignored. At the end of the year, all previously ignored “stuff” is turned off. End of story. Systems are retired. Reports no longer need to be generated because, obviously, no one was using them.
This is obviously a way to remove superfluous reports. Think Office Space and TPS reports.
When CIOs have boldly taken this step, the response has largely been silence. Then, several months or years post-deactivation, a few stragglers have come back with requests to resurrect those now-dormant legacy items.
Is this risky? Sure. Is it arguably a good move? Yes. Everyone tends to overestimate what they need for fear of budget and headcount cuts, especially at large organizations.
What say you?
A long-time IT professional friend of mine and I recently had a very interesting discussion about the merits of different technologies. He works in the business intelligence (BI) space for a major financial institution and, as part of his job, has to oversee the implementation of different BI tools for his organization. In this post, I want to relay the crux of that conversation because it relates to organizations’ IT procurement priorities.
For any company in any economy, resources are scarce. There are limits to what even cash-laden organizations such as Microsoft and Google can do. So, when faced with decisions related to data management, which set of tools makes the most sense for an organization?
Organizational Data Maturity Model
In his excellent book The Data Asset, Tony Fisher lays out a four-stage model of organizational data maturity: undisciplined, reactive, proactive, and governed.
The model underpins his book and, in essence, can be summarized as follows: organizations cannot leapfrog stages. Undisciplined organizations need to become reactive before progressing to the proactive and governed states.
So, returning to the conversation with my friend, is data quality (DQ) or BI more important to an organization? The answer, of course, is that it depends on the state in which the organization finds itself.
For example, consider Company X, an undisciplined organization barely able to keep the lights on. Data management is nonexistent, and duplicate, incomplete, or inaccurate records make even basic reporting and operations difficult–if not impossible. Sure, a BI tool may yield insights into employee, customer, and/or vendor behavior. But will those insights be accurate? Doubtful. In this case, a DQ tool makes a great deal more sense. Clean up the data and begin to instill a culture of data governance before getting all fancy with OLAP cubes.
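As a rough illustration of what “clean up the data” means in practice, here’s a minimal sketch of the kinds of checks a DQ tool automates: flagging duplicate and incomplete customer records. The records and the matching rule (exact email match) are hypothetical; real tools use far more sophisticated matching.

```python
customers = [
    {"id": 1, "email": "pat@example.com", "name": "Pat"},
    {"id": 2, "email": "pat@example.com", "name": "Patricia"},  # duplicate?
    {"id": 3, "email": "",                "name": "Lee"},       # incomplete
]

seen = {}  # email -> id of the first record with that email
for c in customers:
    if not c["email"]:
        print(f"record {c['id']}: missing email")
    elif c["email"] in seen:
        print(f"record {c['id']}: possible duplicate of record {seen[c['email']]}")
    else:
        seen[c["email"]] = c["id"]
```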
Now, let’s look at Company Y, a proactive organization with very few data-oriented issues. While not perfect, master records are maintained quite judiciously and, unlike Company X, basic reporting is a breeze. Few people internally ever doubt the accuracy of reports and analyses. As such, it is ready for a BI tool. To be sure, a DQ tool couldn’t hurt, but Company Y is ready to take the next step.
Don’t look at the merits of BI and DQ tools in isolation. The state of your organization’s data management is an incredibly important factor in making the decision about which tools to deploy–and when. Further, resist the temptation (often driven by senior executives) to go with “quick fix” sexy BI tools when your organization isn’t ready. Yes, they are capable of producing interactive charts, reports, and dashboards. But that’s not the right question. Rather, are all of these potentially useful given the state of your organization and its data? Remember that better access to–and charts containing–bad data aren’t as useful as limited access to pure data.
DQ tools may not have the sizzle of their BI counterparts. However, if they are purchased, implemented, and utilized effectively throughout the organization, they will make BI tools that much more potent when you get there. The best organizations have a method for buying and implementing different tools.
What say you?