Open Framework, Information Management Strategy & Collaborative Governance | Data & Social Methodology - MIKE2.0 Methodology

Archive for the ‘Enterprise2.0’ Category

by: Robert.hillard
24 Feb 2018

White collar productivity

Have you ever faced transposing a row in a spreadsheet to a column, or tried to make a repeated change to values, and wondered how to do it through menu options or functions, only to give up after wasting what feels like an eternity and do it manually, one cell at a time? You’ve fallen victim to the productivity skills gap that has emerged as our offices have become more digital.
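For those comfortable with a scripting language, the transposition in this example is a one-liner. A minimal Python sketch, with sample values invented purely for illustration:

```python
# Transposing a row of spreadsheet values into a column is a one-liner
# with Python's built-in zip; no hunting through menus required.
rows = [["Jan", "Feb", "Mar"], [100, 120, 90]]  # invented sample data

# zip(*rows) pairs up the i-th element of every row, i.e. a transpose.
columns = [list(col) for col in zip(*rows)]
print(columns)  # [['Jan', 100], ['Feb', 120], ['Mar', 90]]

# A "repeated change to values" is one comprehension rather than
# editing cell by cell, e.g. a 10% uplift on every numeric cell:
uplifted = [[round(c * 1.1, 2) if isinstance(c, (int, float)) else c
             for c in r] for r in rows]
print(uplifted)  # [['Jan', 'Feb', 'Mar'], [110.0, 132.0, 99.0]]
```

The point isn’t that everyone should script their spreadsheets, but that knowing such a capability exists, in the spreadsheet itself or alongside it, is exactly the skill gap the paragraph above describes.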

Any preparation for the future of work needs to tackle this skills gap. We spend a lot of time training blue collar workers to do individual tasks in the safest and most productive way. The same is not true of white collar workers, who are paired with increasingly sophisticated digital tools.

There is a counterargument that such workers should have “freedom within the frame”. While there are many great examples where workforces have been given freedom to achieve an outcome within certain boundaries, individual workers have always had the specific skills they need to contribute to the overall goal. Hopefully professionals will continue to have a high degree of freedom to achieve outcomes in a way that works for them, but that doesn’t absolve employers and educators of providing the technical skills they need to be successful in a digital workplace.

It’s very likely that artificial intelligence will help surface automation opportunities in the future. We’re already seeing tips pop up based on the tasks we’re trying to do. However, at least in the medium term, this advice helps with tactical task management, not the big picture of applying digital tools to be a highly productive professional.

The pressure of immediate deadlines means we often don’t do enough to find the most efficient way to automate or streamline the task at hand. We often fear wasting time looking for a digital capability that simply may not be there, leaving us with a greater loss of productivity than if we’d just got on and done the job in the same laborious way as we always have.

One of the biggest changes in our digital-enabled office is the huge variety of ways that individuals undertake the same task. While this can be empowering for some workers, it has created an even greater difference in productivity between the best and the worst performing white collar workers than ever before.

In years gone by there were limited ways to perform most administration in our professional lives. Today, the options have exploded and there is almost no standard. Email etiquette is in flux, with some people having thousands of unread emails and no chance of ever catching up. Others, however, seem able to keep on top of their workload with a few clever shortcuts.

My colleagues at Deloitte argue that three factors contribute to an executive’s success: time, talent and relationships. As our world becomes more electronic, we are becoming slaves to our electronic devices. It is amazing how many executives fail to take control of their diaries (time), productively use available technology to manage their teams (talent) or leverage digital channels to streamline their peer interactions (relationships).

An investment in office tools, and how they’re applied, can make a huge difference. Sadly, techniques for using software such as Microsoft Outlook are seldom taught and are often regarded as too trivial for senior professionals. To illustrate, most of us have tools that can block out time to protect against distraction, raise alerts when time-wasting meetings are added far into the future, and delegate tasks.

Deciding how much, and when, to streamline or even automate a task shouldn’t be hard. The candidate activity might be the spreadsheet you produce twice a year, a regular email to which you just reply “approved” or a document that requires input from many team members. Professionals don’t need to know every feature or capability in the tools they use; rather, they should have an understanding of what is likely to be possible and access to help in finding those features when needed.

The tendency is to under-invest in digital office skills. To make the most of the future of work, we should take this on in small, manageable chunks. A simple goal is for each staff member to learn one new skill each week to add to their personal productivity.

There is the potential to quickly eliminate thousands of unread emails from inboxes, reduce the stress of information overload and save wasted effort by senior staff on menial activities. Business will see a measurable productivity benefit after just a few months.

Category: Enterprise2.0

by: Robert.hillard
28 Jan 2018

Balance of power

The digital economy is transforming every corner of our lives. The changes in the businesses we interact with and the way many of us are employed mean subtle but important shifts in power. Patronage, social license and convention that served prior generations well no longer stand up in the brave new world we are now navigating.

It was Robin Morgan who said that “information is power” when arguing for a rebalancing of power between the genders. She went on to argue that “The secreting or hoarding of knowledge or information may be an act of tyranny camouflaged as humility”. Information and power are not just an issue between genders but also between the parties involved in employment, marketing, sales and services.

Last month I talked about the importance of trust enabled through new channels (see Trust in the digital economy). Although the new platforms that are connecting service and product providers with new customers are brokering trust, they are not seeking to balance power. Whether it is ride sharing, brokering odd jobs or acting as a product marketplace, the information and power is being centralised by the platform owner.

I’ve also written previously about the creation of new jobs in the digital economy (see More but not better jobs). While there are more jobs being created than destroyed, many of the new jobs are casual or “gig” piecework. Guy Standing, a British economist, coined the term “precariat” (combining “precarious” with “proletariat”) to describe an emerging working poor who are relying on insecure work through these platforms.

We are still in the first generation of the digital economy. It is a generally accepted phenomenon that platform businesses are “winner take all”. Although this is seldom entirely true, it has been the case that social media, ride sharing and television streaming have been dominated by a small number of businesses.

The reason that platform businesses tend to dominate is the information they accumulate. The owner of the information has disproportionate power, including the ability to set prices for the providers (often the workers) and to keep out competitors who lack the same market information. While power often begets power, it also encourages those who lack power to find a way to overthrow perceived inequality.

One of the models with the potential to shake up existing platforms is the emergence of cooperative platforms, where the providers group together to also own the platform.

During the nineteenth and twentieth centuries, worker and agricultural cooperatives grew around the world in response to power imbalances or vacuums, where businesses either displayed unfair practices or were unprepared to support a sector.

We are starting to see cooperatives rise again in response to concerns about the practices of commercial platforms by workers in the gig economy. Recent examples include cooperatives of taxi drivers (e.g., Denver’s Green Taxi Cooperative), photographers (e.g., Stocksy United) and home services (e.g., Up & Go).

Trade unions have found it particularly difficult to adapt to the gig economy and support those workers dependent on new platforms. A significant part of the problem is that unions traditionally divvy up the market by particular skills, qualifications or sectors. This model worked when education preceded a lifelong career. Technology is moving the economy to one of constant change, with careers requiring ongoing education and a wide range of different specialisations over most people’s working lives.

Any model representing these workers, who are becoming more important every year, will need to recognise the individual rather than the particular skill they exhibit at a point in time. Perhaps the union movement of the future will become a platform enabling those seeking work to combine with those providing education and employers needing particular skills.

Regardless of how skills are gained, the idea of “set and forget” education for life doesn’t fit with the new era of employment. Education needs to find a new continuous learning revenue model. Like other businesses disrupted by digital trends, the power held by educational institutions was, in large part, due to a monopoly on information and knowledge which has now been democratised (see Universities disrupted but not displaced).

In a world where workers are constantly adjusting their skills through continuous education, a better approach might be for worker organisations to combine with employers to organise and pay for that education. Perhaps the platforms of the future will facilitate and fund the constant development of skills rather than rely on the tactical exploitation of skills earned elsewhere.

Information and power have moved from old structures to new in our day-to-day work. As a result, completely new categories of work have been created. The changes in the power balance, though, are far from settled. History tells us they will be; the question is whether the solutions will be reinventions from the industrial revolution or something entirely new.

Category: Enterprise2.0

by: Robert.hillard
22 Dec 2017

Trust in the digital economy

As the year wraps up, it is the season when we feel goodwill towards all. I am reminded that our economy works because of the trust that we each have in each other. The information economy has enabled us to interact with a wider range of parties than ever before and, more importantly, develop a level of trust in those relationships.

In the past, companies offered a huge range of services because it was their brand that provided trust. Today, specialisation is the name of the game with the internet offering research and reviews for potential customers. New platforms are allowing individuals to directly engage others for car rides, accommodation and ad-hoc tasks. These are only possible because identity and assurance of quality are offered through the power of the crowd.

Arguably, our individual reputation is more important today than ever before. It is harder for an unscrupulous business operator to move to a new town and have their history disappear. Even without a perfect identity system, companies and individuals are part of social networks and subject to reviews online which are hard to purge.

That isn’t to say that there aren’t bad operators taking naïve consumers for a ride. However, it is arguably harder than ever to maintain a false façade in the era of social media. Fake identities, where there are a substantial number of social connections, tend to fall apart quickly. See Login with social media.

When people trust each other, there is the ability to navigate inevitable disagreements. Contracts describe the business relationship between two parties. From time-to-time, interpretation of a contract is open to question and we use courts to find a solution. The investment required in a legal process, both in cost and time, means that it is usually a last resort.

At the same time as technology is providing a means of better identifying parties who want to do business, and providing a basis for trust, an alternative where no-one needs to trust anyone is emerging. So-called “trust no-one” business using blockchain is exciting entrepreneurs and innovators the world over. Of particular interest are forms of smart contracts which are being used by cryptocurrencies to provide rules-based settlement of agreements.

Smart contracts use code to define every rule and exclude the opportunity to escalate to a third party given that the funds are locked through cryptography. The great thing about computer code is that it removes ambiguity. The trouble with computer code is that there is always the opportunity for a bug to emerge and, with no room for human intervention, there is little ability to escalate to solve the problem.
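To make the point concrete, here is a deliberately toy escrow in Python. It is not any real smart-contract platform’s API, just an illustration of the idea: every outcome is decided by the coded rules, and there is intentionally no branch for human escalation, so any bug in those rules is final.

```python
# A toy, rules-only escrow: all outcomes are decided by code alone.
# An illustration of the concept, not a real smart-contract API.

class ToyEscrow:
    def __init__(self, amount, release_deadline):
        self.amount = amount                    # funds "locked" in the contract
        self.release_deadline = release_deadline
        self.settled = False

    def settle(self, delivery_confirmed, now):
        """Pay the seller if delivery is confirmed before the deadline,
        otherwise refund the buyer. There is deliberately no third
        branch: no court, no administrator, no appeal."""
        if self.settled:
            raise RuntimeError("contract already settled")
        self.settled = True
        if delivery_confirmed and now <= self.release_deadline:
            return ("seller", self.amount)
        return ("buyer", self.amount)

escrow = ToyEscrow(amount=100, release_deadline=10)
print(escrow.settle(delivery_confirmed=True, now=5))  # ('seller', 100)
```

If `settle` contained a logic bug, the funds would still move exactly as the bug dictates; that absence of an escape hatch is the trade-off the paragraph above describes.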

We’ve seen this in action through an incident involving Ether, one of the cryptocurrencies: ‘$300m in cryptocurrency’ accidentally lost forever due to bug. In this case the development community had to consider whether to wind everything back, effectively reversing time. Given the small number of transactions actually happening using smart contracts and cryptocurrencies, these issues represent a material proportion of the activity.

Perhaps the choice is like a safe. Do we lock our assets away in such a way that only we hold the key, with no recourse if something happens to that key, or are we willing to trust at least a few institutions such as governments or banks? Stories abound of lost keys and the heartbreak that follows (for example, see Don’t tell my wife).

Although it seems initially appealing to hand over the role of courts, banks and other institutions to technology, the downside of removing human judgement and failsafe measures is larger than the overhead those institutions add to our economy. Ultimately, doing away with trust means that all business is semi-anonymous and becomes transactional. Without human interaction, the temptation to cheat grows, probably at least as fast as the smart contract technology closes each loophole and corrects each bug.

It is ironic that with all the concern over trust in currency and contracts, our major business tool remains email. Our email largely relies on trust given that few verify the identity of the email sender. This is the ultimate proof that trust works in business given identity could be (and really should be) easily solved by the addition of digital signatures.

Rather than give up on millennia of business enabled through trust, we should embrace the Russian proverb (made famous by Ronald Reagan): “trust, but verify”. By the smart use of social media, identity and exciting business platforms we can enable an enormous volume of business to be done with little overhead. The exceptions should then be escalated to institutions such as courts run by humans not computer code.

Category: Enterprise2.0

by: Robert.hillard
27 Nov 2017

Rethinking failure

Imagine an organisation with ten job levels (ranked from entry to CEO). In the first year, there are 512 people at the lowest level who are all assigned to projects, half of which succeed and the other half fail within 18 months. The successful 256 are promoted to the next level and again assigned to projects with a 50/50 chance of success. The 128 whose projects succeed are again promoted. Following this pattern, by the time of the ninth promotion just two people remain from the original group of 512. From these two, a final pair of projects and, assuming 18 months per project, after 13½ years one successful candidate is ready to be CEO!
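The halving in this thought experiment can be checked in a few lines of Python:

```python
# Survivor count at each promotion round: half of each cohort's
# projects succeed, and only the successes are promoted.
cohort = 512
survivors = [cohort]
while cohort > 1:
    cohort //= 2          # half the projects fail; the rest promote
    survivors.append(cohort)

print(survivors)           # [512, 256, 128, 64, 32, 16, 8, 4, 2, 1]
print(len(survivors) - 1)  # 9 project rounds from entry to CEO
```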

Obviously, this mythical example is simplistic and no leader is chosen exclusively this way. But, there is no doubt that the delivery heroes who have a faultless history of success are at the front of the queue.

With so many rewards being given to those who can demonstrate a long sequence of successes, people have learned to make sure there are no black marks of failure against their name. Yet, we know that learning through a mixture of success and failure is the best way of giving future leaders the tools they need to deal with their most challenging roles.

Everyone wants to be recognised as being innovative and most people talk about the importance of failure in the innovation process. Learning, whether individual or organisational, comes from trial and error, it is an intrinsic part of being human. That’s why we do things over and over again at school until the sum of the failures sets us up for success.

Knowing that so many organisations talk about a culture of failing quickly, it is always interesting to ask a room of professionals if they have ever been rewarded for failing. Almost no hands ever go up. Rewards and actions go together, and the result is the avoidance of assignments that risk failure. The downsides of organisations being motivated in this way include inflated budgets, limited ambition and a lack of hard measures of success.

The impact on budgets comes from leaders choosing to pad their plans to make sure they under promise and over deliver. Without the tension of budget pressure, inevitably costs are higher than they need to be. That’s fine if what you’re doing is mission critical, such as building a spacecraft or life support system, but otherwise it doesn’t make good economic sense.

Similarly, in an environment that punishes failure, an ambitious scope is to be avoided at all costs. Certainly, it would be career limiting to take on a “moonshot”. Most organisations should avoid too many moonshots, but you want at least some of your teams aiming to reach for the stars and at least get to the Moon.

Finally, the best way to avoid missing a target is to make it soft rather than something measurable. The great thing about soft measures is that they are usually open to reinterpretation after the event. I believe that anything can be measured if you try hard enough. But, if you will be punished for these measures it isn’t surprising many choose to avoid any documented commitments.

The most ambitious of targets are best thought of as an experiment. By definition, experiments define a default hypothesis and an alternative. Failing to prove the alternative is an outcome in its own right. By not celebrating those projects (or experiments) that fail to prove the alternative, organisations keep on trialling the same idea. Institutional memory should not be the only way to recall that something has been tried before.

Search for “startup failure” online and you’ll see a myriad of events that bring speakers together to share what went wrong. Entrepreneurs are learning that there is often more to learn from the mistakes of others than from those that have been a success.

Search for “project failure” on any large enterprise intranet and you’ll be very unlikely to find confessions and learnings from any of the organisation’s leading executives. Why is it that entrepreneurs know the importance of learning from past mistakes while executives don’t see this as a priority?

A healthy organisation will seek a spread of risk and return across their portfolio of activities and projects. The greater the risk, the higher the return that should be expected.

For projects, such as technology implementations or engineering, a failure can range from missed deadlines to complete abandonment of the scope. When taking risk on implementation timeframes, the expected cost of overruns should be less than the total savings gained by accepting the extra risk. Similarly, the benefit of risking a more aggressive scope should be greater than the loss due to the rework required to wind back to a more modest set of objectives.
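As a back-of-the-envelope illustration of that trade-off (all the probabilities and costs here are invented), the aggressive option is worth taking only while its expected overrun cost stays below the planned saving:

```python
# Hypothetical comparison of a padded 12-month plan against an
# aggressive 9-month plan; all figures are invented for illustration.
padded_cost = 12 * 100            # 12 months at 100/month, near-certain

aggressive_base = 9 * 100         # 9 months if all goes well
p_overrun = 0.4                   # assumed chance of overrunning
overrun_penalty = 4 * 100         # assumed 4 months of rework if it does

expected_aggressive = aggressive_base + p_overrun * overrun_penalty
print(expected_aggressive)        # 1060.0 < 1200: risk worth taking here

# The same arithmetic flips if the penalty dwarfs the saving:
expected_bad = aggressive_base + 0.9 * (8 * 100)
print(expected_bad)               # 1620.0 > 1200: pad the plan instead
```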

Businesses that only ever punish failure encourage everyone to leave something in reserve, while the best organisations encourage everyone to stretch themselves and their teams with a transparent approach to taking appropriate risks. Business transformation is more than the sum of its parts and requires the right combination of ambition combined with continuous learning.

Category: Enterprise2.0

by: Robert.hillard
28 Oct 2017

Leaders need the technical detail

I recently heard a financial services executive on the radio talking about Bitcoin. When a listener asked him to explain what it was, he couldn’t. I’m constantly amazed by the number of leaders who aren’t across the detail of important innovations that are candidates to revolutionise their businesses in the near future.

There is no doubt that our economies are getting more complicated. One response across business has been the gradual disappearance of traditional conglomerates, replaced by larger, more specialised businesses that do a small number of things very well. Specialisation makes organisations more efficient but also increases the risk of disruption. If the thing a business does well, such as issuing credit cards, selling books or making trains, is challenged by a new technology then it can bring the whole enterprise down.

Good examples of coming changes, where more is unknown than known, include cyber currencies, blockchain, quantum computing, artificial intelligence, smart cities, augmented reality and additive manufacturing.

These are some of the technologies that are likely to drive big decisions for leaders in the coming years. Ambit claims are being thrown around which may or may not be true. For example, some are saying that quantum computing is going to make conventional computing redundant, blockchain is as fundamental to the internet as the invention of the World Wide Web and artificial intelligence is bringing us closer to “the singularity” where computers outsmart humans.

If any of these claims, and many like them, are true then it is safe to say that many businesses that exist today will be extinct in a few short years. But for every claim there is a counter-position. Just as many experts argue that cyber currencies will soon fall under the supervision of existing regulators as predict a completely open financial system. Similarly, city planners disagree about whether smart transport will bring more people into central business districts or if more work will be distributed through neighbourhood clusters.

In each case, executives and boards cannot rely on others to be the experts. Those businesses that are on the winning side of history will have leaders who have taken the time to educate themselves on the most important innovations that are sitting in today’s laboratories. More than simply predicting the outcome of today’s research, these leaders can play a role in shaping the work of scientists and engineers putting their businesses in the box seat to tailor the future. When you think of organisations whose leaders fall into this category, Netflix, Amazon and Tesla immediately come to mind, but there is no reason to assume this management mindset should be limited to these cutting-edge video, retail or car companies!

Left on their own, research teams working on initiatives like quantum technologies, blockchain and the next generations of manufacturing technologies are often missing the opportunity to commercialise aspects of their solutions early. With the benefit of an understanding of the breadth of today’s challenges, sometimes incomplete research can deliver incredible advances.

For example, quantum computing is driving research that could yield interesting materials and sensors long before viable computing devices are even close to commercialisation. Medical, mining and energy businesses are among those that should be on the lookout for more immediate applications and perhaps partner on research efforts.

Similarly, augmented reality developers are looking for real-world applications. Retailers have the opportunity to present massive amounts of valuable information visually if their leaders know what the technology can do. A clever department store might better challenge online competitors by making its physical store a giant website, providing all the product information that store staff would find hard to know.

Of course, it isn’t necessary for every leader to be capable of fundamental quantum computing research or able to engineer smart devices, but it is important they can delve into the details with experts and challenge them to explain their work in a digestible way. It is amazing how often an expert will overcomplicate something when talking to someone who doesn’t express a desire to understand the detail, but will find the simple explanation when confronted by a genuinely interested leader.

Such technical confidence has other benefits with executive teams realising that they can’t abdicate business and customer technology decision making in favour of specialist teams. With everything from finance to product systems directly affecting the capability of the overall business, there is no longer a place for leadership at a distance.

The result of such interactions can be amazing. Some large organisations are applying approaches to agile management based on experiences in their technology teams (see The agile working style started in tech but it could work for banks). Others are applying robotics and augmented reality in inconvenient or risky places where people otherwise had to go (for example, see Miniature robots speed power generator inspections).

It is no longer about the technology experts getting closer to the business, it is about everyone collaborating to find innovation in unexpected places and shaping the future!

Category: Enterprise2.0

by: Robert.hillard
26 Sep 2017

Three things every project needs

It’s wrong to think that all a project needs is a scope, budget and timeframe. The three things that separate the best projects from the rest are: insight (into the future), simplification (of the business today) and inspiration (through new capabilities).

Even a cursory glance at the news shows how fast things are changing. The rate of change for business and government is greater than at any time in the working lives of the current generation of leaders. Disruption in some form is hitting everything from retail through to healthcare and it doesn’t seem to matter whether it is the private or public sector.

The improvement over recent decades in project and programme management capabilities means that scope, budget and timeframes are now ingrained into almost every team. Most organisations have a portfolio of these projects from which you can deduce their ambitions for the year or few years ahead. Unfortunately, in many cases, the sum of the parts isn’t as transformational as the changing times we live in really require.

Organisations need to look again at their projects and make sure they move beyond scope, budget and timeframe to insight, simplification and inspiration.

1. Insight into the future
Any project that assumes that the context in which it is operating at commencement is the same as at implementation is doomed to be behind before it ever sees the light of day. Even three months in today’s business world is enough time for fundamental changes in the relationship with suppliers, competitive landscape, customer expectations and the working environment.

Too many project teams make the assumption that what is true now, or even in the immediate future, is true forever. They also underestimate their project’s ongoing impact on the organisation way into the future.

It is hard to make predictions for the business and technology environment a decade or more out. Projects should, however, at least identify different scenarios, some of which will be uncomfortable. Scenarios, and their impacts, allow those that follow to make sense of the portfolio of projects and better build out a transformation narrative.

For example: urban infrastructure projects can describe a smart cities future; retail fitout projects should navigate the plethora of predictions on changing customer behaviour and supply chain system projects need to anticipate radical new business platforms that remove whole layers from the business architecture.

Some changes are nearly universally anticipated and should form part of every scenario such as greater workforce automation, improved technology infrastructure and greater government regulation of digital markets.

2. Simplification of the business today
Complexity is the enemy of agility and, in times of change, agility is critical. It is amazing that so many executives and boards lament the advantage of their lean startup competitors, who don’t have the legacy that established businesses must deal with. This is not just technology but also long-defunct products, outdated business processes and expensive infrastructure.

Established businesses should have an advantage over their startup rivals: they have market knowledge, infrastructure and large amounts of data. To be ready to adapt, they also need a constant focus on simplification.

It would be great to do a one-off spring clean of our organisations; the reality is that this is too hard and unlikely to get the sustained focus it needs except in a crisis, and by then it’s too late. The best approach is to make simplification a goal of every project.

Where today we usually measure a project’s success by time, budget and scope, we should add the total enterprise complexity as well. Of course many projects will have to add to the complexity of the part of the business where they operate, but that doesn’t mean that a completely different part of the organisation can’t be the target of offsetting simplification.

3. Inspiration through new capabilities
Finally, every project needs to be delivering something new. Business is increasingly looking to somehow change their relationship with customers, find new ways of connecting suppliers and customers and engage better with their staff. Understanding the strategic goal of the new and how it fits with a wider business goal is critical.

Projects that seek to simply adapt to the future (such as future of work, future of cities or future of transport) lack the focus to motivate teams. Projects that only streamline and simplify the organisation (such as a new enterprise operational system) lack the inspiration to excite and maintain the motivation of teams. People often gravitate to projects that have something completely new as part of their scope, even when it may not be a major component.

Every project is an opportunity to experiment, whether with new business platforms, new technology or new ways of working. While not every project is a transformation, many transformations are incomplete through a lack of something that makes stakeholders go “wow”. It is a good idea to include an element in every project that gets the heart racing in the midst of even the most benign of changes.

There is much to be done
The years in front of us require organisations to take on huge amounts of change. It is oft quoted that organisational lifespans are getting shorter. I don’t accept this; I think that mergers and acquisitions account for many of the movements, as the best businesses find alignments and synergies that add value to shareholders, customers and suppliers. However, there is no doubt that those organisations that don’t take ownership of transformation find themselves victims of it.

There is so much that every organisation can do to create value through transformation for their stakeholders. Just delivering a vision of the future, improving today’s business process or adding new functionality over the top of existing processes is incomplete. It is only by harmonising all three that genuine and sustainable change is possible.

Category: Enterprise2.0

by: Robert.hillard
27  Aug  2017

Your desk is a guide to the future of work

The future of work is the topic on everyone's lips. The talk of automation and artificial intelligence can seem abstract and alien, making the future appear scarier than it needs to be. A good way for white collar workers to think about what lies ahead is to look at their office surroundings and how those might change in the coming years.

Anyone worried about what work might look like in the coming decades should remember that a worker of the 1970s would feel like a fish out of water if magically transported to today. The desk of a white-collar clerical worker would likely have had "in" and "out" trays, stationery and a phone (an extension of a central number). By the 1980s, they may have had their own direct phone number and possibly, depending on their job, access to a computer terminal for very basic tasks. Reports were manually typed, with large offices and senior staff having access to typists who would convert handwritten notes into neat reports. Those rows of typists were exacting workers who had to mix speed with accuracy and were the ultimate slaves to their desks.

The desk of the turn of the century was very different. The trays were rapidly disappearing. Every desk had a personal computer with a graphical interface and the tethered phone was giving way to IP telephony allowing hot desking to make an appearance. The basic mobile phones and dial-up networking that most workers had at home meant that working remotely was becoming possible, if not practical, for many tasks. Working from home was still called “teleworking” referring to the use of the telephone as the predominant infrastructure.

Ten years later, the desk didn’t look very different, just a little more efficient. Internet connections were faster but the equipment was fundamentally the same. Although smart phones were starting to appear, they weren’t ubiquitous and the functions to which they were applied were basic.

Today the desk is starting to change in much more fundamental ways, but the transformation is no more dramatic than the changes that we’ve gone through in the relatively recent past.

Our desk is finally less bound by paper, with some evidence that the US, at least, is seeing a reduction in its use as the form factor of tablets encourages less printing (that said, at about 10,000 sheets of paper per worker per year, usage is still very high). Our technology is also moving from being a passive tool of efficiency to an active driver of activity.

The balance of power may also have turned. Equipment on our desks is starting to monitor both the work being done and the workers themselves, testing whether the white-collar worker is being productive or, when working remotely, whether they are active at their desk. This is a form of working for the machine rather than the machine working for us, and it is likely to be the subject of debate in years to come.

Even more dramatic than simply detecting whether the worker is active, the equipment on our desks is increasingly able to allocate tasks between workers. While every good leader argues that we should measure outcomes and outputs rather than effort, a world where workers can be remote and paid by output can lead to problems.

When there is a break between the supervisor and the supervised, the nature of competition means the rate individuals get paid often gets pushed down. We are already seeing this effect in the so-called “gig economy”. It is likely that governments will need to step in to protect workers, particularly in fields where there is substantial competition.

If understanding the desk of the future is important, what will surround our workspace in the coming decade? For the first time in a long time, the form factor of the devices on that desk is far from "one size fits all". The single-hinge notebook has given way to all forms of tablets. The desk phone is gone and the fax is long gone. Voice control is finally finding its form in technology dubbed "beyond glass", though it also challenges the fully open-plan office.

Electronic communications are also rapidly changing. Social networks are merging with messaging services and we are just starting to move past email. It is very likely that more structure will be added to the interactions we have through our work activities, probably driven by our artificial intelligence co-workers.

The future of work is uncertain, but not such a radical shift from what we do today. There are risks that we need to navigate but using history as a guide we should be able to manage the transition without a single change on its own tipping over too many desks at once.

Category: Enterprise2.0

by: Robert.hillard
30  Jul  2017

Defence as the best form of attack

The global economy is powered by business innovation with small and large organisations alike inventing the future for us all. The rapid rate of change brings both opportunities and threats with recent cyber events acting as a wake-up call. Far from being afraid, we should be reminded that we need to design businesses to operate and even thrive in unexpected circumstances.

In the 1970s, companies like Toyota revolutionised manufacturing with "just in time" supply chains. Nothing ever comes for free: for every dollar of stock taken out of the system, a dollar of contingency and slack is also removed. When everything works well there isn't a problem; when it doesn't, the flow-on effects can create a supply chain whiplash, or "bullwhip", effect. The best manufacturers in the world solved this by putting enormous pressure on quality to avoid exactly this sort of disruption.

These are lessons we need to learn as we look to roll out more sophisticated systems in our society such as connected infrastructure and transport and even make the move to autonomous systems. Our society in a few short years is likely to be orders of magnitude more connected through complex networks and supply chains.

Computing generally follows real-world models in its first iterations, and its mirroring of best-practice supply chains is no exception. Moving from a world where each system was independent to one where they are tightly coupled across corporate boundaries has produced a data supply chain that borrows heavily from manufacturing. The addition of cloud computing means that almost every process involves at least two, and often more, players linked together through a multitude of interdependencies.

This trend is as prevalent in our digitally enabled infrastructure (such as support for our rail networks and energy grids) as it is in digital-only systems (such as banking, telecommunications and government systems). The tighter those linkages the more functions that can be added and the lower the overall cost.

As amazing as the capabilities of our world of technology are, the integration leaves us with almost no room for error or ability to flex in an environment of disruption. For example, our energy grids seem to be becoming more brittle with the rise of interconnections, and regular travellers know the impact of airlines operating without slack when something goes wrong.

Like the manufacturing supply chains of the last half century, the key to keeping this technology running is quality with CIOs aiming to keep systems up 24/7. Even small outages, though, have a flow-on effect that is harder to predict and further reaching than the equivalent disruption in a manufacturing process. That’s because the complexity of these system interdependencies has grown exponentially.

The brittle and inflexible nature of complex systems has been one of the reasons that retail has struggled to adjust to the juggernaut of online shopping, and manufacturers are still trying to win back control of their supply chains. Recent cyber-attacks, leaving major companies offline, have brought this into stark focus. The attacks have typically encrypted or hijacked one or two systems in the network and pushed a brittle environment to breaking point.

The architects of systems and processes tend to design for today’s business. Defensive computing is a paradigm for boxing components in such a way that they work regardless of what happens. This is a mindset that goes beyond testing for the scenarios outlined by stakeholders and moves to safe failovers in the event of anything unexpected.

Defensive measures include having systems work while offline or while counterpart systems are unavailable and when reference data is corrupted or hijacked. If technologists adopt a more defensive mindset, the testing burden is dramatically reduced and the uses of their systems can be extended far beyond the context of their initial design.

Where tightly-coupled systems are brittle, those that have been defensively architected are like flexible buildings that can withstand the buffeting winds of cyber-attacks and the shifting sands of changing business models.

Defensive design requires more expansive thinking about the worst-case scenario for every module. Data should be backed up incrementally and then thoroughly validated. Connected systems should be assumed to provide completely unexpected and illegitimate responses. Users should be expected to approach every interaction with an almost destructive mindset.

Every part of a system should be independently robust and proactively test that every interaction is valid, rather than only checking for known invalid responses. The more modular and API-driven such a solution is, the more likely it is to be flexible and robust enough to survive cyber-attack as well as business disruption through combination with new applications.
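As a concrete, purely illustrative sketch of this mindset, the fragment below accepts a message from a counterpart system only when every field matches an explicit allowlist of expected values, rather than scanning for known-bad inputs. The function and field names (process_order, ALLOWED_STATUSES) are invented for the example, not drawn from any real system.

```python
import json

# Explicit allowlist: anything outside it is rejected, even if it
# doesn't match a known attack pattern.
ALLOWED_STATUSES = {"new", "paid", "shipped"}

def process_order(raw_message: str):
    """Accept a message only if every field is provably valid.

    Returns the parsed order, or None as a safe failover when the
    counterpart system sends anything unexpected."""
    try:
        order = json.loads(raw_message)
    except json.JSONDecodeError:
        return None  # malformed input: fail safely, don't crash
    # Positive validation: check the shape and every field we rely on.
    if not isinstance(order, dict):
        return None
    if not isinstance(order.get("id"), int) or order["id"] <= 0:
        return None
    if order.get("status") not in ALLOWED_STATUSES:
        return None
    return order

# A hijacked or corrupted upstream system is simply rejected:
print(process_order('{"id": 7, "status": "paid"}'))   # → {'id': 7, 'status': 'paid'}
print(process_order('{"id": 7, "status": "oops"}'))   # → None
print(process_order('not even json'))                 # → None
```

The design choice is that the only path to a non-None result runs through every check, so a new kind of bad input fails closed by default instead of requiring a new rule.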

Our infrastructure is never going to be impregnable. Even the strongest perimeter barriers can be breached by one innocent user clicking on the wrong link. Similarly, our business models aren’t invulnerable. The answer is to have each component of the information supply chain designed in a defensive way such that it assumes the worst of even trusted systems, users and competitors.

Businesses building for the worst case, planning to run even when seriously compromised, will find that they more easily weather cyber issues and competitive disruption. Neither should ever come as a surprise.

Category: Enterprise2.0, Web2.0

by: Robert.hillard
24  Jun  2017

More but not better jobs

The future of work is the topic on everyone's lips. We've gone from worrying about whether our children will have jobs to worrying about our own place in the workforce. The rise of artificial intelligence and robotics has been at the upper end of twenty-first century predictions.

Everyone wants to know whether automation will trigger a massive cut in jobs. This is not academic. Previous predictions that automation would hit employment in the 1970s and 1980s led some countries, such as France, to move to shorter working weeks and effectively ration available work. The consequences have been large: lost productivity and languishing economic performance which, in turn, have created the very unemployment they were trying to avoid.

The automation of human labour is not new. The development of cotton mills in the industrial revolution and Henry Ford’s production line of the early twentieth century are perfect examples. Both reduced the labour required per unit of production but increased demand caused a net increase in jobs.

The Luddites in the early days of the nineteenth century worried about the impact of automation and showed their opposition by smashing machines. No less a technologist than Bill Gates recently said "Right now if a human worker does, you know, $50,000 worth of work in a factory, that income is taxed. If a robot comes in to do the same thing, you'd think that we'd tax the robot at a similar level". We can interpret his comments as a proposed brake on machines through taxation, a modern equivalent to the machine smashing of the Luddites.

The rise of artificial intelligence is just a continuation of computer-driven automation since the 1970s. We have seen many jobs displaced in that time, yet more work has been created. Word processing has displaced the typing pool. Workflow processing has all but eliminated traditional clerical roles. Yet there are more jobs today than ever before. At about this point many people say “more and better” jobs. It is the latter half, “better” that needs closer examination.

Our jobs can increasingly be described in terms of whether we’re working “on”, “with” or “for” machines.

Those of us who work "on" machines are shaping what they do; we are defining the problems they solve and identifying the questions that need answers. Examples of working "on" the machines include programming, design and data science. These are activities that require an insight that is not within the scope of this second generation of artificial intelligence (see Your insight might protect your job). I would argue that there are as many of these jobs as in the past, perhaps slightly more, and that the jobs are, largely, as good as ever.

The jobs that work “with” the machines are giving many of their day-to-day repetitive activities to artificial intelligence and traditional technology. Teachers are increasingly handing over much of the content creation and learning interaction to technology and students are largely responding well. This is true at all levels of schooling, from pre-school to university. Teachers are, though, more important than ever, for example see Universities disrupted but not displaced.

Finally, the largest pool of increased employment is working “for” the machines. These are the jobs that are scheduled and managed by technology. At the extreme are the “Mechanical Turks” and other crowd workers who do piecework for a few cents a job. Also in this category are rideshare drivers, online retailing pickers and increasingly some of the more manual health roles. Done well, these jobs can fit into a flexible working arrangement that suits many lifestyles.

To put this third category of jobs into perspective, consider what actually happened with the early nineteenth century cotton mills and then again with the early twentieth century production lines. Far from destroying jobs, more labour than ever before was needed. But as anyone who has watched a period drama or read a Dickensian novel knows, these were not pleasant places to work. Workers were regularly injured or killed, rights were almost non-existent and worker was played-off against worker.

The future of work could see more of these jobs that work “for” machines created with the emphasis on dehumanising and optimising scheduling to suit the needs of technology and employers. Working remotely, or largely instructed by computer, it’s possible that we’ll repeat the mistakes of the past.

But history also gives us reason for optimism. Within a few decades the cotton mills and production lines became much more desirable places to spend a working day. In our far more competitive world, many companies are realising that there is a commercial advantage in eliminating the decades-long gap between creating new jobs and making them desirable. Those companies are winning the war for talent.

At the end of the day, business production of goods and services is for the consumption of humans. The modern services economy means customers are interacting with workers more than ever, and want it to be a social and positive experience. Even the production of goods is increasingly social with the rise of shorter supply chains and a booming “craft” movement of artisan products ranging from food to furniture. Businesses that want to win in this world need employees who are going to portray their brand in a good light and for that their day-to-day work needs to be life affirming.

With a focus on the right things, there is an opportunity for the automation of the coming years to lead to both more and better jobs.

Category: Enterprise2.0, Web2.0

by: Robert.hillard
28  Mar  2017

Post-truth surprises

Unexpected election results around the world have given the media the chance to talk about their favourite topic: themselves! With their experience running polls, the media are very good at predicting the winner out of two established parties or candidates but are periodically blindsided by outsiders or choices that break with convention. In most cases, there were plenty of warnings but it takes hindsight to make experts of us all.

Surprises are coming as thick and fast in business as they are in politics and similarly there are just as many who get them right with perfect hindsight! The same polling and data issues apply to navigating the economy as they do to predicting electoral trends.

The Oxford Dictionary picked “post-truth” as their 2016 word of the year. The term refers to the selective use of facts to support a particular view of the world or narrative. Many are arguing that the surprises we are seeing today are unique to the era we live in. The reality is that the selective use of data has long been a problem, but the information age makes it more common than ever before.

For evidence that poor use of data has led to past surprises, it is worth going back to 1936, when a prominent US publication called The Literary Digest invested in arguably the largest poll of the time. The Literary Digest used its huge sample of more than two million voters to predict that the Republican challenger would easily beat the incumbent, President Roosevelt. After Roosevelt won convincingly, The Literary Digest's demise followed shortly thereafter.
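The Literary Digest effect is easy to reproduce. The toy simulation below, with invented numbers, shows how a poll of a hundred thousand voters drawn from a skewed sampling frame can call the wrong winner, while a representative sample of just a thousand lands near the truth.

```python
import random

random.seed(42)

# A population of one million voters in which candidate A truly has 62%.
population = (["A"] * 62 + ["B"] * 38) * 10_000

def poll(sample):
    """Share of the sample supporting candidate A."""
    return sample.count("A") / len(sample)

# Skewed frame: suppose B supporters are three times as likely to be
# reachable (as the Digest's readership and phone lists were skewed).
weights = [1 if voter == "A" else 3 for voter in population]
biased = random.choices(population, weights=weights, k=100_000)

# A far smaller, but representative, random sample.
representative = random.sample(population, k=1_000)

# The huge biased poll lands near 35% for A (predicting a loss);
# the small representative poll lands near the true 62%.
print(f"Biased poll of {len(biased):,} voters: {poll(biased):.1%}")
print(f"Representative poll of 1,000 voters: {poll(representative):.1%}")
```

The point is that sample size cannot repair a biased frame: the biased estimate converges confidently to the wrong number.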

As humans, we look for patterns, but we are guilty of spotting patterns first in data that validates what we already know. This is "confirmation bias", where we overemphasise a select few facts. In the case of political polls, the individuals or questions picked often reinforce the assumptions of those doing the polling.

This is as true within organisations as it is in the public arena. Information overload means that we have to filter much more than ever before. With Big Data, we are filtering using algorithms that increasingly depend on Artificial Intelligence (AI).

AI needs to be trained (another word for programming without programmers) on datasets that are chosen by us, leaving open exactly the same confirmation bias issues that have led the media astray. AI can’t make a “cognitive leap” to look beyond the world that the data it was trained on describes (see Your insight might protect your job).

This is a huge business opportunity. Far from seeing an explosion of “earn while you sleep” business models, there is more demand than ever for services that include more human intervention. Amazon Mechanical Turk is one such example where tasks such as categorising photos are farmed out to an army of contractors. Of course, working for the machines in this sort of model is also a path to low paid work, hardly the future that we would hope for the next generation.

The real opportunity in Big Data, even with its automated filtering, is the training and development of a new breed of professionals who will curate the data used to train the AI. Only humans can identify the surprises as they emerge and challenge the choice of data used for analysis.

Information overload is tempting organisations to filter available data, only to be blindsided by sudden moves in sales, inventory or costs. With hindsight, most of these surprises should have been predicted. More and more organisations are challenging the post-truth habits that many professionals have fallen into, broadening the data they look at, changing the business narratives and creating new opportunities as a result.

At the time of writing, automated search engines are under threat of a boycott by advertisers sick of their promotions sitting alongside objectionable content. At the turn of the century human-curated search lost out in the battle with automation, but the war may not be over yet. As the might of advertising revenue finds voice, demanding something better than automated algorithms can provide, earlier models may emerge again.

It is possible that the future is more human curation and less automation.

Category: Business Intelligence, Enterprise2.0, Information Governance