Avoiding Organisational Blindness

08/11/2013
Donald Low explains how behavioural economics can help organisations make better decisions.

The last few years have seen a mushrooming of popular books on behavioural economics, which Mullainathan and Thaler (2000) define as the “combination of psychology and economics that investigates what happens in markets in which some of the agents display human limitations and complications”. These books provide insights into the psychology of customers and how they make decisions. The argument is that by acquiring a better understanding of the cognitive abilities and limitations of their customers, firms can design better products and services, and gently “nudge” them into making better choices.

The central insight in behavioural economics is that far from being the supremely rational, self-disciplined and interest-maximising agents that we find in economics textbooks, people are subject to a variety of cognitive biases and complications. Their rationality, self-control and self-interest are bounded in ways that have implications for the way firms design their products and services, and structure choices.

But the research in behavioural economics has focussed mostly on the customer’s or the citizen’s cognitive limitations. Research that puts the organisation, or the leaders of organisations, at the centre of analysis is much less common.

Blindness in our Organisations

So, what are the cognitive biases that afflict corporations and decision-makers in our organisations?

First, decision-makers are often the victims of the saliency bias. This is the tendency to overestimate risks that are more vivid, more recent, or more easily recalled. For instance, we are now more aware of the risks of an economic crisis because the memories of the last one are still fresh in our minds. Prior to that, however, decision-makers dismissed warnings of an impending crisis because the risks of a financial meltdown were neither salient nor easy to imagine.

One problem with the saliency bias is that it leads decision-makers to overreact to crises. Actions taken immediately in response to a shock are seldom the most appropriate. For instance, companies that lay off large numbers of skilled employees as an immediate response to a market shock find it extremely difficult to replace these workers once the crisis has passed.

Another problem with the saliency bias is that the urgency to act diminishes as the problem the decision-maker was focussing on becomes less salient. Once the immediate danger has passed, decision-makers no longer feel the urgency to fix the underlying problems. The financial meltdown of 2008 is a case in point. At the height of the crisis, regulators spoke of overhauling the financial system and breaking up financial institutions that were once deemed too big to fail. But once the immediate crisis had passed, efforts to reform the financial system petered out, and very few structural changes were eventually implemented.

In short, the problem with the saliency bias as an impetus for action is that it leads to overreaction in the short term and underreaction in the long term. Problems that need careful deliberation and sustained action are therefore neglected, while problems that do not really need intervention attract clumsy, hasty actions.

If the saliency bias sometimes leads decision-makers to overreact in the near term, a second bias — the sunk cost fallacy — leads them to stick with the status quo longer than is desirable. For instance, it is quite common to hear arguments for continuing to finance an unprofitable venture on the grounds that “we’ve already spent so much on that project, and if we cut off funding now, our earlier investments would go to waste.” As any good economist knows, this argument, based on sunk costs, is flawed: decisions should be made on the basis of marginal costs and benefits, not on money that has already been spent and cannot be recovered.
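To see the logic in bare numbers (the figures here are invented purely for illustration): suppose $10 million has already been sunk into a venture, completing it will cost a further $3 million, and the expected future returns are $2 million. The $10 million is gone whether the venture continues or not, so the only comparison that matters is the $2 million of future benefit against the $3 million of future cost, and on that comparison the venture should be stopped, however wasteful abandoning it may feel.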

A third bias is the confirmation bias: the tendency to seek validation of our prevailing beliefs rather than revise them in light of new evidence. We do this to avoid the discomfort that arises when we are confronted with facts that contradict our beliefs. The confirmation bias often leads to overconfidence.

Daniel Kahneman, a psychologist who won the Nobel Prize in Economics in 2002, had the following to say about our bias towards coherence and our overconfidence: “The confidence we experience as we make a judgement is not a reasoned evaluation of the probability that it is right. Confidence is a feeling, one determined mostly by the coherence of the story and by the ease with which it comes to mind, even when the evidence for the story is sparse and unreliable. The bias towards coherence favours overconfidence.”

The problem is that in many of our organisations, confident individuals are regarded more highly than those who express doubt or uncertainty about the validity of their views. As the philosopher Bertrand Russell remarked, “The whole problem with the world is that fools and fanatics are always so certain of themselves, but wiser people so full of doubt.” In this context, people inevitably compete over who can express greater confidence in their views, regardless of whether those views are well-grounded.

What can we do?

If decision-makers are often affected by all these cognitive biases, what can be done? Much of the popular literature in behavioural economics has tended to emphasise people’s irrationality in decision-making. This is not very helpful. “Irrational” has a pejorative connotation. When we say decision-makers are irrational, their response is likely to be one of denial and defensiveness.

But, more importantly, characterising people’s heuristics — or mental shortcuts — as irrational is not an accurate diagnosis. Heuristics are time-saving reasoning strategies that people employ to make judgements. They are not perfect, but then again, which shortcut does not have its limitations? The solution is not to abandon our heuristics altogether. Rather, it lies in being aware of our biases and heuristics, in knowing when they work and when they do not, and in finding ways to cope with their limitations. Below are some ideas. 

The first is to help people acquire experience so that they can use their reasoning strategies more effectively and reliably. If people in our organisations are vulnerable to misleading cues or to confirmation biases, then we should help them form richer explanatory models of the world. Helping people learn more accurate models, and when to apply them, is more likely to succeed than trying to change the way they think. People are often overconfident in their judgements, but this overconfidence diminishes as they become more accurate in their assessment of their own capabilities.

The second way is to force ourselves to take an outside view — for instance, by bringing diverse external voices to challenge our prevailing assumptions and models. Heuristics are particularly troublesome in times of disruptive change. For instance, companies with long, successful histories often have a well-established narrative about the reasons for their success. But over time, these stories can get simplified and lose their richness. Having people who can provide independent perspectives, ask critical questions, and suggest alternative narratives reduces the chances that we will be blinded to reality.

The decision scientist Gary Klein has another method for tempering our overconfidence, which he calls the pre-mortem technique. Unlike a post-mortem, which seeks to find out why a project failed, a pre-mortem seeks to anticipate problems so that we can be realistic about the challenges ahead. In a pre-mortem, you ask your team to pretend that your new venture has failed. The team members then write down all the reasons why it failed. We cannot expect to fix all the flaws in advance, but we can prepare ourselves by anticipating some of the problems.

Finally, leadership and organisational culture matter a great deal in determining whether our people are more or less susceptible to these biases. Organisations with more open, collaborative cultures that embrace debate and dissent are less likely to be blindsided by their leaders’ biases than those that are closed, secretive, hierarchical and conflict-averse.

As Margaret Heffernan, author of Wilful Blindness: Why We Ignore the Obvious at Our Peril, points out, “It is always intrinsically difficult for leaders to know what is going on in their organisations, but never more so than when their personalities or egos quash debate and dissent... The more elevated [the leader’s] status, the less likely that anyone will dare to articulate an uncomfortable truth to them. Power is dangerous, a bubble and a barrier, and wise leaders would do well to see it as a handicap, not a reward.”

This article was first published in HQ Asia (Print) Issue 03 (2012)
