Not seeing

April 6, 2017
For example, if you want to know the probability of a plane crashing, you must know the thousands of instances of planes landing safely, as well as the more attention-grabbing handful that don't. And in considering correlations and causality we must remember instances where one thing happened without the other, as well as when both occurred: I suspect the erroneous fear that the MMR vaccine caused autism arose in part because people looked at a few instances where the administration of the vaccine was followed by a diagnosis of autism and overlooked the millions of instances when it wasn't.
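To make the point concrete, here is a minimal sketch with purely illustrative, made-up numbers (none of them from any real study): judging whether vaccination and diagnosis are linked requires all four cells of the contingency table, not just the one that grabs attention.

```python
# A minimal sketch with purely illustrative figures. Looking only at the
# "vaccinated and later diagnosed" cell tells us nothing without the
# unseen cells: the vaccinated who were never diagnosed and the
# unvaccinated, diagnosed or not.

vaccinated_diagnosed       = 100        # the attention-grabbing cell
vaccinated_not_diagnosed   = 999_900    # the unseen majority
unvaccinated_diagnosed     = 10
unvaccinated_not_diagnosed = 99_990

rate_vaccinated   = vaccinated_diagnosed / (vaccinated_diagnosed + vaccinated_not_diagnosed)
rate_unvaccinated = unvaccinated_diagnosed / (unvaccinated_diagnosed + unvaccinated_not_diagnosed)

print(f"Diagnosis rate, vaccinated:   {rate_vaccinated:.4%}")
print(f"Diagnosis rate, unvaccinated: {rate_unvaccinated:.4%}")
# With these illustrative numbers both rates are 0.01%: a hundred
# "vaccine then diagnosis" cases imply no link at all once the other
# cells are counted.
```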

You might think all this should be obvious. But investors make similar mistakes. They neglect the evidential value of what isn't seen. For example, star fund managers get lots of attention while the thousands of managers who underperform do not, which leads people to overestimate the chances of beating the market. Stocks that grow a lot also attract attention, which leads people to neglect the fact that most stocks, over their lifetimes, underperform cash, and so to overestimate their chances of picking a great stock. And Charlie Cai at Liverpool University has shown that investors pay too much attention to companies that have grown a lot while neglecting the tendency for rapid growth to depress profits, with the result that they pay too much for growth stocks.

It's not just amateurs who underrate the importance of the unseen. In 2007 Fred Goodwin, chief executive of RBS, overestimated his ability to manage takeovers and neglected the evidence that takeovers often fail. That led him to buy ABN Amro, a purchase that was perhaps the most expensive mistake ever made in the UK. And at the same time many banks bought mortgage derivatives because their risk management systems oversampled good, stable times and neglected the threat of a downturn.

Some recent experiments by Harvard University's Benjamin Enke show just how widespread the tendency to neglect unseen but vital evidence is. He asked people to guess a number that was the average of 15 draws from the set 50, 70, 90, 110, 130 and 150. Each subject was told one of these draws and assigned to one of two groups: those whose signal was greater than 100 and those whose signal was less than 100. They were then invited to ask others which number they had been told, with the quirk that if the other person was in the opposite group, they got no answer.

In this set-up, it's obvious that silence is evidence. If you drew a high number, no answer is evidence that the other person drew a low number, which means you should revise down your estimate of the true average. However, Professor Enke found that most people ignored this evidence. "People do not pay attention to or notice what they do not see in the first place," he says.
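A rough simulation makes the point (this is my own sketch of the set-up, not Professor Enke's code, and the details are assumptions): conditional on a high-signal subject hearing nothing, the true average really is lower, so ignoring the silence biases guesses upward.

```python
import random

# A rough simulation of the set-up described above. The true quantity is
# the average of 15 draws from {50, 70, 90, 110, 130, 150}. I receive one
# draw as my signal; I then ask another participant for theirs, but get
# silence if they are in the opposite group (other side of 100).

VALUES = [50, 70, 90, 110, 130, 150]
random.seed(1)

avg_given_silence, avg_given_answer = [], []
for _ in range(200_000):
    draws = [random.choice(VALUES) for _ in range(15)]
    truth = sum(draws) / 15
    mine, other = draws[0], draws[1]   # my signal and the one I ask about
    if mine < 100:                     # condition on my being in the "high" group
        continue
    if other < 100:                    # opposite group -> I get no answer
        avg_given_silence.append(truth)
    else:
        avg_given_answer.append(truth)

print("True average when I get silence:  ", sum(avg_given_silence) / len(avg_given_silence))
print("True average when I get an answer:", sum(avg_given_answer) / len(avg_given_answer))
# Silence goes with a noticeably lower true average (about 100 versus
# about 104), so a high-signal subject who ignores it guesses too high.
```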

This was not because they lacked incentives: they were paid for accurate guesses. Nor was it because they were generally stupid: Professor Enke found that they made good guesses on the basis of information they did receive.

All this suggests that complexity narrows our concentration. When we have a difficult problem, we focus upon it. But focus, by definition, is narrow. It therefore leads us to neglect things.

In this sense, Professor Enke's work corroborates a more famous experiment done at Harvard. Christopher Chabris and Daniel Simons showed people a film of students playing basketball and asked them to count the number of passes. At the end of the film they asked their subjects whether they had seen anything unusual. Half said they had not, having failed to spot that a woman in a gorilla suit had walked through the scene. "When our attention is focused on one thing, we fail to notice other, unexpected things," they say.

There's an analogy here with stockpicking. If we focus upon finding the next great growth stock, we look for all sorts of encouraging signs but neglect background information such as the tendency for most stocks to do badly. This is the base-rate fallacy.
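To see why the base rate matters so much, here is a back-of-the-envelope Bayes calculation. The figures are assumptions chosen purely for the sake of the example, not market data.

```python
# A back-of-the-envelope Bayes calculation with assumed, illustrative
# numbers. Suppose only 4% of stocks turn out to be "great", and suppose
# a screen for encouraging signs flags 80% of the great ones but also
# 20% of the rest. What is the chance a flagged stock is actually great?

p_great = 0.04             # base rate of great stocks (assumed)
p_flag_given_great = 0.80  # screen's hit rate (assumed)
p_flag_given_dud = 0.20    # screen's false-positive rate (assumed)

p_flag = p_flag_given_great * p_great + p_flag_given_dud * (1 - p_great)
p_great_given_flag = p_flag_given_great * p_great / p_flag

print(f"P(great | flagged) = {p_great_given_flag:.1%}")
# Roughly 14%: even a decent-looking screen mostly surfaces ordinary
# stocks, because the base rate of great ones is so low.
```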

High stakes can actually compound this problem. In his book Drive, Daniel Pink shows how incentives can reduce people's ability to think laterally. "Rewards, by their very nature, narrow our focus," he says. The banking crisis might be evidence of this: big bonuses encouraged a neglect of important evidence of risk.

There are at least two points here. One is that we should remember, for any issue, that the true evidence base is often wider than we think: it comprises the unseen as well as the seen. It's useful to ask: what are we missing here? What isn't being said? What hasn't happened?

Secondly, it is prodigiously difficult to think clearly even when we have incentives to do so; sometimes, especially when we have incentives. This is one reason why so few of us manage it. (And I mean us: a big reason why I use tracker funds is that I don't trust my own judgement.)

Oh, and one other thing. Professor Enke also found that those who neglected evidence and so gave biased estimates were as confident about their answers as those who were more accurate. Confidence, then, is no indicator of rationality. You will all have your own examples of this.