No UFOs Here
It has come to our attention that there are amongst you those who are quite happy to accept that behavioral biases affect the way other investors act, but who refuse to agree that you, yourself, are so afflicted. Of course, most UFO abductees reckon everyone else is a faker, so we shouldn’t be too surprised at this.
There’s a term for this wilful foolishness: it’s called the bias blind spot. We recognise it in others, so why don’t we see it in ourselves?
Black Box People
For quite a long time psychologists operated on the basis that the only thing of interest about people was the way they behaved externally. While not exactly denying that people had an inner life, the general approach was that this wasn’t measurable and therefore wasn’t interesting. Which was a shame, because it’s probably the most interesting thing about us, once you exclude reality TV show contestants, who don't have an inner life but, if they did, would sell it to the highest bidder.
In particular, each of us knows ourselves better than we know anyone else. We uniquely have access to our inner thoughts, aspirations and desires. We can extrapolate from these introspections to what we think another person’s inner life might be like, and if we’re particularly good at this we’re said to demonstrate empathy. Unsurprisingly, though, we’re nearly always quite sure that we understand what we think and what we want better than anyone else does.
Unfortunately this isn’t actually true. Quite a lot of the time, it appears, we don’t actually know what we want, and from this we can be fairly sure that we don’t know what we think either. As described in Buy.ology, when people’s preferences for new TV shows were analysed by wiring viewers up and inspecting their brain patterns, and the results were then compared with what people actually said they preferred, it turned out that brainwaves were a much better indicator of their preferences than their stated beliefs.
Electrifying Introspection
The fact that we don’t know what we think isn’t readily apparent to us: we don’t generally recognise it. So when we read about how other people have fallen foul of some smart-alec researchers demonstrating some behavioral flaw or other, our introspections mislead us: invariably we think that we would be resistant to the biases demonstrated – that we are special. And, of course, we are special – we know what we think and feel: only we don’t.
When we read about the famous experiments of Stanley Milgram, in which participants carried on administering what they believed were electric shocks even though their victims were screaming at them to stop, we invariably think we would have disobeyed the instructions of the white-coated authorities. Some of us might have done, but the evidence tells us that most of us wouldn’t. The majority of us are suffering from a failure of introspection.
Positive Self-Image
This type of failure seems to be a feature of the way we handle questions that require us to compare ourselves to others, and it shows up in some of the most straightforward ways. Paul Windschitl, Justin Kruger and Ericka Nus Simms looked at how people's optimism varied within competitions. What they found was that:
"The average optimism of a set of competitors increases in the face of a shared benefit and decreases in the face of a shared adversity."Of course, if a competition is easier for me it's easier for you too - I shouldn't be more optimistic that I'll succeed. In fact when the instructor added points to every student's score people thought this would give them a better chance of finishing in the top 50% of the class. Which, of course, doesn't make any sense, not that that should come as a surprise. A similar finding was discovered by Don Moore and Tai Gyn Kim, who showed that people will bet more on a simple quiz than a difficult one, despite the fact that it's easier for everyone. Basically we seem to think that shared difficulties or benefits will affect us more than other people. Moore and Kim think this is a focusing illusion, the sort of thing we looked at in Money Can't Buy You Happiness, where people think about themselves without considering the relative impacts on other people. Essentially, we introspect and consider that the impacts on ourselves are more important than on others.
Confirmation Only, Please
A second factor seems to be that when we’re asked a question we immediately start looking for evidence that confirms our expectation about the answer. It’s only if we can’t find any such evidence that we start looking for evidence to disconfirm our expected answer. It appears that looking for confirmatory evidence is easier for our brains – which makes sense, if our minds are actually networks of related ideas and concepts, because finding something that is linked to a particular concept is going to be easier than finding something that isn’t. This also explains confirmation bias in general, because we’ll preferentially look for confirmatory evidence and, if we find some, probably won’t bother engaging in the more laborious task of finding disconfirming evidence – it’s hard, and why bother?
Taking these issues into account, it’s entirely reasonable to assume that in some circumstances our ability to introspect will actually get in the way of an accurate assessment of our capabilities: our positive self-image will bias the search for information towards evidence that we’re immune to the biases other people fall foul of. As Gilovich and colleagues have shown in tests of altruism, people’s predictions of their own performance and their self-assessments after the event match up neatly – but their actual performance falls below both, and is more accurately judged by impartial observers.
Introspection Illusion
This implies, of course, that behavioral biases operate non-consciously, and that we simply aren’t aware of them. This is exactly the hypothesis that Emily Pronin and Matthew Kugler have investigated. Their research suggests that the bias blind spot is caused by an introspection illusion: we value information gleaned from introspection more than we value our actions when we assess our own biases. We apply different standards to others than we do to ourselves:
“Participants showed a self-other asymmetry in how they defined bias. They were more likely to define bias in terms of a thought, feeling, or motive, as opposed to an action, when primed to think about themselves rather than another person.”
In fact the research shows that people argued they were right to ignore the way they actually behaved – the very evidence they relied on when judging other people – because of their insight into their own introspections about how they felt. This is a mental version of “do as I say, not as I do”.
The application of the bias blind spot to investors is hopefully clear – and if it isn’t, it might help to read One Long Argument: A Big List of Behavioral Biases while repeating the mantra “this applies to me”. Happily, the research also shows that people who can be brought to acknowledge their bias blind spot can act to overcome it. Which is fortunate, because if that weren’t true I’d have been wasting my time for the past few years.
Related articles:
- Disposed to Lose Money
- The Neuroeconomics Revolution
- A Sideways Look at ... Behavioral Bias
- Money Can't Buy You Happiness
- Disconfirm, Disconfirm, Disconfirm
- Confirmation bias, the Investor's Curse