Most short-term opinions on markets, or on any system that includes human beings as part of the machinery, are worthless in a financial sense. We can't even predict which side of the bed our children will emerge from in the morning, so why anyone should expect to accurately forecast the outcome of the interactions of millions of people remains an abiding mystery.
Despite this, reams of words are written each day by pundits safe in the knowledge that today's news is forgotten tomorrow and that expressing unwarranted certainty is the way to succeed. They've learned that extreme, albeit incorrect, precision will fool most of the people most of the time, and that no one ever checks.
Pundit Marketplaces
We’re especially attracted to people who express certainty about the future. Since the future is virtually unforeseeable, these gurus are, at best, deluding themselves, but they’re tapping into our desire to believe that the world isn’t the nasty, brutish and unpredictable place it really is.
The counter to this is that forecasters who are precisely wrong will, eventually, be uncovered and revealed as the charlatans they really are. This should be the effect of the marketplace of ideas, but unfortunately it turns out that we tend to disregard feedback, which presumably is why there are thousands of media pundits out there pushing their unsubstantiated opinions onto a gullible public, safe in the knowledge that they can write or say pretty much anything they want, because no one will ever hold it against them.
When Joseph Radzevick and Don Moore analysed people’s responses to overconfident investment judgements in Competing to be Certain (But Wrong), they noted that the preferred advisors were the ones who expressed the most confidence that they were right, even though they were frequently wrong, yet these advisors suffered no reputational damage. This aligns with Philip Tetlock’s famous research on political pundits, which suggests that the more famous the prognosticator, the worse their prediction accuracy (see: Expert Political Judgment: How Good Is It? How Can We Know?).
Overprecise Pundits
Radzevick and Moore suggest three possible explanations for our credulousness in the face of certainty. One idea is that we simply forget where our ideas came from, so that when things go wrong we don’t remember who to blame. A second is that the advisor argues that they weren’t really wrong, or that they were nearly right. Finally, there’s the “this time it’s different” gambit, where previous failures are written off as a blip which will be fixed under the latest set of circumstances.
Whatever the reason, extreme confidence in a pundit is a quality that many people seem attracted to, and the hallmark of this is overprecision. Moore and Healy identified three types of overconfidence in The Trouble with Overconfidence, which tend to get muddled in the literature. The first is overestimation of our own abilities; the second is overplacement, where people believe themselves to be better than average; and the third is overprecision, which the researchers describe as “excessive certainty about the accuracy of one’s beliefs”. Overprecision seems to be the quality we’re most attracted to in gormless, incompetent and popular pundits and advisors.
Hindsight and History
Overestimation is inherent in the human condition. When Baruch Fischhoff looked at the impact of historical knowledge on judgement, in Hindsight ≠ foresight: the effect of outcome knowledge on judgment under uncertainty, he discovered that participants significantly overestimated what they would have known without knowledge of the outcome, and also what others actually did know without such knowledge. The impact of this is a failure to properly learn the lessons of history: we can’t possibly address the real impact of uncertainty, that there are some things we can never predict, unless we overcome the problem of overestimation.
So, hindsight bias leads to overestimation and means that we fail to properly understand the lessons of the past, which in the case of pundits means that they fail to recognize that they’re completely useless. As with so many biases, no one is exactly sure what it is that makes us so inclined to be fooled by overprecision, but one idea is that precision is simply more valuable in communication: after all, there’s nothing so annoying as someone who won’t put a figure on some important quantity.
Checks and Balances
Unfortunately, a bias towards more precise information also opens us up to a bias towards less truthful information. It’s not much good being precise in your estimate of earnings if you’re utterly wrong about them, but, oddly, in normal life people who are confident in their predictions are usually reasonably accurate:
“When it is difficult to get accuracy data (as with predictions of the future) then the advisor’s own confidence that he or she has made the correct prediction may be the only clue available, and it may well be better than nothing. It may therefore be perfectly sensible for people to prefer confident advisors.”
This naturally occurring expectation that confidence plus precision equals accuracy may be the cause of our unwarranted attraction to clueless pundits. Of course, we should add our own estimates of the likelihood of anyone being able to be precise about anything, and develop checks and balances to gauge the accuracy of such precision. Simply accepting a confident individual’s assurances about the likely outcome of events that are fundamentally unpredictable is about as sensible as relying on, say, a committed politician’s evidence about the existence of weapons of mass destruction or an investment guru’s expectations of indefinite 13% stockmarket growth.
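As a rough sense-check on that last figure, here is a minimal sketch that compounds an index at 13% a year and compares it with a more modest rate; the 7% comparison and the time horizons are my own illustrative assumptions, not figures from any forecaster.

```python
# Compound an index at 13% a year versus a more modest 7%. Both rates and
# the horizons are illustrative assumptions, not forecasts from any source.

def compound(start: float, rate: float, years: int) -> float:
    """Value of `start` growing at `rate` per year for `years` years."""
    return start * (1 + rate) ** years

for years in (10, 30, 50):
    optimistic = compound(100, 0.13, years)
    modest = compound(100, 0.07, years)
    print(f"{years:>2} years: 13% -> {optimistic:>12,.0f}   7% -> {modest:>10,.0f}")
```

Fifty years of 13% growth turns 100 into roughly 45,000, against about 2,900 at 7%, which is one way of seeing why "indefinite" precise growth rates deserve scepticism.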
However, But ...
In fact, Radzevick and Moore propose that markets are likely to exaggerate the impact of overconfidence amongst pundits. Our unfailing attraction to overprecise and overconfident estimates causes a kind of market failure, where advisers with these attributes continue to attract fame and fortune despite their manifest failures, driving out more cautious opinions. In short, we get the pundits, gurus and advisers we merit, and we often pay a full price for it.
Obviously we should seek out more cautious advisers, and try to avoid people who provide spuriously precise estimates. Challenging the numbers is always a worthwhile exercise, although you’ll find it hard to get closure. If it’s your own money at stake, though, you needn’t bother arguing: just avoid them. Someone who constantly expresses overprecise expectations is a danger, and the only person they'll be generating money for is themselves.
However, there is a way to identify more cautious analysts
through simple analysis of their texts.
They tend to use the words “however” and “but” a lot: they hedge their
bets, which makes them less attractive to people who don’t care to think for
themselves, but at least tells us that they’re aware of their own
fallibility. But you might want to read
the research for yourselves, rather than just taking my word for it.
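As a crude sketch of the sort of text analysis that implies, the snippet below counts hedging words per thousand words of text; the word list and the two toy quotes are my own illustrative assumptions, not anyone's published method.

```python
import re

# Count hedging words per 1,000 words of text, as a rough proxy for how
# cautious a writer is. The word list and examples are illustrative only.
HEDGE_WORDS = {"however", "but", "although", "perhaps", "may", "might"}

def hedge_rate(text: str) -> float:
    """Return the number of hedging words per 1,000 words."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hedges = sum(1 for word in words if word in HEDGE_WORDS)
    return 1000 * hedges / len(words)

confident = "The index will finish the year at exactly 7,350. Buy now."
cautious = ("The index may end higher, but earnings might disappoint; "
            "however, valuations look reasonable, although risks remain.")
print(f"confident pundit: {hedge_rate(confident):.0f} hedges per 1,000 words")
print(f"cautious analyst: {hedge_rate(cautious):.0f} hedges per 1,000 words")
```

A higher rate doesn't make an analyst right, of course; it just suggests they know they might be wrong.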
Comments
Just ignore them all. Easy, isn't it?
Wonderful piece again Psyfi. You'll definitely get a mention in my forthcoming book.
As you say - this marketing of the confident view is all bound up in our own overconfidence and the worship of others who are likewise. My recommendation is to listen to the quieter, more thoughtful folk. Their few words are generally worth a lot more than the pundits'. Or as a good friend of mine (an Actuary) says - why use ten words when none will do.
I compared the error estimation techniques used in science with the decimal delusion of EPS forecasts: The Delusion of Decimals.
MrC, that was genius. I wish I'd thought of it :)
Great article - but shouldn't the last sentence read "But [,HOWEVER] you might want to read the research for yourselves, rather than just taking my word for it."
;-)
I thought starting one line in the last paragraph with "However" and another with "But" was enough. Good spot, though!
The preference for overconfident, overprecise advisors seems like just another example of the simple fallacy of confusing cause and effect. The thinking goes: since a true expert would offer precise advice and a non-expert would hesitate, the precise advisor must be an expert. Which is wrong in general and even more so in areas in which expertise is hard to measure, of course.
I doubt that sophisticated decision makers fall into this trap (in fact, overconfidence from an advisor is a tell-tale sign for caution), but untrained dummies most likely do. The "pundits" to which you refer are financial journalists/advisors/analysts whose crappy ideas are addressed to Joe Shmoe, so it all fits the picture.