Oblique
Many of the great mistakes of history, including the problems financial markets have continually re-experienced, have been caused by a basic error of judgement – the idea that it’s possible to define, plan and control the outcomes of the world around us, despite the rampant uncertainty we daily take in our stride. So instead of relying on expert judgement and feeling our way carefully towards outcomes, we’ve found ourselves led astray by people with tunnel vision and a strong but unjustified confidence in their ability to navigate unerringly to a correct solution, whatever that might be.
This is the argument presented in a new addition to the popular literature on human decision making by the economist John Kay. At the heart of this book, Obliquity, are many arguments readers here will find familiar, but perhaps the most notable is the idea that those economists who argue that people are irrational are wrong. It’s the economists who misunderstand the nature of human decision making, not their subjects.
Not Direct
At the centre of Obliquity is the idea that we mostly don’t make decisions through some careful process of analysis – maximisation, or whatever term you want to apply to it – because the world is too complex to permit of such an approach in real life. This theory of direct decision making is not just wrong, but is also at the heart of some of the worst decisions in history, invariably made by people who thought that they knew what was right for everyone else.
Instead we end up with reasonable outcomes when we approach decision making obliquely – by using judgement and skill and, frankly, muddling our way through, making the best of the situation as we find it on a day-by-day basis. The book gives example after example of corporations that have succeeded in making their shareholders very rich by setting themselves objectives that have nothing directly to do with wealth creation – and also shows how often direct attempts to generate wealth lead to the exact opposite outcome.
Mathematical Muddles
Kay is particularly scathing about the management behind the kinds of mathematical model that underpin much of modern financial theory, especially the types of risk management model that propped up so much financial speculation before the collapse of markets back in 2008. He’s well qualified to be sceptical of the quantitative modellers as he used to be one, helping set up and run a company that developed and sold these models to other corporations. As he relates, he often wondered why his customers didn’t really use their models, until he realised that his company didn’t use them either.
Models of this type, properly constructed and carefully managed, can genuinely help thoughtful managers make decisions about how they should run their businesses. They’re not, however, substitutes for human decision making. Why this should be is beautifully explained by Kay in the middle sections of the book, showing how such models need to be constantly scrutinised against reality in order to make sure that they continue to apply. As he points out, much of the data used by corporations to manage risk merely shows that they are the survivors of a long cull – those that died out along the way won’t be the next ones to fail.
Franklin’s Gambit
Mostly, however, the modellers and their managers – who didn’t understand the models or the real world or their responsibility to marry the two together on a constant basis, or indeed anything other than their end-of-year bonus cheque – used the models simply to justify their own intuitions. They fell prey to Franklin’s Gambit, a behavioural error Kay names after Benjamin Franklin – the tendency of direct thinkers first to believe they know the right answers in an impossibly complex world and then to seek evidence to confirm this. As Franklin put it: “so convenient a thing it is to be a reasonable creature, since it enables one to find or make a reason for everything one has a mind to do”.
Or, basically, many of our political and business leaders make up their minds what they want to do and then go and find data to confirm their beliefs. In the worst cases they so control the data gathering process that it becomes impossible to even record disconfirming evidence. This is confirmation bias writ large and Kay shows how it’s led to some of the worst decisions of the last half century, from Vietnam to Enron.
Properly Rational
Should any of this seem like comfort to those who have an unswerving belief in the power of behavioural finance over old-style maximising economics, Kay provides little solace. Although the introduction of psychology into finance is a step in the right direction, acknowledging the difficulty of incorporating uncertainty into economics, all too often the proponents of the new approach insist on following the old-style belief that there’s a set standard of rational behaviour that people should aspire to. As we’ve seen before, however, this is the constrained and warped vision of rationality that economists think we should aspire to – a world in which being rational is to be consistent, even if that means ignoring the basic human social norms built up over millions of years of evolution.
The middle chapters of Obliquity show that, unfortunately for the rationalists, there’s usually more than one correct answer to every real-world problem, and whenever we act to solve a problem we change it. Expecting there to be a single, correct solution to any real-world issue in a terribly complex world where we can’t even define most of the problems is simply stupid, even though we generally prefer leaders who present the world in these terms. When we develop models of the world we’re usually abstracting and simplifying, and this leads to some terrible dangers when those models are used by people who don’t know what they’re doing:
“In the first decade of the twenty-first century banks persuaded themselves that risk management could be treated like a problem that was closed, determinate and calculable … We and they learned that they were wrong. The most widely used template in the banking industry was called ‘value-at-risk’ (VAR)…
… The risk models that financial institutions use ensure that it is very unlikely that they will fail for the reasons that are incorporated into the models. This does not mean that they will not fail, only that if they fail it will be for some other reason. Which is, of course, what happened”.
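The ‘value-at-risk’ template the quote refers to can be sketched in a few lines; the version below uses the simplest variant, historical simulation, with invented figures purely for illustration – it is not how any actual bank computed VAR. It also makes Kay’s point concrete: the estimate is built entirely from past returns, so it can only ever warn about the kinds of loss already in its history.

```python
# Illustrative sketch of one-day 'value-at-risk' (VAR) by historical
# simulation. Function name and data are invented for this example.
def historical_var(returns, confidence=0.95):
    """The loss exceeded only (1 - confidence) of the time, assuming the
    past distribution of returns describes the future -- the very
    assumption Kay criticises."""
    losses = sorted(-r for r in returns)        # losses as positive numbers
    index = int(confidence * len(losses))       # cut-off into the loss tail
    return losses[min(index, len(losses) - 1)]

# Ten invented daily returns (fractions, e.g. -0.03 is a 3% loss)
daily_returns = [0.01, -0.02, 0.005, -0.015, 0.02,
                 -0.03, 0.012, -0.007, 0.003, -0.01]
var_95 = historical_var(daily_returns, 0.95)    # 0.03, i.e. a 3% loss
```

A bank whose history contained no crash would report a reassuringly small VAR right up until the crash arrived – failure comes, as Kay says, “for some other reason”.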
Overflowing with Obliquity
Unless you’re of a mindset that desperately wants to see order in nature and impose structure on life, there’s very little to dislike in Obliquity. Perhaps, if anything, there are too many ideas – every page is overflowing with insights and examples drawn from history, all expressed with the clarity that long-time readers of John Kay’s Financial Times articles have come to expect. It’s a book that bears re-reading even if you’ve a passable knowledge of the background, despite being a thin volume.
The lesson is that we shouldn’t put ourselves in the hands of people who believe that they have the answers to all of our problems. Of course, such people are often exactly the sort of people who want to be in charge and in whom we instinctively feel and express confidence. Sadly, in a complex and uncertain world there are no straightforward answers; all we can do is muddle through one decision at a time, and we should be suspicious of anyone claiming to have a magic bullet. Especially as they usually don’t have a magic gun to fire it with.
The key is to know where we want to go without being foolish enough to think we know how we’re going to get there. To think obliquely is to accept and address the world as it really is. Hopefully, as Kay puts it, his book is a nudge in the right direction.
See more from the Psy-Fi Blog's Book Review page.
Related Articles: Ambiguity Aversion: Investing Under Conditions of Uncertainty, Quibbles With Quants, Unpredictably Rational