The Psy-Fi Pages

Thursday, 10 March 2016

Less Is More

Error, Human

Much market analysis operates on the assumption that more data is better – that more data leads to more accurate results. More data may require more complex processing, leading to ever greater requirements for computing power, but, in principle, the idea is that more is better.

Out in the real world, however, we don’t have the luxury of this kind of analysis. This leads to errors, which we sometimes call biases. But, surprisingly, it also often leads to better results. It may just be that we make so many mistakes because we’re trying to process too much information, not because we’re naturally error prone.

Math Good, Instinct Bad

There’s a type of snobbery that’s grown up around thought processes: logical or statistical analysis is good, gut instinct is bad. But, by and large, we’ve managed to be a fairly successful species without the majority of us being able to figure out the math behind, well, anything. Of course, the reality of everyday life is that we don’t have the luxury of carefully computing every decision, so the fact that we have ways of making quick and snappy choices isn’t accidental; it’s deliberate: we're designed that way.

Now the idea that we don’t always make carefully analyzed choices based on all the available information is often presented as sub-optimal behavior, usually filed under the label of cognitive biases. We’re biased, and illogical, because we don’t perform the type of analysis that a bunch of academics think we should. And, of course, the word “bias” is pejorative – it implies that we’re doing something wrong, the idiot bunch of badly shaved apes that we are.

But this is a ridiculous standard to hold people to. We make thousands of decisions every day; it simply doesn’t make any sense to carefully analyze every single one of them. We need short-cuts to cut the load, and we call these heuristics: basic rules of thumb that allow us to make good enough decisions to get through the days and weeks and months that constitute our lives.

Good-Enough Not Good Enough?

This split between “good” logical analysis and “bad” biased heuristics can be traced back to the work of the founders of the research into behavioral biases. On one hand we had Amos Tversky and Daniel Kahneman propounding a logical standard of behavior that later morphed into the two-system theory of reasoning – the intuitive System 1 and the logical System 2. On the other we had Herb Simon asking questions not about how we should reason, but about how we do reason – a concept we now refer to as bounded rationality.

This difference may seem a bit esoteric but it actually turns out to be fundamental to how we do basic analysis. The logical analysis approach assumes that we assess each and every option, assign probabilities and values to each, and then select the best one. The good-enough approach assumes that we only look at a few options, filtered by some mental short-cut, and then choose the one that feels best.
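To make the contrast concrete, here’s a toy sketch – my own illustration, not anything from the studies discussed below, with made-up payoff numbers and an arbitrary aspiration level – of the two approaches: exhaustively scoring every option versus Simon-style satisficing, which takes the first option that’s good enough.

```python
# Toy illustration: the same options evaluated two ways.
# The payoff estimates and the 0.06 aspiration level are invented for the example.
options = {"fund_a": 0.062, "fund_b": 0.055, "fund_c": 0.071, "fund_d": 0.048}

def maximize(opts):
    """The 'logical' approach: score every option and pick the best."""
    return max(opts, key=opts.get)

def satisfice(opts, aspiration=0.06):
    """The 'good-enough' approach: take the first option that clears a threshold."""
    for name, payoff in opts.items():
        if payoff >= aspiration:
            return name
    return None  # nothing met the aspiration level

print(maximize(options))   # fund_c - but only after checking everything
print(satisfice(options))  # fund_a - good enough, and the search stops early
```

The satisficer looks at less, computes less, and still walks away with a perfectly serviceable choice.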

Here’s the rub, though. The underlying assumption is that even if we actually make decisions using good-enough heuristics we shouldn’t, especially for choices we can make in slow time – stuff like investing, for instance. The gold standard here is that we collect as much information as we can, apply rigorous logic and come to the optimal solution.

Tied

Yet when researchers actually look at the performance of logical analysis versus good-enough heuristics, it turns out that the results aren’t quite what we’d expect. Markus Wübben and Florian von Wangenheim looked at how managers decide whether or not customers are active, and forecast future revenues based on these decisions. On one hand there are sophisticated Bayesian models, and on the other there are the rough rules of thumb that managers normally use – in this case something called the hiatus heuristic: if a customer has not purchased for more than a certain number of months, he or she is considered inactive; otherwise, he or she is considered active.
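As a rough sketch – the nine-month cutoff below is an illustrative assumption on my part, not a figure taken from the study – the hiatus heuristic amounts to little more than this:

```python
from datetime import date

def is_active(last_purchase: date, today: date, hiatus_months: int = 9) -> bool:
    """Hiatus heuristic: a customer who hasn't bought anything for more than
    `hiatus_months` is treated as inactive; otherwise they're active.
    The 9-month cutoff is illustrative - in practice it's whatever the
    manager's experience of the business suggests."""
    months_since = (today.year - last_purchase.year) * 12 + (today.month - last_purchase.month)
    return months_since <= hiatus_months

print(is_active(date(2015, 1, 15), date(2016, 3, 10)))  # False: 14 months of silence
print(is_active(date(2015, 11, 2), date(2016, 3, 10)))  # True: bought four months ago
```

One comparison, one threshold, no parameter estimation – and that’s the whole model the Bayesian machinery was up against.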

The result was a clear tie: the sophisticated models were no more accurate than the biased rules of thumb:
“We find no clear evidence for the superiority of these models for managerially relevant decisions in customer management compared with simple methods that our industry partners used”.
The underlying argument from the heuristics movement is that looking at cognitive processing alone is simply stupid. You have to look at the environment the decision is being made in as well. Herb Simon expressed this as well as anyone:
“Human rational behavior (and the rational behavior of all physical symbol systems) is shaped by a scissors whose two blades are the structure of task environments and the computational capabilities of the actor.”

Or, the circumstances in which we make decisions determine the way we make them and the conclusions we come to – an issue generally known as ecological validity. The point being that experiments carried out in the laboratory showing that humans are “irrational” and “biased” only tell us something about how we operate in the laboratory; we can’t know for sure that these behaviors translate into the real world.

Sunk Costs

The classic heuristics and biases experiment, the one where most of the behavioral economics research started, was the original Tversky and Kahneman experiment that demonstrated loss aversion – we are risk averse in the presence of a gain and risk seeking in the presence of a loss. This underlies the sunk cost fallacy: we are not rational about sunk costs; we don’t judge our investment decisions on the current situation but are framed by our history and experience.
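As a rough illustration of the pattern – using the prospect theory value function with Tversky and Kahneman’s commonly cited 1992 parameter estimates, and ignoring probability weighting to keep things simple – a sure $500 gain beats a 50/50 shot at $1,000, while a sure $500 loss is worse than a 50/50 shot at losing $1,000:

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect theory value function with Tversky & Kahneman's 1992 estimates:
    concave for gains, convex and steeper (loss averse) for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

# Gains: a sure $500 versus a 50% chance of $1,000
print(value(500), 0.5 * value(1000))    # ~237 vs ~218 -> take the sure thing
# Losses: a sure -$500 versus a 50% chance of -$1,000
print(value(-500), 0.5 * value(-1000))  # ~-534 vs ~-491 -> take the gamble
```

Same people, same arithmetic, opposite attitudes to risk depending on which side of zero they’re standing on.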

At the time this was a shock for economists, who believed that we were rational actors – at least by their own definition of rationality, a definition that more or less still runs the world. But when John List looked at loss aversion in a field experiment involving the trading of sports memorabilia – essentially a naturally occurring environment – he discovered that while loss aversion did occur for inexperienced traders it didn’t for experienced ones.

Of course, in the laboratory everyone was a novice, so the original findings were correct, but incomplete. People do learn, and their behavior is different in different environments. Experienced investors don’t suffer as much from loss aversion as inexperienced ones.

The Equality Heuristic

In fact, as discussed in this paper by Gerd Gigerenzer and Wolfgang Gaissmaier on Heuristic Decision Making, heuristics can be more accurate than more complex “rational” methods. This, the so-called less-is-more effect, is neither rational nor irrational – whether it works depends on the environment. What fails in the lab may work on the trading floor.

In fact the critical thing seems to be that people need to learn which heuristics to use in which situations. The basic rules of thumb we use to determine what to invest in may change as market conditions change. People who are experienced at trading in stable conditions may find themselves reduced to novices in those rare conditions of uncertainty – periods like the oil crisis in the 1970s or the financial crisis in 2007. As it turns out, a very simple 1/N asset allocation rule beats most complex alternatives – a finding we discussed in the Turkey Illusion. A 1/N strategy – the equality heuristic – is a pretty good way of using the less-is-more effect to decide how to diversify.
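The equality heuristic is about as simple as a sketch can get – the asset list below is purely hypothetical:

```python
def one_over_n(assets):
    """Equality heuristic: ignore estimated returns and covariances entirely
    and give every asset class the same weight."""
    weight = 1.0 / len(assets)
    return {asset: weight for asset in assets}

# Hypothetical asset classes, purely for illustration
print(one_over_n(["US equity", "intl equity", "bonds", "property", "cash"]))
# each gets a weight of 0.2
```

As one of the comments below points out, though, the N has to span genuinely different asset classes – splitting your money equally across a dozen near-identical equity funds diversifies nothing.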

Be Rational, Do Nothing

So, overall, we need to be careful about trying to eliminate our biases. Gathering vast amounts of information is not likely to be of much use to our decision-making processes. On the other hand, developing a quick set of basic filters to eliminate most investment choices is pretty much exactly what our brains do on a daily basis, and it’s not the worst approach by a long way.

The other takeaway from this, though, is that when we find ourselves in unfamiliar circumstances we should be very careful about applying the heuristics that have served us well in other situations. Many good professional managers will simply sit on their hands during market upheavals, reasoning that if they can’t predict what’s going to happen next then making any decision is simply guesswork.

We’re not really irrational or biased, but sometimes our rules of thumb lead us astray when we apply them to situations they’re not designed for. Sometimes the only rule we should follow is to do nothing – but often that’s the hardest rule of all to follow.

3 comments:

  1. Just looking for a few good heuristics, otherwise known as proverbs. A word to the wise is sufficient.

  2. In my opinion, this is one of your best articles.

  3. I teach a section on Behavioral Finance to my investment students. I encourage them to read your blog. In this article, just one comment on the 1/N strategy: the components of the N have to be diversified; I'm referring to U.S. 401(k) programs that offer many equity funds and no or few bond funds. In those cases, 1/N doesn't result in diversification. I'm being picky. I really enjoy your blog.
