The Psy-Fi Pages

Wednesday, 3 February 2016

HARKing Back: Lessons in Investing from Science

Confirm Ye Not

Here's what ought to be a really boring idea - we need scientists in general and psychologists and economists in particular to stop hypothesising after results are known (HARKing, geddit?). Instead they need to state what they're looking for before they conduct their experiments because otherwise they cherrypick the results they find to confirm hypotheses they never previously had.

The underlying problem is our old foe, confirmation bias. And the solution for scientists and social scientists alike is known as pre-registration. It would be no bad thing for investors to demand a similar process for fund managers and financial experts. Or, for that matter, to apply some of the ideas to their own investing strategies.

No No Negatives

It's been known for years that a lot of scientific research isn't very reliable. There are numerous problems, chief amongst them being the non-publication of negative results: an issue known as publication bias. There's no kudos in showing that your hypotheses were wrong, so researchers and corporations tend to bury the data, but it's still valuable information that should be shared: scientists see further by standing on the shoulders of others, and we shouldn't be encouraging them to shrug those shoulders off just because they've got bored.

Worse still, though, is the fact that many studies turn out not to be replicable. The ability to re-run an experiment and produce the same result is an absolute cornerstone of the scientific method: science works because it's not built on faith, it's constructed out of evidence. If it turns out that the evidence is unreliable then what's being done isn't science, it's more like religious studies with instruments. 

Or economics.

Repeat, Again

Once we move to the social sciences then the problems are even worse. Human beings are terrible things to experiment on, being inclined to change their minds, to develop opinions about the experiments and to second-guess what the researchers would like them to do, just to be nice.

All too many experiments in the social sciences turn out to be flawed because of social or situational factors that didn't seem important at the time. Given this, you'd think that repeating experiments to make sure the results held would be even more important for psychologists than it is for researchers in the hard sciences. Well, guess again.

According to research by Matthew Makel, Jonathan Plucker and Boyd Hegarty, only a little over 1% of psychology studies have ever been replicated. Everything else is simply a matter of faith in the integrity and lack of bias of the original researchers. Which is not science: in the words of John Tukey, quoted at the head of their paper:
"Confirmation comes from repetition. Any attempt to avoid this statement leads to failure and more probably to destruction."

Pre-Register

The best solution to this we've yet found is known as pre-registration: studies have to be registered in advance, and the hypotheses under investigation stated up front before the research is done. This prevents the experimenters from looking at their data after the event and picking out interesting positive correlations which they didn't control for, but which are likely to get published.

Where pre-registration has happened, the proportion of studies giving positive results has fallen dramatically: analysis of studies into treatments for heart disease has shown a frightening drop in positive results since pre-registration was mandated:
"17 out of 30 studies (57%) published prior to 2000 showed a significant benefit of intervention on the primary outcome in comparison to only 2 among the 25 (8%) trials published after 2000".
Some of this may be because the low-hanging fruit on the subject had been picked earlier, but it's a scary result all the same. It seems likely that, because researchers can no longer consciously or unconsciously pick the results they prefer, pre-registration removes much of the scope for confirmation bias - and the fall is so dramatic it places the previous results in question. And, of course, it's not clear how many of those have been replicated.
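
As a quick back-of-envelope check - mine, not the original analysis - the two quoted counts can be compared directly. Here's a minimal sketch in Python, assuming SciPy is available; the 2x2 table is just the 17-of-30 and 2-of-25 figures from the quote:

```python
# Compare the quoted counts: 17 of 30 pre-2000 trials positive versus
# 2 of 25 post-2000 trials. Fisher's exact test asks how likely a split
# this lopsided would be if pre-registration made no difference.
from scipy.stats import fisher_exact

pre_2000 = [17, 30 - 17]   # [positive, negative] before mandatory pre-registration
post_2000 = [2, 25 - 2]    # [positive, negative] after

odds_ratio, p_value = fisher_exact([pre_2000, post_2000])
print(f"pre-2000 positive rate:  {17 / 30:.0%}")   # 57%
print(f"post-2000 positive rate: {2 / 25:.0%}")    # 8%
print(f"Fisher's exact p-value:  {p_value:.4f}")
```

The p-value comes out very small, which is only to say the drop is too large to wave away as sampling noise.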

Creative Scientists

Pre-registration isn't universally popular: there is much rending of white coats and grinding of molars over the issue. Opponents argue that it risks putting scientists in a creative straitjacket. Although when respectable peer-reviewed journals start publishing papers alleging the existence of extra-sensory perception based on ...
"Anomalous processes of information or energy transfer that are currently unexplained in terms of known physical or biological mechanisms"
... then you have to wonder whether the creative juices maybe need a touch of reduction. Oh, and the results of this experiment don't seem to be replicable - bet they never saw that coming.

So, what other group of people do we know who are given to making ad-hoc hypotheses, investing loads of money in them, and then ignoring the results while cherrypicking specific successes in order to publicly claim that they were successful? OK, apart from politicians.

Investing Feedback

Investors have all of these faults, and a few more. If we truly wanted to become better investors then we'd pre-register our hypotheses - including our expected timescales - and then measure the results against those predictions. Doubtless the outcome would frequently be embarrassing, but the evidence that we do have suggests that getting real feedback about our performance is the only way to improve predictive capability in complex systems like the stockmarket (see: Depressed Investors Don't Need Feedback. Everyone Else Does).
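
To make that concrete, here's a minimal sketch of what a pre-registered investing hypothesis might look like. Everything here - the class, the field names, the ticker and the numbers - is hypothetical illustration, not an existing tool or a method from the post:

```python
# A toy "pre-registration" record for an investment thesis: write the
# hypothesis, the timescale and the predicted outcome down BEFORE buying,
# then score the result against that prediction when the review date arrives.
from dataclasses import dataclass
from datetime import date


@dataclass
class InvestmentHypothesis:
    ticker: str
    thesis: str               # stated up front, never edited afterwards
    registered_on: date
    review_on: date           # the pre-committed timescale
    predicted_return: float   # e.g. 0.15 for "+15% by the review date"

    def score(self, actual_return: float) -> str:
        """Compare the outcome against the prediction made in advance."""
        hit = actual_return >= self.predicted_return
        return (f"{self.ticker}: predicted {self.predicted_return:+.0%}, "
                f"got {actual_return:+.0%} -> {'hit' if hit else 'miss'}")


# Register now; score only when the review date comes round.
h = InvestmentHypothesis("ACME", "Margins recover as input costs fall",
                         date(2016, 2, 3), date(2017, 2, 3), 0.15)
print(h.score(actual_return=0.04))   # honest feedback, however embarrassing
```

The point isn't the code, it's the discipline: the prediction is timestamped and left untouched, so there's nothing to HARK with later.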

The other thing this would do would be to force us to face up to the reality that we can be successful by luck and can fail through no fault of our own. In complex adaptive systems we simply cannot predict every possible situation; we can only hope to be able to predict a little better than average. But a little better is enough to make a turn, so every percentage point improvement we can make is worth it.
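
To see why a small edge is still worth chasing, here's some purely illustrative arithmetic - the stake, horizon and return figures are my assumptions, not numbers from the post:

```python
# What one extra percentage point of annual return compounds to over
# twenty years on an illustrative 10,000 stake.
years = 20
stake = 10_000
for annual_return in (0.05, 0.06):
    final = stake * (1 + annual_return) ** years
    print(f"{annual_return:.0%} a year for {years} years: {final:,.0f}")
# Roughly 26,533 at 5% versus 32,071 at 6% - about a fifth more money.
```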

Commit and Document

So I wonder if some enterprising developer out there fancies setting up a pre-registration website for investors keen to improve their returns, rather than their personal status? Public commitment backed up by a positive rewards system has been shown to produce powerful results in a whole variety of situations. For example, in Tying Odysseus to the Mast: Evidence from a Commitment Savings Product in the Philippines, Nava Ashraf, Dean Karlan and Wesley Yin showed:
"Commitment-treatment group participants have a 12.3 (9.6) percent higher probability of increasing their savings by more than 20 percent after six (twelve) months, relative to the control group participants, and an 11 (6.4) percent higher probability of increasing their savings by more than 20 percent, relative to the marketing group participants. The increase in savings over the twelve months suggests that the savings response to the commitment treatment is a lasting change, not merely a short-term response to the new product"
I suspect that even a non-financial reward system based on peer support would facilitate uptake.

HARK, hear ...

Avoiding HARKing is the future of both the hard and the soft sciences. And the analogy holds for investors: if we don't have hypotheses about what we're investing in then we're simply the modern equivalents of astrologers. And, if we do have hypotheses, we should write them down and test whether they're right, not simply crow about the random successes and ignore the equally random failures.

It's worrying, of course, that this isn't already the basic investing process. But to be honest it's even more worrying that it doesn't seem to be the basic scientific process. Genius and creativity have their place in all human activity - Kepler came up with his third law of planetary motion by mapping orbits to harmonic ratios, believing these to be a sign of heavenly perfection. But Kepler was a mad genius who happened to be correct, so here's my hypothesis: relying on mad geniuses for humanity's future and your family's well-being is probably not prudent.

Publication bias added to the Big List of Behavioral Biases