God and the Economists
Apparently the Monty Hall test is one that Wall Street firms commonly use to see whether potential applicants are suitable material for the madness that is a trading desk in the heartlands of financeland. In using the test these firms are, knowingly or not, probing some of the hardest-to-overcome behavioural biases, the ones that routinely defeat legions of unprepared investors.
In fact, this particular test has been a source of delight for grant-seeking academics of all persuasions. The thing is that the Monty Hall test can be explained in many different ways, many of them mutually contradictory. However, it does increasingly look like this blatantly irrational behaviour is hard-wired into our neurons. It seems God really doesn’t like economists.
Stick or Twist?
For those who don’t know – and it’s hard to believe that there are many people left in that category – the Monty Hall problem is usually described in terms of a twentieth century gameshow hosted by the eponymous Monty. In the game contestants were given a choice of three closed doors, two of which hid goats and the third a car. Once the contestant had chosen a door Monty would throw open one of the other two ... to reveal a goat. The question is, on the assumption they prefer the car to the goat, should the contestant now stick with their original choice or twist, and select the other closed door?
The answer is that they should change their mind. This answer usually generates a whole host of angry responses, often from mathematicians and statisticians, but is correct under most reasonable assumptions. Basically the reason is that when you make your original choice the door you choose has a one in three chance of hiding the car, while there’s a two in three chance of the car being behind one of the other two. When Monty opens one of those doors your door’s odds don’t change, and neither do the combined odds on the other two doors – so the whole two thirds now rests on the single door he left closed.
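For the sceptical, the two-thirds advantage is easy to check empirically. Here is a quick Monte Carlo sketch (my illustration, not part of the original article) comparing the two strategies:

```python
import random

def play(switch, trials=100_000, seed=42):
    """Simulate many Monty Hall games; return the fraction won."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)    # door hiding the car
        pick = rng.randrange(3)   # contestant's first choice
        # Monty opens a goat door that isn't the contestant's pick
        opened = rng.choice([d for d in range(3) if d != pick and d != car])
        if switch:
            # move to the one door that is neither picked nor opened
            pick = next(d for d in range(3) if d not in (pick, opened))
        wins += (pick == car)
    return wins / trials

print(play(switch=False))  # ≈ 0.33 – sticking
print(play(switch=True))   # ≈ 0.67 – switching
```

A few hours with this beats a few hours with a deck of cards: sticking hovers around one third, switching around two thirds, exactly as the argument predicts.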
Disbelievers, Repent
Most of us – me included – initially find this unbelievable and spend hours frantically manipulating card decks in a furious attempt to show that this is incorrect. But it isn’t, and the key to understanding this is that Monty already knows where the car is, and his knowledge changes the odds. It’s an important lesson for stockpickers when they’re dealing with people who know more than them.
Now the thing about this sticking behaviour is that it is strictly irrational: once you sit down and figure out the probabilities it’s clear that the optimal strategy is to switch. Yet multiple experiments in many situations have repeatedly shown that virtually everyone fails to figure this out and, even worse, they don’t actually improve their performance with experience. In fact, even people trained with transparent doors largely failed to adapt their behaviour when faced with the normal problem.
Explanations, Explanations
It’s fairly easy to see why this has attracted so much research attention. If these behaviours – an inability to reason logically about a relatively straightforward problem and an inability to adapt to overcome this – transfer into other areas of human activity then the implications are profound. In particular, if people can’t figure out a basic probability conundrum in a real-life situation because they can’t reason about the impact of someone else with more knowledge than them, then you can’t hold out much hope for the average dealer in stocks.
A whole host of explanations for this result have been put forward, some of them downright contradictory. The first one is that people simply can’t perform the calculation needed to update the conditional probabilities after the first door’s opened – something they should be able to do if we’re the rational maximisers much of economics has insisted we are for a century or so. This failure of so-called Bayesian updating – named after Bayes’ Theorem, which describes how to calculate conditional probabilities – would suggest that our rationality problems stem from the difficulty we have in making these updated calculations. Note that this wouldn’t mean we aren’t fundamentally rational; it would just imply difficulties in performing the necessary calculations.
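To make the Bayesian update concrete, here is a sketch of the calculation (my worked example, not the article’s): the contestant picks door 1, Monty opens door 3, and the prior one-third-each odds are updated using the likelihood of Monty’s action under each hypothesis.

```python
from fractions import Fraction

# Prior: the car is equally likely behind each of the three doors
prior = {1: Fraction(1, 3), 2: Fraction(1, 3), 3: Fraction(1, 3)}

# Likelihood that Monty opens door 3, given the contestant picked door 1:
#   car behind door 1 -> Monty picks door 2 or 3 at random -> 1/2
#   car behind door 2 -> door 3 is his only legal move     -> 1
#   car behind door 3 -> he never reveals the car          -> 0
likelihood = {1: Fraction(1, 2), 2: Fraction(1), 3: Fraction(0)}

# Bayes' Theorem: posterior is proportional to prior times likelihood
evidence = sum(prior[d] * likelihood[d] for d in prior)
posterior = {d: prior[d] * likelihood[d] / evidence for d in prior}

print(posterior)  # door 1 stays at 1/3, door 2 rises to 2/3, door 3 drops to 0
```

The asymmetry in the likelihoods is where Monty’s knowledge enters the sums: his hand was forced only when the car is behind door 2, which is why that door ends up carrying the two-thirds probability.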
The other explanations rely more on the concepts of behavioural psychology. So this could be an outcome of the illusion of control, with participants believing that they’re exercising skill and that the sight of an empty door merely shows that they’ve already chosen correctly. Or it could be a side-effect of the status quo effect where people are markedly reluctant to give up something they already own in exchange for something else – probably because they’d feel worse if it turned out what they’d given up was more valuable than what they’d accepted. Sins of commission are always felt more than sins of omission.
Bad Decisions or Bad Design?
As usual with these types of anomalies, there’s no agreement on whether the underlying effect is caused by behavioural biases or by poorly implemented rationality, because it’s pretty well impossible to tell the difference. Importantly, these are not the same: poorly implemented rationality can be overcome while hard-wired behavioural bias probably can’t; and that makes the world of difference.
In Monty Hall’s Three Doors For Dummies, Andrea Morone and Annamaria Fiore produced an experimental design that controlled for Bayesian updating by introducing a Monty Hall problem that didn’t need it. They reasoned that if they could make clear that the choice was between a one third probability of the chosen door being correct and a two thirds combined probability of the other doors, then they ought to be able to make people aware of the real odds. So instead of opening one of the doors after the choice was made they simply offered the participant the option of keeping their current door or taking both of the other doors. It’s the same choice, of course, but it makes obvious that the odds favour changing.
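The variant can be sketched in the same Monte Carlo spirit (my reconstruction of the design, not the authors’ code): no door is opened, and the participant either keeps their single door or takes both of the others.

```python
import random

def variant(take_both, trials=100_000, seed=7):
    """Morone/Fiore-style choice: keep your one door, or take BOTH others."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)
        pick = rng.randrange(3)
        # Taking both other doors wins whenever the car isn't behind your pick
        wins += (car != pick) if take_both else (car == pick)
    return wins / trials

print(variant(take_both=False))  # ≈ 1/3
print(variant(take_both=True))   # ≈ 2/3
```

No conditional probabilities appear anywhere in the simulation, which is the whole point of the design: the one-third versus two-thirds gap survives with Bayesian updating stripped out entirely.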
Behavioural, Not Bayesian
Why is this important? Well, in this new formulation of the Monty Hall problem, no new information is presented to the participant. No door is opened so no recalculation of Bayesian conditional probability is required. The odds haven’t changed. If participants still insist on keeping their original choice it’s because they’re barking mad, or at least behaviourally challenged. So what happened?
Well, basically the new option led to more people switching and an overall improvement in the learning process, but not to the extent you’d expect if this was a Bayesian problem. The researchers concluded that failures of Bayesian updating don’t seem to be a major factor here; more likely you’re looking at some facet of the status quo effect. A word of caution on interpreting these results, however – it’s a genuinely ingenious study, but it’s based on a very small sample and needs further work.
Still, at root, it takes us further from rational economics and closer to behavioural explanations. If people can’t stop behavioural biases from preventing them solving relatively simple problems like this, what chance do they have of managing themselves in the maelstrom of markets? As usual, what Monty Hall proved is that it’s better to be opening the doors when you know what’s behind them than guessing blindly when you don’t.
Related Articles: Regret, The Lottery of Stock Picking, Pricing Anomalies: Now You See Me, Now You Don't
Wasn't aware of the "...Dummies" thing. Very simple. This is the first time I've been able to think this through without getting a headache.
“poorly implemented rationality can be overcome while hard-wired behavioural bias probably can’t; and that makes the world of difference.”
This blog entry is so good it's scary. This gets right to the heart of what the investing project is all about (in my view, of course).
It may be just that I am a willful optimist, but I am more optimistic than the words quoted above would suggest is appropriate. It's more than poorly implemented rationality we are dealing with. It really is a hard-wired behavioral bias. But I don't get why so many throw up their hands when they come to that conclusion. It's a good thing to know what we are up against. It tells us what sort of fight it is that we need to be fighting.
Say that the Monty Hall problem were discussed in every session of a class on logic. Then on the test the students were asked the solution to it. Would there still be a good number who would believe that the wrong answer was the right answer? Yes. If we are dealing with a hard-wired behavioral bias, people's emotions are going to lead them to that answer, no matter how many times they have been taught that it is wrong. This is why people throw up their hands.
But would the students really put that answer down on the paper if they had been told it was wrong in every single class? I don't think so. Social pressure is a psychological reality too. If we taught the right answer over and over and over again, most students would put down the right answer even if they didn't really believe it. This is how people are.
The point is -- behavioral biases can be overcome. You can't do it with pure rationality. That's fair to say. Since we are trying to do battle with behavioral biases, we need to look to some behavioral tools to do the job. Forget rationality! Forget lectures! Forget demonstrations! Apply social pressure. That will work. That will cause people to at least put down the right answer and putting it down will over time get them to actually kinda sorta believe it. Sometimes we don't say the things we know, we know the things we say.
It works this way in investing too. You can't just tell people what works and expect them to do it. No! You need to drill it and drill it and drill it. It's not logic that teaches, but repetition. If you drill it enough, people will catch on. This is not hopeless. We just need to accept that saying what works is not close to being enough to get the job done. You don't fight emotion with rationality. You fight emotion with emotion.
Rob
@Rob Bennett,
The trouble is that real-world applications of psychology are so specific - it is hard to apply social pressure via a general principle when that principle is not recognized.
For instance, I have encountered people who are familiar with the Principle of Restricted Choice (basically the Monty Hall situation applied to a finessing situation in bridge), but who nonetheless put up a stiff fight against switching at Monty Hall.
Ok...I have two choices. I can keep my door or I can switch. If I keep my door my chance of getting the car is 1/3.
Therefore if I switch my probability has to be 2/3.
By choosing a door, Monty Hall gives away a lot of info. In 2 out of 3 cases he tells you, in effect, where the car is.
This is ludicrous. You had two goats, now one. You had one car, still one. If there are two doors, the odds can only be a multiple of 1/2. Changing the number of possibilities does change the odds. Looked at another way, it doesn't matter which door you select as the possible outcomes are the same for each door.
A) 1/3 is sticking with your 1 choice out of 3 doors
B) 1/2 is flipping a coin after the door is opened, as this means it's not dependent on the original 1 in 3 choice
C) 2/3 is always switching, which amounts to taking the 2 doors you didn't choose. 2 times out of 3, one of these has the car
Strategy C wins.
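Strategy B above is the one most people miss, so it’s worth checking separately. A quick simulation (my sketch, in the same spirit as the comment) confirms that flipping a coin between the two unopened doors really does win only half the time:

```python
import random

def coin_flip_strategy(trials=100_000, seed=3):
    """After Monty opens a goat door, pick at random between the two left."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)
        pick = rng.randrange(3)
        # Monty opens a goat door that isn't the contestant's pick
        opened = rng.choice([d for d in range(3) if d != pick and d != car])
        # Coin flip between the two doors still closed
        final = rng.choice([d for d in range(3) if d != opened])
        wins += (final == car)
    return wins / trials

print(coin_flip_strategy())  # ≈ 0.5
```

Randomising after the reveal throws away the information in the original choice, which is why its one-half win rate sits between sticking’s one third and switching’s two thirds.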
Having read the paper linked, I wonder if I'm being punked. Did this really get published? No wonder most economists couldn't see the housing bubble. Initially, each door has a one in three chance of winning. If indeed I was allowed to choose 2 doors, I would have a 2/3 chance of finding the prize.
However, assuming you are correct, and a door has a 2/3 chance of having the prize both before and after opening a door, then I was a fool not to choose it in the first place. So, let's say I choose door number one, but without opening one: now doors number 2 and 3 each have a 2/3 probability of having the prize? We now have a 5/3 total probability?
Perhaps it's a good test because it filters out anyone who can think mathematically (e.g. Charlie Munger), clearly a disadvantage to the large firms on Wall Street.
OK, one last try.
So, we opened one of the doors and the two remaining doors have probabilities of one in three and two in three. I switch my choice and open the 2 in 3 chance door. It's empty (or a goat). Now the one remaining door has a one in three chance of containing the car.
So, I eliminate two doors, knowing the car is behind one in three, and am left with a two in three chance of finding nothing (or a goat).
It appears that eliminating doors does change the odds!
Here is an Excel simulation showing both actions and an explanation of the 2 logic choices made for filtering the 2nd choice: http://www.gogerty.com/?p=124
Great post!
There is a simulation here: http://serendip.brynmawr.edu/exchange/threedoors
if you don't believe the probabilities. Play a bunch of times using both strategies and see how it comes out!
In a gameshow setting where the host ALWAYS gives you a chance to switch, the equation works. If you go into the game knowing you are going to be offered a chance to switch after you've made your initial choice, it is obviously the better choice to switch doors. However, in a PRACTICAL real-life setting the probability changes. Suppose you are striking a deal with someone, like say a panhandler on the street. Let's say you pay a panhandler $10 to play a game where you have a chance to choose between three cups, one of which has $100 under it. However, after you have chosen, the panhandler unexpectedly gives you a *second chance* to switch and reveals that there is nothing under one of the cups. COMMON SENSE dictates that the panhandler has nothing to gain by giving you a 'second chance' if you have actually CHOSEN CORRECTLY. The most likely reason he would offer you a chance to switch is that he KNOWS you chose the right one and is trying to trick you into switching. I think that instinctive common-sense knowledge like this interferes with people's ability to understand the logic behind the principle in a gameshow setting where it really is just a matter of probability and you don't have to worry about tricks and such.