The Monty Hall Problem


In September 1990 a reader of Marilyn vos Savant's Parade column asked the following question:

Suppose you're on a game show and you're given the choice of three doors: Behind one door is a car; behind the others, goats. You pick a door, say No. 1, and the host, who knows what's behind the other doors, opens another door, say No. 3, which has a goat. He then says to you, "Do you want to pick door No. 2?" Is it to your advantage to take the switch?

Marilyn's conclusion was that one should switch.

The confusion arises thus...

The game is really two games!

The first game is a three-door problem. Your odds of choosing correctly are 1 in 3.

The second game is a two-door problem. Your odds of choosing correctly are 1 in 2.

Because your odds are worse in the first game, any choice in the first game is more likely to be incorrect than any choice in the second. Marilyn and friends therefore say that you should switch. But they're comparing apples and oranges.

The probability for a door in the first game cannot be applied to the same door in the second game; it's a new game! The odds for all remaining doors increase when a door is removed from play. The door selected in the first game, which had a 1/3 chance, has a 1/2 chance in the second game.

Marilyn said...

...you should switch. The first door has a 1/3 chance of winning, but the second door has a 2/3 chance. Here's a good way to visualize what happened: Suppose there are a million doors, and you pick door number 1. Then the host, who knows what's behind the doors and will always avoid the one with the prize, opens them all except door number 777,777. You'd switch to that door pretty fast, wouldn't you?

No. Not unless he said that he was going to open all the doors except the one that I chose and the one with the prize. Think about that... If he opened all but mine, then I'd know I chose correctly. If he opened all but mine and one other, then I'd know I chose incorrectly.

What he actually did was open all the doors except mine and one other. Then we know which doors don't have a prize, but we still don't know which of the two remaining does. 50/50!

Okay, let's look at it this way. Let's say that the car is behind door A (with A being 1, 2 or 3, pick any one). Goats are behind doors B and C (the remaining doors). Here are the possible outcomes...

1st Choice   Exposed   2nd Choice   Outcome
A            B         A            Win
A            B         C (switch)   Lose
A            C         A            Win
A            C         B (switch)   Lose
B            C         A (switch)   Win
B            C         B            Lose
C            B         A (switch)   Win
C            B         C            Lose

So the odds are 50/50, whether or not you switch!

Hmm... Something doesn't seem quite right...

Let's expand the table to show where the car is, covering all the cases for all the doors...

Car Door   First Choice   Door Exposed   Second Choice   Outcome
1          1              2              Stay            Win
1          1              2              Switch          Lose
1          1              3              Stay            Win
1          1              3              Switch          Lose
1          2              3              Stay            Lose
1          2              3              Switch          Win
1          3              2              Stay            Lose
1          3              2              Switch          Win
2          1              3              Stay            Lose
2          1              3              Switch          Win
2          2              1              Stay            Win
2          2              1              Switch          Lose
2          2              3              Stay            Win
2          2              3              Switch          Lose
2          3              1              Stay            Lose
2          3              1              Switch          Win
3          1              2              Stay            Lose
3          1              2              Switch          Win
3          2              1              Stay            Lose
3          2              1              Switch          Win
3          3              1              Stay            Win
3          3              1              Switch          Lose
3          3              2              Stay            Win
3          3              2              Switch          Lose

Giving...

Stay: 6 wins, 6 losses. Switch: 6 wins, 6 losses.

50/50! Hmm... Something still doesn't seem quite right... Hey! The first choice was correct in half the cases. That can't be right; the first choice should be correct only a third of the time. Aha! The probabilities of the cases themselves are unequal. The table must be weighted...

Car Door   First Choice   Door Exposed   Second Choice   Outcome
1          1              2              Stay            Win
1          1              2              Switch          Lose
1          1              3              Stay            Win
1          1              3              Switch          Lose
1          2              3              Stay            Lose
1          2              3              Stay            Lose
1          2              3              Switch          Win
1          2              3              Switch          Win
1          3              2              Stay            Lose
1          3              2              Stay            Lose
1          3              2              Switch          Win
1          3              2              Switch          Win
2          1              3              Stay            Lose
2          1              3              Stay            Lose
2          1              3              Switch          Win
2          1              3              Switch          Win
2          2              1              Stay            Win
2          2              1              Switch          Lose
2          2              3              Stay            Win
2          2              3              Switch          Lose
2          3              1              Stay            Lose
2          3              1              Stay            Lose
2          3              1              Switch          Win
2          3              1              Switch          Win
3          1              2              Stay            Lose
3          1              2              Stay            Lose
3          1              2              Switch          Win
3          1              2              Switch          Win
3          2              1              Stay            Lose
3          2              1              Stay            Lose
3          2              1              Switch          Win
3          2              1              Switch          Win
3          3              1              Stay            Win
3          3              1              Switch          Lose
3          3              2              Stay            Win
3          3              2              Switch          Lose

Giving...

Stay: 6 wins, 12 losses, or 1 in 3. Switch: 12 wins, 6 losses, or 2 in 3.

Or, returning to the more concise form...

1st Choice   Exposed   2nd Choice   Outcome
A            B         A            Win
A            B         C (switch)   Lose
A            C         A            Win
A            C         B (switch)   Lose
B            C         A (switch)   Win
B            C         A (switch)   Win
B            C         B            Lose
B            C         B            Lose
C            B         A (switch)   Win
C            B         A (switch)   Win
C            B         C            Lose
C            B         C            Lose
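
Just to double-check the tables, here's a quick simulation, a minimal Python sketch of my own (the names in it are mine, nothing official). It plays the game at random many times, once always staying and once always switching, and tallies the wins...

    import random

    def play(switch, trials=100_000):
        """Play the game `trials` times; return the fraction of wins."""
        wins = 0
        doors = [1, 2, 3]
        for _ in range(trials):
            car = random.choice(doors)
            first = random.choice(doors)
            # The host opens a goat door that isn't the player's choice.
            exposed = random.choice([d for d in doors if d != first and d != car])
            if switch:
                # Switch to the one door that is neither chosen nor exposed.
                second = next(d for d in doors if d != first and d != exposed)
            else:
                second = first
            wins += (second == car)
        return wins / trials

    print("stay:  ", play(switch=False))  # comes out near 1/3
    print("switch:", play(switch=True))   # comes out near 2/3

Staying wins about a third of the time and switching about two thirds, in agreement with the weighted table.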

Here's a short and sweet explanation: The odds that you made the wrong first choice are 2 in 3. So if you get a chance to change your mind, take it! Assuming that the host isn't trying to trick you... like giving you the second chance only when your first choice was correct...

I think.
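
And that caveat about the host is worth taking seriously. Here's the same sort of sketch, but against a host who offers the second chance only when the first choice was correct; in that game, switching always loses...

    import random

    def tricky_host(trials=100_000):
        """The host offers a switch only when the first choice was the car."""
        offered = switched_wins = 0
        doors = [1, 2, 3]
        for _ in range(trials):
            car = random.choice(doors)
            first = random.choice(doors)
            if first != car:
                continue  # no second chance is offered this round
            offered += 1
            exposed = random.choice([d for d in doors if d != first and d != car])
            second = next(d for d in doors if d != first and d != exposed)
            switched_wins += (second == car)
        return switched_wins / offered

    print(tricky_host())  # always 0.0: every switch loses against this host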

*
* *

Some time later, while going through some old files, I came across a clipping from the Mensa Bulletin of March 1996. It was a letter to the editor from John Dwyer, and it appears to describe a related conundrum, a simplified version of the same sort of problem...

Here's an old problem I've never been able to solve:

Suppose there are two sealed envelopes, one of which contains twice as much money as the other — that's all you know. You may choose one, and the rule is you may switch whenever you want until you open an envelope (at which time the game is over).

You pick an envelope, and common sense tells you there's no point in switching. But using probability theory, you reason as follows: This envelope contains $x and the other contains $y. Either y=2x or y=x/2. So if I switch I stand to gain $x or lose $x/2, with equal probability. I'll switch.

Having switched, you think: either x=2y or x=y/2, so... I'd better switch back!

Is probability theory nonsense?...
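
For what it's worth, probability theory seems to survive a simulation. Fix the two amounts in advance, say $100 and $200 (my numbers, chosen arbitrarily for this sketch), and always switching neither gains nor loses anything. The suspect step is the claim that y=2x and y=x/2 are equally likely no matter what x turns out to be...

    import random

    def envelopes(small=100, trials=100_000):
        """Fix the pair (small, 2*small); compare keeping vs. always switching."""
        keep = swap = 0
        for _ in range(trials):
            pair = [small, 2 * small]
            random.shuffle(pair)
            first, other = pair
            keep += first   # total winnings if you keep your first envelope
            swap += other   # total winnings if you always switch
        return keep / trials, swap / trials

    print(envelopes())  # both averages come out near 150

Both strategies average the same $150, so the "switch, then switch back" argument must go wrong somewhere; the flaw, I think, is that the $x in "gain $x" and the $x in "lose $x/2" aren't the same amount of money.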

Modified 12 February 2009, written by Ailanto.