Monty Hall Three-Door Problem
Difficulty level: Junior high school. |
Instead of three doors and a prize, we can use three playing cards, one of which is red. This web page contains a simulation of the problem in JavaScript, and its results appear below. To see the JavaScript code, look at the source for this page. If you prefer C, here is C source code. To rerun the simulation, reload this page.
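The page's own JavaScript and C sources are not reproduced here. As a sketch only (an illustration, not the page's actual code), a small C program along these lines can deal the cards many times and count how often staying and switching win the red card:

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define TRIALS 10000

    int main(void)
    {
        int stay_wins = 0, switch_wins = 0;

        srand((unsigned) time(NULL));

        for (int i = 0; i < TRIALS; ++i)
        {
            int red   = rand() % 3;   /* Position of the red card.   */
            int first = rand() % 3;   /* Contestant's first choice.  */

            /* The dealer turns over a card that is black and was not
               chosen.  When both unchosen cards are black, this picks
               the lower-numbered one; which black card is shown does
               not affect whether switching wins.                     */
            int shown;
            for (shown = 0; shown == red || shown == first; ++shown)
                ;

            /* Switching means taking the card that is neither chosen
               nor shown; the three positions sum to 0+1+2 = 3.       */
            int second = 3 - first - shown;

            if (first == red)
                ++stay_wins;
            if (second == red)
                ++switch_wins;
        }

        printf("Staying won the red card   %d of %d trials (%.3f).\n",
               stay_wins, TRIALS, (double) stay_wins / TRIALS);
        printf("Switching won the red card %d of %d trials (%.3f).\n",
               switch_wins, TRIALS, (double) switch_wins / TRIALS);

        return 0;
    }

With enough trials, staying wins the red card about one-third of the time and switching wins it about two-thirds of the time.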
The simplest explanation is this: By chance, the contestant picks a black card first two-thirds of the time. Then when they switch cards, they get a red card instead. So switching gives a red card two-thirds of the time.
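In symbols, that observation is simply

    P(\text{switching wins}) = P(\text{first choice is black}) = \frac{2}{3}.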
Here is a more detailed explanation of why switching yields the red card two-thirds of the time:
Many incorrect explanations assume that because there are two doors or cards left to select from, they have an equal probability of being the prize or the red card. To see that this is incorrect, consider what happens if the player's first choice was or was not the red card. If it was, the dealer's action does not provide any information about the two other cards, since they are both black. But if the first choice was not the red card, the dealer's action identifies the red card. That is, the dealer provides information, which affects the probability. In the one-third of the cases when the initial choice is the red card, switching just gets you a black card. But in the two-thirds of cases when the initial choice is not the red card, switching takes advantage of the fact that the dealer has effectively identified the red card.

Imagine three thousand card tables. On each of the tables, a dealer has randomly arranged one red card and two black cards, face down. At each table, you randomly put a marker labeled "1" on one of the cards, representing your first choice.
It is obvious that about one thousand markers will be on red cards and about two thousand will be on black cards, from simple chance.
At each table, the dealer turns over a card that is both black and not under the marker. Now each table contains one face-up black card, one card with a marker, and one remaining face-down card. You put a marker labeled "2" on that last card at each table, indicating it is the second choice.
It is obvious that the cards under the "1" markers have not changed, so about one thousand of them are still red and about two thousand are still black.
Since each second-choice card must be red if the first-choice card is black and vice versa, about two thousand of the second-choice cards must be red. Switching must give a two-thirds chance of getting the red card simply because about two thousand of the three thousand cards under the "2" markers are red.
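The same counting can be checked exhaustively. As a sketch (not part of the original page), this C program enumerates the nine equally likely combinations of red-card position and first choice and scales the counts up to three thousand tables:

    #include <stdio.h>

    int main(void)
    {
        int first_is_red = 0, second_is_red = 0, total = 0;

        for (int red = 0; red < 3; ++red)           /* Position of the red card.   */
        {
            for (int first = 0; first < 3; ++first) /* Position of the "1" marker. */
            {
                /* The dealer shows a black card that is not under the marker. */
                int shown;
                for (shown = 0; shown == red || shown == first; ++shown)
                    ;

                int second = 3 - first - shown;     /* The card under the "2" marker. */

                ++total;
                if (first == red)
                    ++first_is_red;
                if (second == red)
                    ++second_is_red;
            }
        }

        /* Scale the nine equally likely combinations up to three thousand tables. */
        printf("First-choice cards that are red:  about %d of 3000 tables.\n",
               3000 * first_is_red / total);
        printf("Second-choice cards that are red: about %d of 3000 tables.\n",
               3000 * second_is_red / total);

        return 0;
    }

It reports about one thousand red first-choice cards and about two thousand red second-choice cards, matching the counts described above.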
Other incorrect explanations count the possible sequences of events and treat them as equally likely, even though they are not.
For proof, use the elementary rule of probability that the probability of A given B (the probability that event A occurs if you are given the information that B occurs) equals the probability of both A and B occurring divided by the probability of B occurring. Let A be the event that the first choice was the red card. Let B be the event that the dealer reveals a black card. The probability that the dealer reveals a black card is 1, since at least one of the two cards the player did not choose is always black. The probability that the first choice is red and the dealer reveals a black card is therefore just the probability that the first choice is red, which is 1/3. So the probability that the first choice is red given that the dealer reveals a black card is 1/3 divided by 1, which is 1/3. That is, after the dealer reveals a card, the probability that the first choice is red is still 1/3. Since there is only one other unrevealed card, the probability that it is red must be 2/3.
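In symbols, with A and B as defined above, the computation is

    P(A \mid B) = \frac{P(A \cap B)}{P(B)} = \frac{1/3}{1} = \frac{1}{3},
    \qquad
    P(\text{other unrevealed card is red} \mid B) = 1 - P(A \mid B) = \frac{2}{3}.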
© Copyright 1998 by Eric Postpischil.