The Monty Hall problem - my head hurts!

Jon

Administrator
Staff member
#1
OMG, I can't get my head around this one!

Suppose you're on a game show, and you're given the choice of three doors: Behind one door is a car; behind the others, goats. You pick a door, say No. 1, and the host, who knows what's behind the doors, opens another door, say No. 3, which has a goat. He then says to you, "Do you want to pick door No. 2?" Is it to your advantage to switch your choice?
Source: Wikipedia

What would you do and why?

Don't look up the solution first if you haven't yet seen this before.
 

The_Doc_Man

Founding Member
#2
I have seen this and vehemently disagree with the conclusion stated for the truly random case. However, there is the issue that the game-show host knows the answer and thus is not making a random statement.
 

Jon

Administrator
Staff member
#3
Yes, the host knows which doors have the goats. But when you are left with two choices, intuitively it looks like a 50:50 chance. Yet it is a 1/3 chance of winning if you stay put and a 2/3 chance of winning if you elect to switch. But then, if you bring in a new person and let them choose between the two remaining doors, surely that is 50:50? So how can one person have a 1/3 chance and a different person a 1/2 chance?
 

The_Doc_Man

Founding Member
#4
My contention is that your odds SHOULD be 50-50 AFTER you are asked the second question because NEW INFORMATION has been added - i.e. you no longer worry about the door that got revealed. That NEW choice should be 50-50 if it were not for the host guiding the situation. I have seen the studies that were done (even on MythBusters) and I STILL don't buy it. The original odds were 1/3, and that is no problem. But saying the new odds are 2/3 to win by switching? That is the one I can't buy, because switching is a new decision.
 

Jon

Administrator
Staff member
#5
Yes, I am with you on that one, Doc. My rationale is slightly different. If you are presented with a choice of two options and you don't know which door has the car, you have a 50:50 chance of getting it right. So when they narrow your choice down to two options and ask if you wish to switch, that is the same as saying, "Pick either of the two choices," i.e. 50:50.

However, I kind of get it a bit more using the 100-door example. I pick door #1. The host then opens 98 other doors, showing goats, but leaving one door closed. My door has a 1/100 chance of being the right door. The other "set" has a 99/100 chance of containing the car. But the host eliminated 98 of those doors. So, although your new choice is basically door #1 or door #remaining, you "know" that there is a 99/100 chance of the car being within that other set. With 3 doors, I just couldn't get the insight. With 100 doors, I could.

Aha, I think the reason it is not a 50:50 chance is that we have the knowledge that the other set was pruned from 99 doors to 1. So if asked to switch, we are not just choosing between two random doors. We are choosing between the door we originally picked and the door that the other set was pruned down to. It is a bit like making a jus when cooking! You boil it down to the essence, with just the remaining flavour and little water. That is what is happening when you prune the 99 doors to 1: you are making it more concentrated, going from weak probability to strong probability. That kind of makes sense to me!

Apparently, you can prove this concept empirically. They suggest using 3 playing cards. Find the Jack, or something along those lines. Then keep repeating and keep a tally. Anyone got a deck?
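If anyone would rather not hunt for a deck, here is a quick Python sketch of the same experiment (just my own mock-up of the rules as described, nothing official) that tallies how often "stay" and "switch" win over many rounds:

import random

def play_round(switch, doors=3):
    car = random.randrange(doors)     # door hiding the car
    pick = random.randrange(doors)    # contestant's blind first pick
    # The host leaves exactly one other door closed: the car's door if the
    # contestant missed it, otherwise a randomly chosen goat door.
    if pick == car:
        closed = random.choice([d for d in range(doors) if d != pick])
    else:
        closed = car
    final = closed if switch else pick
    return final == car

trials = 100_000
for label, switch in [("stay", False), ("switch", True)]:
    wins = sum(play_round(switch) for _ in range(trials))
    print(label, wins / trials)

With 3 doors it comes out at roughly 1/3 for staying and 2/3 for switching.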

Edit: An additional insight is that we know the other set was made into a jus! If a new person came along and only saw two doors, they would have a 50:50 chance of picking the right door, since they wouldn't know the cook had been at work on one of those doors. However, since we are aware that Jamie was doing his thing, we know that our door is the dilute one and the other door is the concentrated one. Therefore, this additional knowledge tells us we must switch for a better chance.

What do you think Doc? Did that make sense to you?
 

The_Doc_Man

Founding Member
#6
When the amount of information in the game changes, the odds have to change.

When you pick a door and a different door is revealed as a zonk or as a mediocre non-zonk, the odds of the "good" prize change. Before all of that picking, the odds of the good prize being behind a particular door were one in three, with no reason to pick any door over any other. After the pick and first reveal, the good prize is known to be behind one of two doors. The odds are 0% that the good prize is (WAS) behind the door first revealed. I.e. the odds changed for that door completely from 33% to 0%. The odds should change symmetrically for the other two. The ONLY fly in the ointment is that the host knows the right answer so HIS choice of reveal was not random.

I've actually seen it done (on MythBusters) and there is still something wrong with it. It is something in the way the problem is being expressed, but I can't put it into words at the moment.
 

Jon

Administrator
Staff member
#7
The fact that simulations show the odds of getting the car rise to 2/3 when you switch (in the 3-door example) suggests that our own perspective is flawed. Reality is the ultimate arbiter of truth.

I'm getting this Monty problem thing now, having discussed it.

Yes, when the amount of information changes, the odds do change. This is why you should switch: you know the other set is the pruned version, and with every door that is pruned, the probability of the remaining doors in that set hiding the car goes up.

To illustrate this, let's say you have a galaxy with a trillion planets. Assume we know one planet has life and all the others don't. You choose planet #364. We know that, at this point in time, you have a 1 in a trillion chance of having chosen the correct planet. Therefore, life is almost certainly on one of the other planets - let us call it 99.9999999% certain. Given that, if you remove every lifeless planet except one from that group, it is almost certain that the planet left standing is the one with life. So, because you chose planet #364 at random from a trillion planets, and it is almost certain that the other remaining planet contains life, you should switch.

Make sense?

Or another angle for you...

You have 100 oysters. One has a pearl inside. You know that, picking at random (which is all you can do), you have a 1/100 chance of getting the pearl. But let's say someone opened all the oysters but two, revealing no pearl. You would now correctly assume that your probability of picking the pearl has gone from 1/100 to 1/2, i.e. pruning increases the odds. This principle carries forward to the other planets and the other doors: pruning down concentrates the odds. It does not improve the odds of the door you chose, because that was not pruned. Only pruning concentrates.

Note that in my oyster example, you are pruning the entire group, not the group that excludes your initial choice, and that makes ALL the difference.
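To put numbers on that distinction, here is another quick Python sketch (again my own illustration, assuming a host who knows where the pearl is) comparing the two set-ups - pick first and then prune only the others, versus prune the whole group first and pick afterwards:

import random

N, trials = 100, 100_000

# Monty-style: you pick first, then a knowing host opens 98 of the OTHER
# oysters, never your pick and never the pearl.
switch_wins = 0
for _ in range(trials):
    pearl = random.randrange(N)
    pick = random.randrange(N)
    if pick == pearl:
        survivor = random.choice([o for o in range(N) if o != pick])
    else:
        survivor = pearl
    switch_wins += (survivor == pearl)
print("Monty-style, switching wins:", switch_wins / trials)    # about 0.99

# Oyster-style: the whole group is pruned down to two (pearl still in play),
# and only THEN do you pick one of the two at random.
late_pick_wins = 0
for _ in range(trials):
    pearl = random.randrange(N)
    decoy = random.choice([o for o in range(N) if o != pearl])
    late_pick_wins += (random.choice([pearl, decoy]) == pearl)
print("Oyster-style, late pick wins:", late_pick_wins / trials)   # about 0.50

The only thing that changes is when your pick happens relative to the pruning, and that is exactly what moves the odds from 99/100 down to 1/2.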
 

The_Doc_Man

Founding Member
#8
In the final analysis, I have 50-50 odds that the (presumed) pearl is in the one I picked or the one that is left (assuming the odds to be correct). It is a matter of perspective in light of new evidence.

This is why you should switch: you know the other set is the pruned version, and with every door that is pruned, the probability of the remaining doors in that set hiding the car goes up.
The probability of the remaining doors MUST go up symmetrically. There is, by the rules of the game, a 100% chance that there is a car behind one of three doors. The odds, in the absence of other information, are 1/3 for each door. Now we see one door and the odds change. The door that was revealed drops to 0 chance, but the odds STILL must total 100% (that there is a car behind one of the doors). We STILL don't know which one, but this HAS to happen: The odds associated with the revealed door (1/3) have to be redistributed because they have to total 100%. I.e. they didn't change the game and take away the car. So the total HAS to be the same and one of the doors is now 0. So the OTHER doors must still total 100%.

What I ABSOLUTELY CANNOT SEE is why the odds would be redistributed asymmetrically. The answer by experiment SEEMS to be that the 1/3 odds that have left the revealed door go entirely to the other door, and that uneven distribution CANNOT HAPPEN. There is no reason for the asymmetry. The odds SHOULD go 1/2 of 1/3 (= 1/6) to each remaining unrevealed door, which becomes 1/3 + 1/6 = 3/6 = 1/2. Which is 50-50.
 

Jon

Administrator
Staff member
#9
Let me try a Socratic approach to see if we can agree on some granular stuff.

1. If you have one million doors, would you agree that the probability of the car being behind one of the non-selected 999,999 doors would be 999,999/1,000,000?

2. Can we call this 999,999/1,000,000 "almost certain"? Is that fair?

3. Can we call the 1/1,000,000 chance "extremely unlikely"? Is that fair?

4. If you agree to 2 and 3, can we say that it is extremely unlikely that you selected the door with the car and that it is almost certain that the car is behind the other set of doors?

5. If you agree to #4, do you accept that whatever you decide to do to the non-chosen set of doors, it will always remain that it is "extremely unlikely" you have chosen the correct door?

6. If you prune all those doors in the "almost certain" group down to just one door, do you still agree that the chance of a car being behind that door is "almost certain"?

I am making it granular so I can identify which specific point we are at odds over.

If you can answer Yes/No/Unsure to questions 1 to 4, we might get to discuss the specific point in dispute.

[Note: if I only use 3 doors, I cannot see it. I have to use huge numbers of doors to get the clarity.]
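To put rough numbers against points 1 to 6, here is a tiny Python sketch. It leans on one fact from the set-up above: with the host pruning every goat door but one, switching wins exactly when your first pick was wrong.

import random

doors, trials = 1_000_000, 100_000
# Put the car behind door 0; the contestant's pick is random, so staying
# wins only when that pick happens to land on door 0.
stay_wins = sum(random.randrange(doors) == 0 for _ in range(trials))
print("stay wins:  ", stay_wins / trials)       # almost always 0.0 here
print("switch wins:", 1 - stay_wins / trials)   # essentially 1.0

"Extremely unlikely" versus "almost certain" indeed.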
 
The_Doc_Man

Founding Member
#10
Steps 5 & 6 are where the failure occurs. You start with a million doors, each of which has a small but non-zero probability of being the "YES" door. Your first choice (selecting a door) CANNOT CHANGE THE ODDS. The car is still where it always was.

As you reveal the hundreds of thousands of "wrong" doors, the probability changes each time, distributing that fraction of "YES" to the remaining doors. The total must never be anything OTHER than 100%, INCLUDING THE DOOR YOU CHOSE, as long as that door has not been revealed as "NO". The odds keep increasing in your favor (fractionally) with every revealed "NO" door. It is a matter of whether you look at it as one problem or 999,999 problems. There will be 999,998 redistributions of odds to get to the "two doors remaining" case. At that point, the odds should be 50/50 for me to choose to stay or swap.
 

Jon

Administrator
Staff member
#11
Doc, you haven't stated whether you answer Yes to my first 4 points?

I have thought up another way of explaining this!

I am the host of the show. There are 10 doors. Since you believe there is always a 50:50 chance of getting the car when two doors are left, you see no reason to choose any door except door #1, and you never switch. As the host, I have no prior knowledge of your strategy, and I set the show's strategy in stone before we play.

As the host, I decide to put the car behind door #1 on the first go, door #2 on the second go and so on until you have played the game 10 times.

So what are the results of this experiment?

#1. You win the car.

#2. You get a goat.

#3. You get a goat.

#4...#10 You get a goat.

So, in this example, you play 10 times and, because you see no point in switching, you win only once out of 10. But if you switched every time, you would win 9 times out of 10.
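Spelled out as a few lines of Python - just the arithmetic of the thought experiment above, nothing clever:

# Fixed schedule: the car is behind door g in game g; you always pick door 1
# and the host prunes every other goat door.
stay_wins = switch_wins = 0
for game in range(1, 11):
    car, pick = game, 1
    stay_wins += (pick == car)
    switch_wins += (pick != car)   # the lone leftover door must hide the car
print(stay_wins, "wins by staying,", switch_wins, "wins by switching")   # 1 and 9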

Do you agree with that?
 
The_Doc_Man

Founding Member
#12
No, because once I won the car I would stop. (Yes, I know that is evasive.)

Your problem is whether I am required to play the same door each time. If I am, then you are right. If I am not, my expectation value is 3 cars if I play 10 times, or 10 cars if I play 30 times.
 

Jon

Administrator
Staff member
#13
Your problem is whether I am required to play the same door each time. If I am, then you are right. If I am not, my expectation value is 3 cars if I play 10 times, or 10 cars if I play 30 times.
Actually, it doesn't matter whether or not you are required to play the same door each time, and empirical reality supports my viewpoint. I did that to try to simplify the example. But obviously you can see from this example that if that is the strategy you chose, then it would not be 50:50, correct? You have just admitted that right there. So, how can you say that it is 50:50 when in fact I can cite this strategy which is not 50:50? A contradiction, don't you think?

Let me run an empirical experiment for you, our own Monty game. I have 10 cards here. One is a Joker. I will shuffle and pick a card at random. Then I will prune the non-picked cards down to just one card, throwing out only non-Jokers as I go.

Your hypothesis suggests that, once we are down to two cards, my first pick should hold the Joker 50% of the time. Mine is that it only holds the Joker 1/10 of the time, and therefore you should always elect to switch. Let's play!

Monty Trials:

1. Non-Joker. I should have switched.

2. Non-Joker. I should have switched.

3. Non-Joker. I should have switched.

4. Non-Joker. I should have switched.

5. Non-Joker. I should have switched.

6. Non-Joker. I should have switched.

7. Non-Joker. I should have switched.

8. Non-Joker. I should have switched.

9. Non-Joker. I should have switched.

10. Non-Joker. I should have switched.

Ok, that was quick to do. If it is supposed to be 50:50, how do you explain the above results? Try it yourself Doc. It took me about 2 minutes.

You have a hypothesis which does not match up with reality. But you are clinging to that hypothesis. Does a good scientist stick to a hypothesis if it does not match up with reality, or do they modify their hypothesis?

If you think I am mistaken and that my results were highly unusual, do the test. In fact, anyone else who is reading this, I encourage them to do the test too. It is extremely quick to do.

Naturally, with a larger sample size, the results will converge on a 1/10 chance of the first pick being the Joker.
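For anyone who wants a bigger tally than my 10 hand-dealt rounds, here is a quick Python version of the card procedure described above (my own sketch, with card 0 standing in for the Joker):

import random

def one_deal(cards=10):
    deck = list(range(cards))                 # card 0 is the Joker
    random.shuffle(deck)
    pick = deck.pop(random.randrange(cards))  # my blind first pick
    # Throw out non-Jokers from the rest until a single card is left.
    survivor = 0 if 0 in deck else random.choice(deck)
    return pick == 0, survivor == 0           # (stay wins, switch wins)

trials = 100_000
stay = switch = 0
for _ in range(trials):
    s, w = one_deal()
    stay += s
    switch += w
print("stay wins:  ", stay / trials)     # converges on about 0.1
print("switch wins:", switch / trials)   # converges on about 0.9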

Let us have the flexibility to undo lodged beliefs so we can see things as they really are. This Monty problem is counter-intuitive, so I can understand why it is hard to see.
 
The_Doc_Man

Founding Member
#14
There is a difference between clinging to a false hypothesis and claiming not to understand why a particular process appears to violate the rules of probability; specifically, the "switch/stay" decision and what appears to be an asymmetric redistribution of probability. I know the answer to the strategy, having referenced the MythBusters episode more than once. I am still trying to understand WHY probability is unevenly redistributed, and that is my real issue.
 

Jon

Administrator
Staff member
#15
I found that by physically doing it with 10 cards, pruning down to just 2 cards, it becomes obvious why the odds so strongly favour switching. It stays very fuzzy when dealing with just 3 cards, but the physical act brings clarity. I recommend you try it, should you feel that way inclined.
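And for the WHY you are after, maybe writing out the complete case list helps. A small Python sketch, assuming the standard rules (the host always opens a goat door you didn't pick):

# You pick door 1. The car is equally likely to be behind door 1, 2 or 3.
# Case car = 1: the host may open 2 or 3; staying wins, switching loses.
# Case car = 2: the host MUST open 3;     staying loses, switching wins.
# Case car = 3: the host MUST open 2;     staying loses, switching wins.
cases = [1, 2, 3]
stay = sum(1 for car in cases if car == 1) / len(cases)     # 1/3
switch = sum(1 for car in cases if car != 1) / len(cases)   # 2/3
print(stay, switch)

Your door stays at 1/3 because the reveal tells you nothing new about it: the host was always going to open a goat door from the other two, whether your door hides the car or not. So the revealed door's 1/3 has nowhere to go except the other unopened door, and the redistribution ends up asymmetric.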

I've never watched MythBusters though.
 
The_Doc_Man

Founding Member
#16
Jon, as stated before, I understand the result. I know it has been shown. The problem I don't understand is how the redistribution of probabilities appears to be asymmetrical when there is no obvious reason for it. The fact that it is better to switch has been demonstrated. But I am seeking a deeper understanding of WHY.
 