This is an extremely important part of the problem that is always glossed over in explanations of the Monty Hall problem, and it bothers me that it's ignored, because it is crucial for determining the "correct" decision.
If the game show host is opening a door at random (i.e. it's possible that he opens the door with the car behind it OR the door you already picked) then the outcome is 50/50 whether you switch or not. But if the host knows what's behind the doors, and purposefully opens one of the remaining two with the goat, then you should switch doors to increase your odds to 66 percent.
On the above article this is only briefly addressed and not explained. In many tellings it is not addressed at all.
It’s also helpful (but not essential) to realize that the choice of which door to open has two parts: a rule and a decision. If the participant is following a switch-always or stay-always strategy, the decision part has no impact on winning.
The rule is that Monty can’t reveal the car. The decision only comes up if the participant picks the car first, in which case Monty has two goat doors to choose from. But with a stay strategy you’ll win for either choice, and with a switch strategy you’ll lose no matter what. So his decision doesn’t matter.
This makes it very easy to simulate. It might also lead people to understand the solution.
If you pick wrong first, you’ll see that staying always loses and switching always wins. The probability of picking right first is 1/3, and of picking wrong first 2/3. Since switching always works when you pick wrong, and you pick wrong 2/3 of the time, go with that.
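A minimal Python sketch of that simulation (the function name and trial count are arbitrary choices of mine):

```python
import random

def play(switch, trials=100_000):
    """Simulate the standard game: the host knows where the car is
    and always opens a goat door that isn't the player's pick."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        # When pick == car the host has two goat doors to choose from,
        # but (as noted above) that decision can't affect a fixed
        # stay/switch strategy, so taking the lowest valid door is fine.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials
```

Running `play(switch=True)` comes out near 2/3 and `play(switch=False)` near 1/3, matching the argument above.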
> If the game show host is opening a door at random (i.e. it's possible that he opens the door with the car behind it OR the door you already picked) then the outcome is 50/50 whether you switch or not.
I don't think this is true. Assuming you mean that if the host opens my box, I switch to one of the other two at random with equal probability (even if he opened my box and it was the car), and assuming that if the host doesn't open my box I stay or switch to the other unopened box with equal probability (even if the host opened the box with the car), I think the odds are 33% that I win the car and 66% that I don't.
You pick door #3. The host says they're feeling generous so he is going to give you a free door. He blindly reaches into a bag with three marbles labeled 1,2, and 3. He pulls out the marble labeled 2. He opens door 2 and there's a goat behind it. Do you now switch to door 1? The answer is that it doesn't matter, it's 50/50.
Right, in the world where the door he already opened was done randomly (i.e. it could have been a goat or it could have been a car, and it could have been the door you also chose), then if he happened to both (a) open one of the doors you didn't pick, and (b) he opened a goat door, then switching doesn't matter.
But that's only one branch of the probability tree, and if you investigate all of them (assuming he chooses a door at random equally, might open yours, and might open a car), then it's not 50/50, it's 33/66 I think.
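For what it's worth, a quick Monte Carlo sketch of that full tree, implementing the stay-or-switch-at-random strategy described a couple of comments up (names and trial count are my own):

```python
import random

def half_half_switcher(trials=200_000):
    """Host opens one of the three doors uniformly at random (possibly
    the pick, possibly the car). If he opened the pick, move to one of
    the other two doors at random; otherwise stay or switch to the
    other unopened door with equal probability."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        opened = random.randrange(3)       # may be the pick, may be the car
        others = [d for d in range(3) if d != pick]
        if opened == pick:
            final = random.choice(others)  # forced off the opened door
        else:
            unopened_other = next(d for d in others if d != opened)
            final = random.choice([pick, unopened_other])
        wins += (final == car)
    return wins / trials
```

Over all branches of the tree this lands near 1/3, i.e. the 33/66 claimed above, not 50/50.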
Perhaps you just weren't clear enough that you think the 50/50 is limited to only a subset of the possible outcomes in your scenario.
Sure it's only one branch of the probability tree of the game as a whole.
But the way the problem is frequently framed is this (I acknowledge that the OP does briefly make the important distinction though):
"The game has three doors, two with goats and one with a car. You pick one door. The host selects another for free and shows you a goat behind it. Do you want to switch doors?"
My point is that, given this scenario, to make an informed decision you need to ask the host: "Did you know where the goats and the car were? And did you open a goat door with certainty, or did you do so at random?"
The host's answer to this question then determines whether switching raises your odds from 33 to 66 percent or whether it doesn't matter.
What would "switching" even mean if the door that was opened was the door that you had picked and the car was there?
What would be the point of switching to the remaining door if the door that was opened was not the one you picked and the car was there? (The probability of getting the car would be zero either way.)
You're right that the subspace of scenarios where (a) and (b) happen is just part of the full space of things that could have happened - but it's the only part relevant for the problem. (We're explicitly told that (a) and (b) are true!)
That's the point though, no? The only way to justify the 50/50 claim made above that I can see is that the host must choose the door to open at random from among all 3 doors. So scenarios do arise where the host opens your door and it is the car, or that the host opens some other unchosen door and it is the car.
If you don't allow for those possibilities, then the host really isn't making their choice with equal probability.
> If you don't allow for those possibilities, then the host really isn't making their choice with equal probability.
I don't understand your point. The host can make the choice with equal probability, and all those possibilities were initially allowed. Once a choice has been made, some possibilities are obviously no longer allowed.
Suppose you’re on a game show, and you’re given the choice of three doors. Behind one door is a car, behind the others, goats. You pick a door, say #1, and the host, who knows what’s behind the doors, opens another door, say #3, which has a goat. [It's implicit that his knowledge is used to pick a goat with certainty.] He says to you, "Do you want to pick door #2?" Is it to your advantage to switch your choice of doors?
This is one possible 50/50 scenario:
You pick a door, say #1, and the host picks a number between 1 and 3 from a hat, say #3, and opens that door, which has a goat. He says to you, "Do you want to pick door #2?" Is it to your advantage to switch your choice of doors?
The answer in the first variant is "yes, switching from #1 to #2 doubles my probability of winning". The answer in the second variant is "it doesn't matter, the probability of winning is the same".
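A quick simulation sketch of both variants, conditioning on the situation actually described (a door other than yours was opened and it showed a goat); the function name and trial count are arbitrary:

```python
import random

def conditional_win_rate(host_knows, trials=300_000):
    """Win rate for a player who always switches, conditioned on the
    observed situation: the host opened a different door and it
    revealed a goat."""
    wins = games = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        if host_knows:
            # Host deliberately opens a non-pick, non-car door.
            opened = next(d for d in range(3) if d != pick and d != car)
        else:
            # Host draws any of the three doors from a hat.
            opened = random.randrange(3)
            if opened == pick or opened == car:
                continue  # this branch of the tree didn't happen
        games += 1
        final = next(d for d in range(3) if d != pick and d != opened)
        wins += (final == car)
    return wins / games
```

With `host_knows=True` the switcher wins about 2/3 of the time; with `host_knows=False` it comes out about 1/2, matching the two answers above.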
The scenarios that didn't happen are irrelevant. We are not being asked to commit to a strategy in advance. We are asked if we want to switch from door #1 to door #2 after we learn that door #3 was not the good one. That depends only on the scenarios that still remain possible and their probabilities.
A much simpler problem to stress that the question is whether we want to change conditional on what we know - not on all the things that were possible a priori:
You're given the choice between two envelopes. One contains $1 and the other $10. You open an envelope and find $1. Is it to your advantage to switch to the other envelope?
The answer is obviously yes. If you had found $10 the answer would obviously be no. But if you had to commit to switching or not before seeing your first choice, the answer would be "it doesn't change anything".
Your answer to the question is conditional on what has happened. You don't have to "allow for other possibilities" that could have happened but that you already know didn't.
The original situation covers all scenarios. Any choice of door you make and all possible doors the host might open. The odds are the same every time.
The proposed 50/50 scenario only makes sense for some possible situations - those where the host draws a door at random AND the door isn't the same one I chose AND the door opened isn't the car... once you put that situation on equal footing with the original one (all possible doors chosen and all possible doors opened) then it isn't 50/50 any more. Some of those situations are 100% wins (I chose door 3, host opens door 3, car is behind door 3) and some are 100% losses (I chose door 2, host opens door 1, car is behind door 1). Once you look at all outcomes, 50/50 doesn't apply. It becomes 33/66.
If your point is that you can cherry-pick a subset of outcomes and get 50/50 odds then I can't really argue, but I'm also afraid I don't see how that's useful.
> If your point is that you can cherry-pick a subset of outcomes and get 50/50 odds then I can't really argue,
Why do you say “cherry-pick a subset of outcomes”? It’s the subset of outcomes relevant for the question.
Consider the simple problem that I proposed: “You're given the choice between two envelopes. One contains $1 and the other $10. You open an envelope and find $1. Is it to your advantage to switch to the other envelope?”
Do you agree that the answer is “yes” or would you argue that “once you look at all outcomes” the 100% probability of getting $10 by switching doesn’t apply?
> but I'm also afraid I don't see how that's useful.
It’s useful to answer the following question:
“You pick door #1. The host picks a number between 1 and 3 from a hat and opens the corresponding door #3. There is a goat behind the door. He says to you, "Do you want to pick door #2?" Is it to your advantage to switch your choice of doors?”
"I claim the odds of flipping a fair coin 100 times in a row and getting heads every time is 50/50, but only in the situation where you happened to flip 99 heads in a row already".
Of course it is true, but again, not useful in my opinion.
What about the envelopes problem? Do you think that claiming you should switch the envelopes “but only in the situation where you happened to” find $1 in your first pick is not relevant? The question asked is precisely what to do in that particular situation! Knowing that if you switch you get more than if you don’t seems useful…
I’m also curious about your answer to the following question, really.
“You pick door #1. The host picks a number between 1 and 3 from a hat and opens the corresponding door #3. There is a goat behind the door. He says to you, "Do you want to pick door #2?" Is it to your advantage to switch your choice of doors?”
What do you think the probabilities would be if you were to find yourself in that situation? A hint: the probability that the car is behind the open door is zero. What about the others?
In your hypothetical situation, the rules are something like:
"Pick a door, then the host will randomly select one of the three doors. If the host doesn't select your door (he selects it with 1/3 odds) AND the host doesn't open a door with the car behind it (2/3 odds, given the first condition), then you can choose to switch or keep your door."
The odds of winning _after_ the host has already cleared both of those hurdles are indeed 50/50, but the odds over the game _overall_ are 33/66.
The only real way I can see of making this situation analogous to the Monty Hall problem is to consider the odds of the whole game (as the Monty Hall problem does: switching wins 66% of the time, every time, no matter the path the game takes from start to end). If you want to cherry-pick just the situation where the host ALREADY beat those odds, then you do get 50/50, but it really is a non sequitur in this thread (or in any thread about Monty Hall).
> The odds of winning _after_ the host already beats 1/3 and 2/3 odds is indeed 50/50, but the odds of playing the game _overall_ is 33/66.
The problem is not about the game overall, any more than the question about whether you'd like to switch the envelope after you got $1 is about the game overall. The question is whether you want to switch conditional on the situation where you got $1! That's not cherry-picking - it's considering the exact situation described in the problem.
> The only real way I can see of making this situation analogous to the Monty Hall problem is to consider the odds of the whole game
The Monty Hall problem is not about "the whole game". The Monty Hall problem is quite explicitly about what you would do in the following situation:
1) you were given the choice of three doors: one car and two goats
2) you picked a door
3) the host opened another door
4) there was a goat behind that door
After all those things have happened you're being offered to switch from your initial pick to the remaining door.
What is the probability - after all those things have happened - that the car is behind each of the doors?
If you don't agree that this describes the problem there is no point in reading further. We can just agree that we have very different understandings of what the problem is about.
-----
If you agree with the description above, the answer to the question depends on your assumptions about how the host's choice in step 3) was made.
There are many different assumptions that could be made.
If he was avoiding the door you picked and the door with the car - as the original formulation of the problem implies - the probabilities are 1/3 and 2/3 (for the door you picked and the remaining one, respectively).
If he was completely ignoring the location of the car when he made the choice the probabilities are 1/2 and 1/2 (it doesn't matter if he was avoiding your door or not).
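Both cases can be checked exactly by enumerating the probability tree with exact fractions; a sketch under those two assumptions (the `posterior` helper and its structure are my own):

```python
from fractions import Fraction

def posterior(host_knows):
    """Exact P(car behind pick), P(car behind remaining door),
    conditioned on: the host opened a third door and it showed a goat.
    The pick is door 0; by symmetry the label doesn't matter."""
    pick = 0
    stay = switch = total = Fraction(0)
    for car in range(3):
        for opened in range(3):
            if opened == pick or opened == car:
                continue  # these branches contradict what we observed
            if host_knows:
                # Host avoids the pick and the car; if the pick IS the
                # car he chooses between two goat doors at random.
                n_choices = 2 if car == pick else 1
                p = Fraction(1, 3) * Fraction(1, n_choices)
            else:
                # Host opens any of the three doors uniformly.
                p = Fraction(1, 3) * Fraction(1, 3)
            remaining = next(d for d in range(3) if d not in (pick, opened))
            total += p
            if car == pick:
                stay += p
            elif car == remaining:
                switch += p
    return stay / total, switch / total
```

`posterior(True)` gives exactly (1/3, 2/3) and `posterior(False)` gives exactly (1/2, 1/2), agreeing with the two cases above.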
> The problem is not about the game overall
> The Monty Hall problem is not about "the whole game".
The Monty Hall problem is about the whole game. That's the difference, and that's why the 50/50 claim doesn't make much sense here.
> The Monty Hall problem is quite explicitely about what would you do in the following situation: [...] After all those things have happened you're being offered to switch from your initial pick to the remaining door.
Steps (3) and (4) in the Monty Hall problem are a bit more nuanced - the host opens a door guaranteed to be a goat, because the host knows where the goat is.
At this point, when you choose to switch or stay with your door, you have a 1/3 chance of winning if you stay, and a 2/3 chance of winning if you switch.
But the key point that you are missing is that this is the same for every single time you play the entire game. It doesn't matter where the car is in step (1), or which door you choose in step (2), or which door the host opens in step (3) - if you always switch doors, no matter what, every single game you double your odds.
And in the 50/50 scenario posed, that just doesn't hold. It isn't analogous.
> But the key point that you are missing is that this is the same for every single time you play the entire game. It doesn't matter where the car is in step (1), or which door you choose in step (2), or which door the host opens in step (3) - if you always switch doors, no matter what, every single game you double your odds.
The key point that you are missing is that it doesn't need to be "the same for every single time you play the entire game" to have a well-defined solution for the particular realization of the game described in the problem statement.
And the solution for the problem where (1),(2),(3),(4) happen with the host picking a door at random is 50/50. That's what cman1444 said.
You're asked a question conditional on (1),(2),(3),(4) happening and the things that didn't happen are irrelevant.
Just like if you're asked if you want to switch the envelopes when you got $1 you will say "yes" and it's completely irrelevant that you could have got $10 and you would have said "no" in that case - because you didn't.
In conclusion, some people find the problem where (1),(2),(3),(4) have happened and the host was picking a door at random (and its 50/50 solution) interesting and you don't. De gustibus non est disputandum.
> "I claim the odds of flipping a fair coin 100 times in a row and getting heads every time is 50/50, but only in the situation where you happened to flip 99 heads in a row already".
If I'm told that I'm in a coin flipping contest and (1) in my first flip I got heads, (2) in my second flip I got heads, ..., (n) in my n-th flip I got heads, ..., (99) in my 99-th flip I got heads and then I'm asked what's the probability that I get to 100 heads in a row there are different answers that I could give.
I could assume that the probability of every single flip is 50/50 and then my answer will by 50%. Or I could give an answer higher than 50% if I'm able to control the outcome - or if by now I suspect that the coin has two heads. Or I could give an answer lower than 50% if I suspect the game is rigged and they are going to make me lose now.
Whatever my assumptions about the upcoming flip, my answer will be conditional on (1),(2),...,(99) having happened already. There is no point in thinking about the "whole game" at this point in time. I already got heads 99 times - just like in the Monty Hall problem I already picked a door and he already picked another door with a goat.
As far as solving the problem in an academic sense, your explanation is clearer. However, one of the things that makes the whole thing fascinating is that the actual players are not told this specific information - that the host knows. If you watch the show religiously it might dawn on you: hey, he must not be picking the reveal door randomly, because he has never accidentally revealed the car - but that's beyond most people's observation.
In practice, the revelation of one door is likely to make the player even more committed to the original choice -- the odds (seemingly, not actually) just went from 33% to 50%, momentum is good, I'm feeling lucky, no way am I going to switch now!
The "Big Deal" -- the game where the player chooses one of three doors -- was the finale of every episode of Let's Make a Deal under Monty Hall. Still is, in the Wayne Brady revival. Not hypothetical.
Not "rigging" the decision would be pointless in the game. Yes, it goes unmentioned, but if the host revealed the car, then what would be the point?
Where I live most cars have 4 doors, but since we are expected to pretend this is a well-posed problem we must assume there are literally 3 doors at the game host location.
Evidently the participant entered through one of them and already knows what's behind one door. If he saw the car during entry, the correct choice would be the door of entry. The host can open (and close, and open, ...) doors with goats all day long; the participant would know the car is where he just saw it.
If the participant did not see the car during entry it must be behind one of the remaining doors: either behind the back door, or behind the emergency exit door.
Clearly it would be highly illegal to obstruct the emergency exit door with a car, so under Standard Assumptions (see assumption 42, the last one in the code) we can assume the host did not park the car behind this emergency exit door. Especially since emergency exit doors open outwards.
Hence the car was parked behind the remaining back door, and since this follows from the Standard Assumptions we can conclude the participant made the same (by definition valid) assumption as described here, since it's a Standard Assumption, and assumption 1 of the Standard Assumptions is that in the absence of necessary knowledge, everyone makes the Standard Assumptions.
Given that the participant (using the standard assumptions) knew immediately which door hides the car and given that this door was his initial "guess", the correct solution (under Standard Assumptions) is to stay with the initial choice.
If they pick randomly, there's no benefit to switching. No harm either.
If it's random, 1/3 of the time you're right on your first guess and switching doesn't help. 2/3 of the time you're wrong - if the host knows and deliberately avoids opening the door with the car, switching means you win. If however he opens randomly, half of those cases show the car and you can't win - so the breakdown is: 1/3 switching wins, 1/3 switching fails, and 1/3 the car is shown and you lose regardless.
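That three-way breakdown is easy to check with a short simulation sketch (the host here draws at random from the two doors you didn't pick; names are my own):

```python
import random
from collections import Counter

def breakdown(trials=300_000):
    """Host opens one of the two non-picked doors at random; tally
    the three possible outcomes for a would-be switcher."""
    tally = Counter()
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        opened = random.choice([d for d in range(3) if d != pick])
        if opened == car:
            tally['car revealed, you lose'] += 1
        else:
            remaining = next(d for d in range(3) if d != pick and d != opened)
            tally['switching wins' if car == remaining else 'switching fails'] += 1
    return {k: v / trials for k, v in tally.items()}
```

Each of the three outcomes lands near 1/3, as described above.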
If the host picked randomly then there would be a 1/6 chance that the show would end rather anticlimactically and you would be indifferent in switching. Now this 1/6 becomes your edge turning your 1/3 to 1/2 after the goat is revealed.
1/3 not 1/6. There are six outcomes of you picking a door and the host opening one of the remaining doors. In two of them you pick a goat the first time and then the host reveals the car.
Also, does the goat come butchered or do you need to do it on stage?