1

I have the following problem where I have difficulties grasping the intuition:

Let's say we have three boxes, two of them empty and one containing a gold prize. We randomly select one of the boxes. After our selection, we are told which one of the remaining two boxes does not contain the prize. Now the question is: should I stick with my original selection, or switch to the other remaining box? What are the probabilities?

I tested this problem empirically by writing a computer program that repeats the experiment 1,000,000 times, first always staying with the original choice and then always switching. I got the probabilities:

$$P(\text{gold prize with original selection})\approx 33\%$$ $$P(\text{gold prize with switched selection})\approx 66\%$$
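For reference, such a simulation can be sketched as follows (this is my own minimal reconstruction; the box indexing and function name are illustrative, not the original program):

```python
import random

def monty_hall(trials, switch):
    """Simulate the game `trials` times; return the win frequency."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)   # box hiding the prize
        choice = random.randrange(3)  # our initial pick
        # We are shown an empty box that is neither our pick nor the prize.
        opened = next(b for b in range(3) if b != prize and b != choice)
        if switch:
            # Switch to the one remaining unopened box.
            choice = next(b for b in range(3) if b != choice and b != opened)
        wins += (choice == prize)
    return wins / trials

print(monty_hall(1_000_000, switch=False))  # ≈ 0.33
print(monty_hall(1_000_000, switch=True))   # ≈ 0.67
```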

Intuitively the probabilities seem at first to be 50% for both choices, but apparently that is not the case. I can't grasp why.

P.S. please let me know if my question is unclear

jjepsuomi
    http://en.wikipedia.org/wiki/Monty_Hall_problem – Martingalo Feb 10 '15 at 10:59
  • This problem has come to be known as the Monty Hall Problem after a famous American TV game show host. – Matthew Leingang Feb 10 '15 at 11:00
  • Can you rephrase more clearly? Because it is clear that, whatever the state of the box I chose first, at least one of the other boxes does not contain the prize... So there is no new information, unless I am told which one does not contain the prize. – Martigan Feb 10 '15 at 11:00
  • See also http://math.stackexchange.com/questions/1010434/monty-hall-problem – Matthew Leingang Feb 10 '15 at 11:00
  • The simplest proof seems to be to turn it over. What is the probability that after switching you do not get the prize? That would happen if you had chosen the box with the prize, you got shown one of the boxes without the prize, and you switched to the other box without the prize. So the probability of not getting it after switching is the same as the probability of getting it before switching, i.e., 1/3. – Matthew Leingang Feb 10 '15 at 11:03
  • @Martigan I rephrased :) is it better? – jjepsuomi Feb 10 '15 at 11:04

3 Answers

2

This is the "Monty Hall problem" if you want to look around for more references. Instead of typing out a solution on my cell, I'll just share this lecture. He does it formally the same way that I like to.

Lecture 6: Monty Hall, Simpson's Paradox | Statis…: http://youtu.be/fDcjhAKuhqQ

For an intuitive approach to the problem, consider that at the beginning of the game there's a $\frac{2}{3}$ chance you picked the wrong box. So when a wrong box is eliminated, there's still a $\frac{2}{3}$ chance you're sitting on a wrong box and hence a $\frac{2}{3}$ chance you'll get the right one by switching.

1

This is a very famous problem. For it to really make sense, you need to think of it a little differently. Ask yourself: when do I want to switch? When is it a good idea to switch? When I have an empty box, of course (I obviously don't want to switch if I have the prize). Now, how often do I have an empty box? What is the probability that I have an empty box? That would be $2/3$. So, $2/3$ of the time, it is a good idea to switch!

Dunka
1

If you stick to your choice then you win if your original choice was correct. Probability $\frac13$.

If you switch then you win if your original choice was wrong. Probability $\frac23$.
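These two cases can be checked by brute-force enumeration over every (prize location, initial pick) pair, a small sketch of my own: staying wins exactly when the pick equals the prize, switching wins exactly when it does not.

```python
from itertools import product

# All 9 equally likely (prize, pick) combinations.
pairs = list(product(range(3), repeat=2))

stay_wins = sum(pick == prize for prize, pick in pairs)    # original choice correct
switch_wins = sum(pick != prize for prize, pick in pairs)  # original choice wrong

print(stay_wins, switch_wins)  # 3 6  -> probabilities 3/9 = 1/3 and 6/9 = 2/3
```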

drhab