6

I ran across an apparent paradox, which I then found stated in the paper The Box Problem: To Switch or Not to Switch as follows:

Imagine that you are shown two identical boxes. You know that one of them contains \$b and the other \$2b. Picking one at random and opening it, you must decide whether to keep it (and its contents), or exchange it for the other box.

In short, when you find \$x in a box, the expected value of the other box is .5*(.5x) + .5*(2x) = 1.25x, meaning it's always better to switch. This appears to violate the symmetry of the problem and the fact that you still know nothing meaningful about either box.

The paper goes another direction with it, talking about how having prior knowledge of expected values gives a more meaningful analysis (along with other discussions). However, what if there's no prior knowledge, and we have the original problem as stated. Can anyone give me some intuition to make sense of this?
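To make the symmetry concrete, here is a quick simulation sketch (assuming a fixed b, say \$100, purely for illustration): it compares the always-switch and never-switch strategies on the problem exactly as stated.

```python
import random

def play(switch, b=100, trials=200_000):
    """Average payout: one box holds b dollars, the other 2b."""
    total = 0
    for _ in range(trials):
        boxes = [b, 2 * b]
        random.shuffle(boxes)
        picked, other = boxes
        total += other if switch else picked
    return total / trials

# Both strategies average 1.5b, so always switching gains nothing,
# despite the 1.25x expectation argument above.
print(play(switch=False))  # ≈ 150.0
print(play(switch=True))   # ≈ 150.0
```

Both averages come out the same, which is what the symmetry argument predicts and what the 1.25x computation seems to contradict.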

EDIT: Someone found this question, which asks a slightly different formulation of an identical problem. The accepted answer there just points to a paper, and I'm having difficulty understanding it. The paper explains away the paradox by noting that the expectation is based on an infinite sum whose value depends on the order in which it is evaluated. I'm not familiar with how the order of a sum can change its value, and I also don't see how a different way of evaluating the expectation explains the strange result outlined above. My math understanding comes mainly from reading textbooks as a hobby, and I haven't yet worked up to fully understanding academic math papers, so a simpler explanation would be helpful.

  • 1
    See http://en.wikipedia.org/wiki/Envelope_paradox (a closely related problem in which you don't open the first box before deciding whether to switch). – joriki Jul 05 '13 at 06:28
  • 1
    The accepted answer there just points to a paper. After reading the paper, I still have no intuition here. I apologize, but my math knowledge is that of a hobbyist, and I would still appreciate a simple explanation. – Cannoliopsida Jul 05 '13 at 06:39
  • That doesn't change the fact that this is a duplicate of that question. To make it a new question, you could edit it to point out what you're missing in the answer to that question -- as specifically as possible, but if the problem is that you don't understand that answer well enough to be able to describe what you're missing in it, you could just ask for an explanation in simpler terms -- in that case it would help if you describe as specifically as possible what sort of level of explanation you're looking for. – joriki Jul 05 '13 at 06:50
  • And not a free article either. – Git Gud Jul 05 '13 at 07:08
  • 3
    Edited to note that I have now seen this duplicate question and how I don't follow the paper given as an answer. Thanks for the pointers! – Cannoliopsida Jul 05 '13 at 18:05
  • What's wrong with the (not accepted, but more highly voted) other answer to the duplicate question? – Peter Taylor Jul 11 '13 at 09:49
  • I'm not totally sure how that answer is relevant (as in, I'm clueless, not that there's necessarily anything wrong with the answer). I'm guessing that it's talking about the values of $b, but even if you pick some distribution, you still have the paradox, since the paradox isn't using the distribution. – Cannoliopsida Jul 11 '13 at 20:42

2 Answers

4

You asked for "intuition to make sense of this", and that's what I hope to provide. I claim that there is no paradox and that, despite the symmetry of the situation, it IS better to switch.

Suppose you walk into a not-for-profit casino where there is no house advantage and place a double-or-nothing bet for \$x. There is a 0.5 probability that you will win and receive \$2x (your stake plus \$x winnings), and a 0.5 probability that you will lose and walk away empty-handed. The expected winnings are 0.5*\$2x + 0.5*\$0 = \$x. Thus the game is perfectly 'fair' and there is no statistical indication for whether or not you should play. It's just a matter of how lucky you feel...

However, what if the casino offered to give you back half your winnings if you lose? Then you'd be crazy not to play. You'd have a 0.5 probability of winning and getting \$2x and a 0.5 probability of losing and getting back only \$x/2. The expected winnings now are 0.5*\$2x + 0.5*\$x/2 = \$1.25x. It's just like the problem you describe.

Hopefully one can now intuit that a double-or-halve bet is better than a double-or-nothing; since double-or-nothing is a fair bet, the double-or-halve bet is clearly stacked in your favour. Hence you should take it if offered, and that's why it is always good to swap the boxes in the question as originally posed.
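The casino argument above is easy to check numerically. A minimal sketch (assuming a fixed stake, say \$100):

```python
import random

def double_or_halve(x=100.0, trials=200_000):
    """Average return of a bet paying 2x on a win, x/2 on a loss."""
    total = 0.0
    for _ in range(trials):
        total += 2 * x if random.random() < 0.5 else x / 2
    return total / trials

print(double_or_halve())  # ≈ 125.0, i.e. 1.25x, matching the calculation above
```

For a fixed x, the double-or-halve bet really does return 1.25x on average, so the expectation argument is sound in this setting.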

This explanation, of course, is only valid for a constant value of x so cannot be applied to the original formulation of the 'Two Envelopes' question, a discussion of which is beyond the scope of my evening. I added this disclaimer because I'm wary of being accused of providing an irrelevant 'lay perspective'.

1

This page may shed some light:

https://en.wikipedia.org/wiki/Two_envelopes_problem

– alian
  • But 50% x/2 + 50% 2x = 1.25x . – Cannoliopsida Jul 12 '13 at 20:05
  • Sure, but that statement is unrelated to the problem. It may be an attempt at confusing or creating an illusion of an apparent paradox. – alian Jul 12 '13 at 23:49
  • What do you mean it's unrelated? You said in your answer that the expected money is 50% x/2 and 50% 2x. This equals 1.25x. – Cannoliopsida Jul 13 '13 at 00:34
  • Sorry to be obtuse, but why can't you simply add the multiplied values? I get that they're exclusive, but that's normally exactly the condition where you would multiply and add. Are you saying that for some reason it's impossible to calculate the expected value in terms of x? Why? – Cannoliopsida Jul 14 '13 at 05:12
  • An expected value is, vaguely, what the outcome per event would be if we carried out the experiment an infinite number of times. So with an infinite number of boxes, each randomly holding x or 2x dollars, you would expect 1.5x per box. Also, in your equation 2x/(x/2) = 4, so this is giving the wrong ratio of money in the boxes. Money in one box should be twice as much according to the problem, not 4 times as much. With p = 50% the expected outcome is .5*2x + .5*x = 1.5x after one trial. – alian Jul 14 '13 at 17:34
  • I think you're explaining the expected value over a long period of time where x is known? The question is about where x is unknown and whether you should switch after looking in a box. This is why my math differs: if you see x, you don't know if the other box has x/2 or 2x. – Cannoliopsida Jul 15 '13 at 04:57
  • Considering a uniform distribution, and that in reality money is limited, there will always be a limit to the max $ per box. So you will be able to make predictions about the maximum after a number of trials, and then based on that decide which box to pick subsequently for higher gain. So the question may be split in two: is there a limited amount of money available to draw from (reality), or infinite money available? – alian Jul 15 '13 at 16:27
  • Given that this is an abstract math problem, I think it's a fair interpretation that there is no largest number and that x could be any value, but I don't think the particular value of x is what matters here (for simplicity, let's assume that x is finite). And again, the point is not to find patterns after a number of trials. The question is about the apparent paradox after a single drawing and wondering whether to switch. – Cannoliopsida Jul 15 '13 at 17:31
  • The first problem is that 50% x/2 + 50% 2x is wrong! It gives ratio of money 4:1! The ratio of $ in the two boxes is always 2:1. The second problem is that this question is not clear and precise as I indicated earlier. I don't see any paradox. – alian Jul 15 '13 at 18:51
  • You misunderstand the problem. If you open a box and see $x, you know that the other box has either $2x or $.5x. This is because you don't know if you opened the bigger or smaller box to begin with. Thus, there actually is a 4:1 ratio between the two possibilities.

    I'm aware that there probably doesn't exist a paradox, I just don't see how the expected value of switching is always greater than what you have. This isn't like Monty Hall where you've gained some information.

    – Cannoliopsida Jul 16 '13 at 16:34
  • There are only two possibilities after you've opened the first box and now have x dollars. First possibility is that you find 2x. In this case 2x/1x = 2:1 everything is good. Second possibility is that the second box has .5x, x/.5x=2:1 again the same. Those two events are EXCLUSIVE they cannot happen simultaneously so you cannot find both twice the money and half the money to get 4:1 ratio. Give me an example with a fixed dollar amount that the ratio is 4:1? – alian Jul 17 '13 at 00:34
  • It's not that the other box is 4 times what you have. The difference between the two possible values is 4:1. Let's say you open a box and find $10. This means the other box either contains $5 or it contains $20. $20 is 4 times $5. As there is .5 probability of either case, the expected value is .5 * $20 + .5 * $5 = $12.50. My question is essentially for an intuition of why you always expect the other box to be greater. – Cannoliopsida Jul 17 '13 at 00:57
  • I fully understand that the events are exclusive. That's exactly why I calculate the expectation as the sum of values times probabilities. There's 50% of $5 and 50% of $20 and 0% of both $5 and $20. – Cannoliopsida Jul 17 '13 at 00:58
  • Can you precisely define what you call the expected value? – alian Jul 17 '13 at 03:37
  • You are confusing the definition of a trial. It consists of opening two boxes. What you call a trial is having already opened one box and then asking for the expected value of the second box only. Your trial consists of only the second box, so you are calculating the expectation value of the second box only. Is there a link to the original problem as defined? – alian Jul 17 '13 at 03:50
  • I'm using the standard definition of expected value as "a weighted average of all possible values" (http://en.wikipedia.org/wiki/Expected_value). The original problem is linked in the question (http://www.jstor.org/stable/2691373?seq=1) and if that link isn't available (JSTOR is quirky based on what network you're on), google "The Box Problem: To Switch or Not to Switch". I'm not talking about infinite trials; this is a probability problem, not a statistics problem. And yes, I'm considering the conditional expectation of the second box, given the first box being open. – Cannoliopsida Jul 17 '13 at 04:40
  • It is explained in the paper why it seems that, based on the expectation value, you would choose the second box. The issue is that the probability distribution is not stated, so you cannot assume a 50% chance. A consequence of this is also an asymmetry giving more probability weight to the higher value, because x/2 is only x/2 away from x, but 2x is a full x away from x; hence you get 1.25x for the expected value. I don't see anything outstanding. The rest of the paper then goes into what happens when different probability distributions are assumed. http://econ.as.nyu.edu/docs/IO/9393/RR90-26.pdf – alian Jul 17 '13 at 06:30
  • So if we modify the question to assume a uniform distribution, we can then assume a 50% chance? It seems like that brings the paradox back in. – Cannoliopsida Jul 17 '13 at 16:34
  • Yes, then the apparent paradox is that from the expectation value you would guess that choosing the second box is better. Think about what happens if instead you either get x/2 or 3x/2: the expectation value is just x. How about x/2 or 5x/4? Then you expect .875x, and now it's better to choose the first box, even though the probability distribution is the same? The expectation value does not necessarily indicate the way to a better game. – alian Jul 17 '13 at 17:34
  • I don't follow your examples. If you found .5x, the other box would have .25x or 1x, and .5*.25x + .5*1x = .625x. Then we note that .625x > .5x, so the other box still appears better. Multiplying by a scalar changes nothing. I don't think you're thinking things through very well. Sure, you have a lower expectation on the second box, but that corresponds to an even lower value in the first box (and since these are in terms of an unknown, nothing is meaningfully different). Finally, that's exactly what expectation means in the context of a game. Are you familiar with expectation in probability? – Cannoliopsida Jul 18 '13 at 05:21
  • No, I meant what if you changed the game so the second box holds either half or 3/2, etc... and I probably do, don't you think? – alian Jul 18 '13 at 06:07
  • Ok given that, I see what you're getting at. I guess the crux of the issue then is why is expectation suddenly meaningless? In the couple probability and game theory texts I've read, expectation was always exactly the way to, well, predict what value you'd expect. It seems like the expectation is no longer doing that. – Cannoliopsida Jul 18 '13 at 15:41
  • It's not the case, the line of reasoning is wrong for the expectation. Look here: https://en.wikipedia.org/wiki/Two_envelopes_problem which explains exactly what I've been trying to tell you. The issue is you are mixing different instances of the variable in the expected value equation. Also: http://en.wikipedia.org/wiki/Expected_value – alian Jul 18 '13 at 16:15
  • Your explanation is wrong as noted in that wikipedia page. Look at the 'alternative interpretation' section where examining the first envelope is considered: "the proposed 'common resolution' above breaks down, and another explanation is needed." However, that wikipedia page's alternative explanations work well. Since you found that page, I would accept an answer that pointed there. – Cannoliopsida Jul 18 '13 at 17:55
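As a footnote to this exchange: the point about needing a prior can be checked with a small simulation (the prior here, uniform on {1, 2, 4, 8}, is purely an assumption for illustration). Once a prior is fixed, the 50/50 assumption for "other box is larger" fails at the edges of the support.

```python
import random
from collections import defaultdict

def prob_other_is_larger(prior=(1, 2, 4, 8), trials=200_000, seed=0):
    """For each observed amount, estimate P(other box holds double)."""
    rng = random.Random(seed)
    counts = defaultdict(lambda: [0, 0])  # observed -> [other was 2x, total]
    for _ in range(trials):
        b = rng.choice(prior)
        boxes = [b, 2 * b]
        rng.shuffle(boxes)
        seen, other = boxes
        counts[seen][0] += other == 2 * seen
        counts[seen][1] += 1
    return {seen: won / total for seen, (won, total) in counts.items()}

probs = prob_other_is_larger()
# Seeing the smallest possible amount (1) means the other box is certainly
# larger; seeing the largest (16) means it is certainly smaller.
print(probs[1], probs[16])  # 1.0 0.0
print(round(probs[4], 1))   # 0.5
```

In the interior of the support the conditional probability really is about 1/2, but at the extremes it is 1 or 0, which is why the blanket "50% either way" step in the 1.25x argument cannot hold for every observed amount under any proper prior.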