
If I offered to give you \$10 but steal \$5 from you, you should take that deal. You should take it once, and again, and again. You will always gain money no matter how many times you take it, and the ratio of money gained to money lost is constant.
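Concretely, after $n$ deals,

$$\text{gained} = 10n, \qquad \text{lost} = 5n, \qquad \text{net} = 10n - 5n = 5n, \qquad \frac{\text{gained}}{\text{lost}} = 2.$$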

But what if you took the deal an infinite number of times? You will have both gained and lost an infinite amount of money. It's not as if you can have gained $10 \cdot \infty$ and lost $5 \cdot \infty$. The infinities have to cancel out, right?

But then, at each step, the ratio of money gained to money lost is unchanged, yet after an infinite amount of time it has changed. That's clearly impossible, so what actually happens, then?

What am I thinking about wrong here?

Kcris
  • how familiar are you with summation of infinite series, and our ability to rearrange terms to get the same sum? – Calvin Lin Dec 11 '20 at 15:37
  • You cannot actually take the deal infinitely many times but you can consider the limit as the number of times you take it goes to infinity. – fes Dec 11 '20 at 15:45

2 Answers


This is the Ross-Littlewood paradox. The resolution I favor is the third one in the Wikipedia article: if I take the deal an infinite number of times, the amount of money I have remaining depends on the details of exactly which $5$ dollars you choose to steal from me at each step. This is surprising because, if I only take the deal finitely many times, it doesn't matter. Depending on exactly which dollars you choose to steal from me, I can end up with any finite amount of money left (including zero) or an infinite amount of money left.

Edit: Among other things, this means calculating the ratio of money gained to money lost is not a good way to understand what actually happens after taking the deal an infinite number of times.

Here's an example. Let's say all the dollar bills involved have serial numbers, starting $1, 2, 3, \dots$, and

  • whenever you give me dollar bills, you give me the ones with the smallest serial numbers you haven't given me yet, and also
  • whenever you take dollar bills, you take from me the ones with the smallest serial numbers.

Then it's not hard to see that you eventually steal every dollar bill from me, which means at the end I end up with no money. This is despite the fact that at every finite step I end up with more and more money!
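Here's a minimal Python sketch of that bookkeeping (the `held_bills` helper and its `steal` policy names are just for illustration). With the "smallest" policy, the bills I hold after $n$ rounds are exactly $5n+1$ through $10n$, so any fixed bill is eventually stolen even though the pile keeps growing:

```python
def held_bills(n, steal="smallest"):
    """Simulate n rounds of the deal: each round I receive the 10
    lowest-numbered bills not yet handed out, then 5 bills are stolen
    according to the chosen policy ("smallest" or "largest")."""
    held, given = [], 0
    for _ in range(n):
        held.extend(range(given + 1, given + 11))  # receive bills given+1 .. given+10
        given += 10
        held.sort()
        if steal == "smallest":
            held = held[5:]   # steal the 5 smallest serial numbers
        else:
            held = held[:-5]  # steal the 5 largest serial numbers
    return held

for n in (1, 2, 3):
    print(n, held_bills(n))  # after n rounds I hold exactly 5n+1 .. 10n
```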

On the other hand, suppose I changed the second bullet point to "whenever you take dollar bills, you take from me the ones with the largest serial numbers." Then it's not hard to see that I end up with infinitely many dollars. I'll leave it as an exercise to figure out how to modify this so that I end up with any particular finite amount of dollars.
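Reusing the `held_bills` sketch above with the "largest" policy makes the survivors visible: for each $m \ge 0$, bills $10m+1$ through $10m+5$ are never stolen once the round that delivers them ends.

```python
print(held_bills(3, steal="largest"))
# -> [1, 2, 3, 4, 5, 11, 12, 13, 14, 15, 21, 22, 23, 24, 25]
```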

(This is not just a mathematical curiosity either; it can be turned into an example where a limit of integrals is not an integral of (pointwise) limits, which explains why the hypotheses of the dominated convergence theorem are necessary.)
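One concrete instance, under the "smallest serial numbers" policy above: on $\mathbb{N}$ with counting measure $\mu$, let $f_n = \mathbf{1}_{\{5n+1, \dots, 10n\}}$ be the indicator of the bills held after $n$ steps. Then

$$\lim_{n \to \infty} \int f_n \, d\mu = \lim_{n \to \infty} 5n = \infty \quad \text{but} \quad \int \lim_{n \to \infty} f_n \, d\mu = \int 0 \, d\mu = 0,$$

and indeed no integrable function dominates all the $f_n$.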

Qiaochu Yuan

The ratio actually stays $2$. This can be shown via L'Hôpital's rule. Letting $x$ be the number of "deals":$$\lim_{x \rightarrow \infty}\frac{10x}{5x} \stackrel{\text{L'H}}{=} \frac{10}{5}=2 $$ Applying L'Hôpital's rule here means taking the derivative of the numerator and of the denominator, which yields the ratio $2$.
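As a quick symbolic sanity check of this limit (a sketch using the third-party sympy library, not part of the argument above):

```python
import sympy as sp

x = sp.symbols('x', positive=True)
ratio = 10 * x / (5 * x)          # gained/lost after x deals; simplifies to 2
print(sp.limit(ratio, x, sp.oo))  # -> 2
```

Note that $\frac{10x}{5x}$ already equals $2$ for every $x > 0$, so the limit is $2$ even without appealing to L'Hôpital.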

hitt
  • It is really not at all clear that this limit is the correct way to understand "the ratio of the money gained to money lost" if both quantities are infinite. There is an implicit sort of continuity assumption being made here and variants of that continuity assumption are just false in this setting; see my answer. – Qiaochu Yuan Dec 11 '20 at 17:57