
You have a random number generator that produces uniformly distributed real numbers between $0$ and $1$. Generate a number, then subtract a new number, then add a new number, then subtract a new number, and so on. On average, how many numbers must you generate before this alternating sum exceeds $1$?

(This is a variation of this question.)

My unsuccessful attempt to answer my question involves first changing the question so that the generator draws a uniformly distributed discrete number from $0.01, 0.02, 0.03, \ldots, 1.00$, and then taking the limit as the number of discrete values from $0$ to $1$ approaches infinity. But the calculation seems infeasible.
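The process itself is straightforward to simulate directly. Here is a minimal Monte Carlo sketch (illustrative code, not part of my discretization attempt) that records how many draws each trial needs before the alternating sum first exceeds $1$:

```python
import random

def steps_to_exceed_one(rng: random.Random) -> int:
    """Run one trial: alternately add and subtract U(0,1) draws
    until the running sum first exceeds 1; return the draw count."""
    total = 0.0
    n = 0
    sign = 1  # +1 on odd draws (add), -1 on even draws (subtract)
    while total <= 1.0:
        total += sign * rng.random()
        sign = -sign
        n += 1
    return n

rng = random.Random(0)  # seeded for reproducibility
counts = [steps_to_exceed_one(rng) for _ in range(20)]
print(counts)
```

Every count is odd: the running sum can only cross $1$ immediately after an addition, additions occur at odd steps, and a single draw is strictly less than $1$, so each count is at least $3$. Beware that the trial lengths are heavy-tailed, so occasional runs can be very long.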

Dan
  • Does going more negative than $-1$ count as well, or only positive excursions? – Ross Millikan Sep 16 '21 at 13:17
  • Going more negative than -1 does not count (although that would be an interesting variation). – Dan Sep 16 '21 at 13:20
  • If $1$ is exceeded for the first time after number $n$ is added to the sum, then $n$ is necessarily odd, since this event can only occur when adding a number rather than subtracting one. So we can write $n = 2m+1$ for some $m\geq 1$. Adding a uniform number on $[0,1]$ and then subtracting one is the same as adding a random number from a tent distribution on $[-1,1]$. So the $n$ we seek occurs after adding together $m$ tent random variables on $[-1,1]$ followed by one uniform RV on $[0,1]$. – John Barber Sep 16 '21 at 15:35
  • After doing a little numerics and some reading, I suspect this average may not exist. Since 1D random walks are recurrent, any one walk will ultimately exceed 1 almost surely. However (see these notes), the probability of arriving at a given point after $n$ steps scales as $1/\sqrt{n}$, and so the average value of $n$ scales as $\sum_n n/\sqrt{n}$, which diverges. Furthermore, trying to estimate the average value of $n$ by repeatedly summing in Mathematica gives strange results that don't seem to converge. – John Barber Sep 16 '21 at 16:01
  • I did 100 trials of adding/subtracting random numbers on $[0,1]$ until the sum exceeded 1. These are the numbers of steps required: {5, 21, 1249, 49, 7, 11, 15, 11, 71, 55, 3, 93, 7, 3, 3, 187, 1253, 17, 3, 19, 7, 75, 15, 5, 17, 5, 27, 161, 29, 5, 365, 19, 39, 21, 55, 31, 3, 5, 19, 15, 3, 13679, 7, 395, 3, 19, 13, 3, 5, 29, 6459, 459, 89, 101, 17, 35, 13, 9, 7, 11, 9, 3, 19, 41, 273, 9, 9, 9, 3, 11, 3, 2379, 57, 8167, 3, 5, 25, 21, 7, 1069333, 9, 45, 51, 13, 3, 101, 11, 15, 43, 11, 13, 255, 5, 7, 7, 37, 59, 3, 83, 5} This has all the hallmarks of a heavy-tailed process without finite moments. – John Barber Sep 16 '21 at 16:07
  • If the answer to my question is infinity (or does not exist), which seems plausible after reading John Barber's comments, then the average number of random numbers required to exceed ANY specified positive number (no matter how small) is also infinity. For example, the average number of random numbers required to exceed 0.1 would be infinity. (This is because there is a chance that the alternating sum goes below -0.9 before exceeding 0.1, then subsequently the alternating sum in effect must exceed 1.) – Dan Sep 16 '21 at 22:23
  • @Dan I suspect (but cannot immediately prove) that what you say is true. Incidentally, the average number required to pass either $+1$ or $-1$ appears to be in the neighborhood of 11.4. – John Barber Sep 16 '21 at 23:59
  • Some more numerics I did strongly suggest that the probability that the sum exceeds 1 for the first time after $n$ steps scales as $n^{-3/2}$ for large $n$, so this is indeed a heavy-tailed distribution with no moments. This is interesting because it can be shown that for large $n$ the probability that the sum crosses $1$ at step $n$ without the restriction that this be the first time asymptotes to $P(n) = \sqrt{3/2\pi n}$. A completely different scaling! This shows how much more difficult this problem is than the original version. – John Barber Sep 17 '21 at 03:07
  • One can approximate the value $S_n$ of this alternating sum after $2n$ terms with a Brownian motion $X(n/6)$, using a functional Central Limit Theorem. Now the first time $T$ that a Brownian motion $X(t)$ hits a fixed value $\alpha > 0$ has a Levy distribution, with probability density that behaves indeed like $x^{-3/2}$ as $x \to \infty$. – Hans Engler Sep 18 '21 at 01:51
  • @HansEngler I assume the 6 arises here as the variance of the "tent distribution" $1 - |x|$? – John Barber Sep 20 '21 at 00:03
  • @JohnBarber Yes, $S_n$ is a sum of $n$ independent tent distributed rv.s. Each of these has $var = 1/6$, and therefore $var(S_n) = n/6$. – Hans Engler Sep 21 '21 at 15:29

0 Answers