
Let $u_1$ be a random real number uniformly distributed between $0$ and $1$.

Let $u_k$ be a random real number uniformly distributed between $0$ and $a\times{u_{k-1}}$, for $k>1$ and fixed positive real number $a$.

What is the expected minimum value of $n$ such that $\sum\limits_{k=1}^n u_k>1$, in terms of $a$?

(In other words, on average how many $u$'s are needed for their sum to exceed $1$?)

I did simulations using Excel. For small values of $a$ (e.g. $4$), sometimes the sequence of $u$'s gets "stuck" around very low values, so that many $u$'s are required for their sum to exceed $1$. I suspect the expectation is infinity for low values of $a$, and possibly for all values of $a$.

This is a variation of a simpler, well-known question: on average, how many uniformly random real numbers between $0$ and $1$ are needed for their sum to exceed $1$? The answer to that question is $e$. I tried to apply similar methods to my question, to no avail.
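The Excel simulation described above can be sketched in Python. This is my own minimal version (the function names and the cap on the number of terms are my additions; the cap is needed because for small $a$ a run can get "stuck" near $0$ and effectively never finish):

```python
import random

def terms_needed(a, max_terms=100_000):
    """One trial: count how many u's are needed for their sum to exceed 1.

    Returns None if the cap is reached, i.e. the u's got "stuck" near 0.
    """
    u = random.random()               # u_1 ~ Uniform(0, 1)
    total, n = u, 1
    while total <= 1:
        if n >= max_terms:
            return None
        u = random.uniform(0, a * u)  # u_k ~ Uniform(0, a * u_{k-1})
        total += u
        n += 1
    return n

def average_terms(a, trials=10_000):
    """Average n over the trials that finished, plus the count of stuck runs."""
    results = [terms_needed(a) for _ in range(trials)]
    finished = [r for r in results if r is not None]
    return sum(finished) / len(finished), trials - len(finished)
```

For example, `average_terms(4)` finishes most runs quickly but occasionally needs many terms, which matches the "stuck" behaviour I saw in Excel.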

Dan
  • @TonyK Thanks, I fixed it. – Dan Feb 19 '23 at 13:46
  • Did you notice any change of behaviour around $a=\mathrm e$? You're adding $\log a$ plus the logarithm of a variable uniformly drawn from $[0,1]$ in each iteration; the expectation of that logarithm is $-1$; so I'd expect the expected time to be finite for $\log a\gt1$ and infinite for $\log a\lt1$. – joriki Feb 19 '23 at 14:09
  • @joriki I didn't notice any sudden change of behavior around $a=e$, but I just used Excel, which is rather crude. – Dan Feb 19 '23 at 14:28
  • It wouldn't be "sudden" in the sense of being obvious from short runs – but if your runs are long enough, the $u_k$ should diverge to $\infty$ for $a\gt\mathrm e$ and to $0$ for $a\lt\mathrm e$. – joriki Feb 19 '23 at 14:33
  • @joriki: I'm doing some long runs (100 million trials, using double-precision arithmetic) for various values of $a$. There does seem to be a phase change somewhere around $a=e$, but it's very hard to pin down, because even with $a$ as high as $2.74$ we find trials that end up with $u_k=0$ (and so they never halt). So I don't think numerical experiments can get very far. – TonyK Feb 19 '23 at 16:24
  • @joriki "You're adding $\log a$ plus the logarithm of a variable uniformly drawn from $[0,1]$ in each iteration" Actually only $u_1$ is uniform in $[0,1]$; the others are uniform in $[0, a u_{k-1}]$. – leonbloy Feb 19 '23 at 21:03
  • @leonbloy: I'd implicitly transformed the problem to $u_k=x_{k-1}au_{k-1}$, and thus $u_k=u_1\prod_{i\lt k}(ax_i)$ and $\log u_k=\log u_1+\sum_{i\lt k}\left(\log a+\log x_i\right)$, with the $x_i$ uniformly drawn from $[0,1]$. – joriki Feb 19 '23 at 21:23
  • The expectation diverges for $a<e$ and converges for $a>e$. The case $a=e$ is quite interesting, and I only have a naïve guess that the expectation is still finite in this case (which is contrary to the fact that the return time of a simple random walk has infinite expectation). – Sangchul Lee Feb 22 '23 at 22:57
  • @SangchulLee I posted a question about the case with $a=e$ on MO. – Dan Feb 23 '23 at 13:09
  • @SangchulLee Please could you explain why the expectation diverges for $a<e$ and converges for $a>e$? At first I thought I understood why, but now I don't. I edited my MO question to explain my thoughts about this. Thanks. – Dan Feb 26 '23 at 06:34
  • @SangchulLee I saw your answer at the MO question; thank you! – Dan Feb 27 '23 at 06:17
  • @Dan, No problem! Also, now I am beginning to suspect that the expected number of terms when $a=e$ is infinite. I still have only half-baked ideas about this, but will definitely share some if I come up with more convincing arguments. – Sangchul Lee Feb 27 '23 at 06:22
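joriki's transformation in the comments above ($u_k = a\,x_k\,u_{k-1}$ with $x_k$ uniform on $[0,1]$, so $\log u_k$ is a random walk with per-step drift $\log a - 1$) can be checked numerically. A small sketch under that substitution (`log_u_drift` is my own helper name):

```python
import math
import random

def log_u_drift(a, steps=200_000, seed=1):
    """Empirical per-step drift of log(u_k) under u_k = a * x_k * u_{k-1},
    x_k ~ Uniform(0,1). Should approach log(a) - 1 for long runs."""
    rng = random.Random(seed)
    log_u = math.log(rng.random())    # log u_1
    start = log_u
    for _ in range(steps):
        log_u += math.log(a) + math.log(rng.random())
    return (log_u - start) / steps
```

The drift comes out positive for $a > e$ (so $u_k \to \infty$) and negative for $a < e$ (so $u_k \to 0$), consistent with the phase change around $a = e$ discussed above.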

1 Answer


Let $X_i$ be i.i.d. Uniform$[0,1]$, and $S_n=X_1+aX_2+\dots+a^{n-1}X_n$. Your question is: when is $\mathbb{E}\tau<\infty$, where $$ \tau=\min\{n: S_n\ge 1\}? $$ Claim: if $0\le a<1$, then $S=\lim_n S_n<1$ with positive probability; hence $\mathbb{P}(\tau=\infty)>0$ and $\mathbb{E}\tau=\infty$.

Proof: let (non-random) $N$ be so large that $a^{N}+a^{N+1}+a^{N+2}+\dots=\frac{a^N}{1-a}<1/2$; such an $N$ exists since $a<1$. Now, with some positive probability, $S_N<1/2$; and at the same time $$ S-S_N=a^{N} X_{N+1}+a^{N+1} X_{N+2}+\dots\le a^{N}+a^{N+1}+a^{N+2}+\dots<\frac12. $$ As a result, $S<1$ on this event.

Next, for $a\ge 1$ we have $\mathbb{E}\tau<\infty$, since $S_n$ is clearly monotone in $a$ and we know the answer for $a=1$. In fact, one can compute $\mathbb{E}\tau$ in the same spirit as is done for $a=1$: indeed, $$ \mathbb{E}\tau=1+\sum_{n=1}^\infty \mathbb{P}(\tau>n)=1+\sum_{n=1}^\infty \mathbb{P}(S_n<1). $$ The set of $(x_1,\dots,x_n)$ with $x_1+ax_2+\dots+a^{n-1}x_n<1$ and $x_i\ge 0$ is a simplex scaled by the factor $a^{-(k-1)}$ along coordinate $k=1,2,\dots,n$, hence its volume is $$ \frac{1}{n!}\cdot 1 \cdot \frac{1}{a} \cdot \frac{1}{a^2} \cdots \frac{1}{a^{n-1}} = \frac{a^{-n(n-1)/2}}{n!}, $$ so $$ \mathbb{E}\tau=1+\sum_{n=1}^\infty \frac{a^{-n(n-1)/2}}{n!}. $$
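The final series converges very fast for $a>1$, so it is easy to evaluate numerically. A quick sketch (the helper name `expected_tau` is mine); as a sanity check, $a=1$ recovers the classical answer $e$:

```python
import math

def expected_tau(a, tol=1e-15):
    """Evaluate E[tau] = 1 + sum_{n>=1} a^(-n(n-1)/2) / n!  (valid for a >= 1)."""
    total, n = 1.0, 1
    while True:
        term = a ** (-n * (n - 1) / 2) / math.factorial(n)
        total += term
        if term < tol:
            return total
        n += 1

# expected_tau(1.0) ≈ 2.718281828...  (the classical a = 1 answer, e)
```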

van der Wolf
  • I think we're talking about different questions. My question is to find $\mathbb{E}(\min\{n:X_1+aX_1X_2+a^2 X_1X_2X_3+\ldots+a^{n-1}X_1X_2X_3\cdots X_n>1\})$. – Dan May 30 '23 at 08:29
  • In this case, the answer is the following. $X_1\cdots X_n=e^{n T_n}$ where $T_n=\frac{1}{n}(\ln X_1 +\dots+\ln X_n)$. By the strong law, $T_n\to \mathbb{E}\ln X_1=-1$ a.s. So the $n$th term of the sum behaves like $\gamma^n$ where $\gamma=a e^{-1}$. As a result, for $a<e$, with a positive probability the sum never exceeds $1$, while in the other case it reaches $1$ quite quickly. In the first case $\mathbb{E}\tau=\infty$, but it's $<\infty$ in the second case. – van der Wolf May 30 '23 at 09:52