Let $u_1$ be a random real number uniformly distributed between $0$ and $1$.
Let $u_k$ be a random real number uniformly distributed between $0$ and $a\times{u_{k-1}}$, for $k>1$, where $a$ is a fixed positive real number.
What is the expected minimum value of $n$ such that $\sum\limits_{k=1}^n u_k>1$, in terms of $a$?
(In other words, on average how many $u$'s are needed for their sum to exceed $1$?)
I did simulations using Excel. For small values of $a$ (e.g. $4$), sometimes the sequence of $u$'s gets "stuck" around very low values, so that many $u$'s are required for their sum to exceed $1$. I suspect the expectation is infinity for low values of $a$, and possibly for all values of $a$.
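For reference, the simulation I ran can be sketched in Python (function and variable names are my own; the iteration cap is an arbitrary safeguard against runs that get "stuck"):

```python
import random

def count_terms(a, cap=100_000):
    """Draw u_1 ~ Uniform(0, 1), then u_{k+1} ~ Uniform(0, a * u_k),
    and count how many draws are needed for the sum to exceed 1.
    Returns None if the cap is hit (the sum stayed stuck below 1)."""
    total = 0.0
    upper = 1.0  # upper bound for the next draw; u_1 ~ Uniform(0, 1)
    n = 0
    while total <= 1.0 and n < cap:
        u = random.uniform(0.0, upper)
        total += u
        upper = a * u  # next draw is Uniform(0, a * u_k)
        n += 1
    return n if total > 1.0 else None

random.seed(1)
trials = [count_terms(a=4) for _ in range(5_000)]
finished = [n for n in trials if n is not None]
print(len(finished), sum(finished) / len(finished))  # trials completed, sample mean of n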
This is a variation of a simpler question: on average, how many uniformly random real numbers between $0$ and $1$ are needed for their sum to exceed $1$? The answer to that question is $e$. I tried to apply similar methods to my question, but to no avail.
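As a sanity check, the classic $e$ result is easy to confirm by simulation; a minimal Python sketch (the function name is mine):

```python
import random

def count_uniforms():
    """Number of independent Uniform(0, 1) draws needed for the sum to exceed 1."""
    total, n = 0.0, 0
    while total <= 1.0:
        total += random.random()
        n += 1
    return n

random.seed(0)
mean = sum(count_uniforms() for _ in range(100_000)) / 100_000
print(mean)  # close to e = 2.71828...
```

Note the contrast with my question: here the draws are i.i.d., whereas in my sequence each $u_k$ depends on $u_{k-1}$, which is what breaks the standard argument.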