
I have a sequence $\epsilon_i$ defined recursively for $i\ge 1$ as follows: \begin{eqnarray*} \epsilon_1 &=& \frac{1}{p}\\ \epsilon_n &=& \frac{1}{1-(1-p)^n}\left( 1 + \sum_{j=1}^{n-1} \binom{n}{j}(1-p)^{j} p^{n-j}\epsilon_{j}\right) \end{eqnarray*} with $p \in (0,1)$. I want to prove that $\epsilon_i$ grows asymptotically like $\log i$, but all my attempts have failed, and I am unable to find a closed form for the sequence. Plotting $\epsilon_i$ for $i$ from 1 to 50 strongly suggests that the claim is true. Any hint on how to attack this problem is welcome. In fact, I only need the case $p=0.5$.

This sequence came up while I was computing the expected value of a random variable.
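For anyone who wants to reproduce the plot, here is a minimal Python sketch of the recursion exactly as stated above (the function name `eps` and the comparison against $\log_2 n$ for $p=0.5$ are illustrative choices, not part of the problem):

```python
from math import comb, log

def eps(n_max, p):
    """eps[1..n_max] computed directly from the recursion."""
    q = 1 - p
    e = [0.0] * (n_max + 1)          # index 0 unused
    e[1] = 1 / p
    for n in range(2, n_max + 1):
        s = sum(comb(n, j) * q**j * p**(n - j) * e[j] for j in range(1, n))
        e[n] = (1 + s) / (1 - q**n)
    return e

e = eps(50, 0.5)
for n in (2, 10, 50):
    # for p = 0.5 the difference from log_2(n) should stay bounded
    print(n, e[n], e[n] - log(n, 2))
```

The printed differences staying bounded is consistent with the conjectured $\log i$ growth.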

  • Well, for $p = 0.5$, this simplifies somewhat to \begin{eqnarray*} \epsilon_1 &=& 2\\ \epsilon_n &=& \frac{1}{1-2^{-n}}\left( 1 + \sum_{j=1}^{n-1} \binom{n}{j}2^{-n}\epsilon_{j}\right) \end{eqnarray*} but I assume you already know that. – Arthur Jan 10 '14 at 12:58
  • With $p=\frac12$, if you set $\epsilon_0=0$ then the recursion becomes simply $\epsilon_n = 1 + 2^{-n} \sum_{j=0}^n \binom nj \epsilon_j$. If you further set $E(x) = \sum_{n=1}^\infty \frac{\epsilon_n}{n!}x^n$, then the recursion becomes the functional equation $E(x) = e^x - 1 + e^{x/2} E(\frac x2)$ (see the numerical check after these comments). Don't know where to go from there.... – Greg Martin Jan 10 '14 at 20:59
  • You might want to compare $\epsilon_n$ to the $n$th harmonic number rather than to $\log n$; it might fit the discrete nature of the problem better. – Greg Martin Jan 10 '14 at 21:06
  • See also this related question: http://math.stackexchange.com/questions/26167/expectation-of-the-maximum-of-iid-geometric-random-variables/26214#26214 – Jan 11 '14 at 16:22
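A quick numerical sanity check of the functional equation from the comment above, comparing a truncated series for $E(x)$ with the right-hand side (the truncation length `N = 60` is an arbitrary choice; the series converges rapidly since $\epsilon_n$ grows only logarithmically):

```python
from math import comb, exp

# eps_n for p = 1/2 from the recursion, with eps_0 = 0
N = 60
e = [0.0] * (N + 1)
for n in range(1, N + 1):
    s = sum(comb(n, j) * e[j] for j in range(1, n))
    e[n] = (1 + 2**-n * s) / (1 - 2**-n)

def E(x):
    """Truncated exponential generating function sum_{n>=1} eps_n x^n / n!."""
    total, fact = 0.0, 1.0
    for n in range(1, N + 1):
        fact *= n
        total += e[n] * x**n / fact
    return total

x = 1.0
print(E(x), exp(x) - 1 + exp(x / 2) * E(x / 2))  # the two values should agree
```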

1 Answer


Update: Write $q=1-p$ and recall from the formula below that $\epsilon_n=\sum_{k\geq 0}[1-(1-q^k)^n]$. For fixed $n$, the function $x\mapsto 1-(1-q^x)^n$ is decreasing in $x>0$, so comparing the sum with the integral term by term gives $$\epsilon_n\geq \int^\infty_0 1-(1-q^x)^n\,dx\geq \epsilon_n-1.$$

Now, with the change of variables $x=w \log(n)$, the integral becomes $$\int^\infty_0 1-(1-q^x)^n\,dx=\log(n) \int_0^\infty 1-\left(1-{1\over n^{w \log(1/q)}}\right)^n\,dw.$$ The integrand converges to $1$ if $w < 1/\log(1/q)$ and to $0$ if $w > 1/\log(1/q)$, so by dominated convergence $$\int_0^\infty 1-\left(1-{1\over n^{w \log(1/q)}}\right)^n\,dw\to{1\over\log(1/q)},$$ and we deduce that $$\epsilon_n\sim {\log(n)\over \log(1/q)}\qquad (n\to\infty).$$ For $p=1/2$ this gives $\epsilon_n\sim\log_2(n)$.
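To see how quickly this asymptotic regime sets in, here is a minimal numerical sketch (it uses the series representation for $\epsilon_n$ given below; the truncation tolerance `tol` is an arbitrary choice) comparing $\epsilon_n$ with $\log(n)/\log(1/q)$:

```python
from math import log

def eps_series(n, q, tol=1e-15):
    """eps_n = sum_{k>=0} [1 - (1 - q^k)^n]; the terms decay like n*q^k."""
    total, k, term = 0.0, 0, 1.0
    while term > tol:
        term = 1 - (1 - q**k)**n
        total += term
        k += 1
    return total

q = 0.5
for n in (10, 100, 1000, 10000):
    print(n, eps_series(n, q), log(n) / log(1 / q))
```

The gap between the two columns stays bounded, consistent with the asymptotics above.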


Start with $n$ independent coins, each with probability $p$ of showing heads. Toss them all, put aside those that show heads, retoss the remaining coins, and repeat until every coin has shown heads. Your $\epsilon_n$ is the expected number of rounds in this experiment, i.e., the expected maximum of $n$ i.i.d. geometric random variables with success probability $p$.
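A direct Monte Carlo simulation of this experiment (a minimal sketch; the function name and trial count are arbitrary) should reproduce the values from the recursion:

```python
import random

def rounds_until_all_heads(n, p=0.5):
    """One run: toss the remaining coins each round until all have shown heads."""
    remaining, rounds = n, 0
    while remaining > 0:
        rounds += 1
        # each remaining coin shows heads with probability p and is set aside
        remaining = sum(1 for _ in range(remaining) if random.random() >= p)
    return rounds

n, trials = 10, 200_000
estimate = sum(rounds_until_all_heads(n) for _ in range(trials)) / trials
print(estimate)   # should be close to eps_10 (about 4.73 for p = 0.5)
```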

Here is an explicit formula: $$\epsilon_n=\sum_{j\geq 1}{n\choose j}{(-1)^{j+1}\over 1-q^j}=\sum_{k\geq 0}\left[1-(1-q^k)^n\right],$$ where $q=1-p$. The first expression follows by inclusion-exclusion (the minimum of $j$ independent geometric variables is geometric with success probability $1-q^j$), and the second from $\epsilon_n=\sum_{k\geq 0}P(\text{more than } k \text{ rounds are needed})$. Perhaps the asymptotics can be derived from this expression.
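The two forms are easy to check against each other numerically (a minimal sketch; the truncation length `K` for the infinite sum is an arbitrary choice):

```python
from math import comb

def eps_incl_excl(n, q):
    """First form: inclusion-exclusion, sum_{j=1}^{n} C(n,j) (-1)^(j+1) / (1 - q^j)."""
    return sum(comb(n, j) * (-1)**(j + 1) / (1 - q**j) for j in range(1, n + 1))

def eps_tail_sum(n, q, K=200):
    """Second form: sum_{k>=0} [1 - (1 - q^k)^n], truncated at K terms."""
    return sum(1 - (1 - q**k)**n for k in range(K))

for n in (1, 5, 10, 20):
    print(n, eps_incl_excl(n, 0.5), eps_tail_sum(n, 0.5))  # the columns agree
```

Note that the alternating first form suffers from cancellation in floating point for large $n$, so the tail-sum form is the better one to compute with.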

In the reference given below, the author says: "A graph of the data in TABLE 2 strongly suggests that E[Y] (i.e., your $\epsilon_n$) is a logarithmic function of $n$." But no proof is offered.

John Kinney, "Tossing Coins Until All Are Heads," Mathematics Magazine, Vol. 51, No. 3 (May 1978), pp. 184-186.