
Let $L_n$ be the length of the longest run of heads in $n$ coin tosses, where each toss gives a head with probability $p$. It is known that

$$ \forall \epsilon>0:\quad \lim\limits_{n\to\infty}\mathbb{P} \left(\left|\frac{L_n}{\log_{1/p}n}-1\right|>\epsilon \right)=0. $$

I have the following problem. Suppose now that the probability of a head is $p_n$, i.e. it depends on the number of coin tosses, and that $p_n\rightarrow 1$ as $n\rightarrow\infty$. More precisely, $p_n=1-\frac{\gamma}{n}$ with $\gamma>0$. What can be said about $L_n$ now? My guess is that it grows exponentially or linearly, so that

$$ L_n = O_p\left(e^n\right) $$

or, alternatively,

$$ L_n=O_p(n) $$

instead of $L_n = O_p\left(\log n\right)$, as in the constant-$p$ case treated by Schilling (1990).

1 Answer


Surely, $L_n$ cannot grow exponentially, as it never exceeds $n$, so the first guess can be dismissed at once. But the second guess is correct: the order of $L_n$ is exactly $n$. However, $L_n/n$ converges not to a constant but in distribution. A quick way to see this is to note that $L_n = n$ with probability $(1-\gamma/n)^n \to e^{-\gamma}$. In fact, it is possible to identify the limit distribution explicitly.
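A quick Monte Carlo sketch of this first observation (parameter values $\gamma = 1$, $n = 2000$ are illustrative, not from the problem): the fraction of samples with $L_n = n$ should be close to $e^{-\gamma}$.

```python
import math
import random

def longest_head_run(n, p, rng):
    """Length of the longest run of heads in n flips with P(head) = p."""
    best = cur = 0
    for _ in range(n):
        if rng.random() < p:
            cur += 1
            best = max(best, cur)
        else:
            cur = 0
    return best

# Illustrative parameters: gamma = 1, n = 2000, 4000 repetitions.
gamma, n, trials = 1.0, 2000, 4000
rng = random.Random(42)
runs = [longest_head_run(n, 1 - gamma / n, rng) for _ in range(trials)]
frac_full = sum(r == n for r in runs) / trials  # fraction with L_n = n
print(frac_full, math.exp(-gamma))  # both should be near 0.37
```

The empirical histogram of `runs` divided by $n$ also shows visibly that $L_n/n$ stays spread over $(0,1]$ rather than concentrating at a point.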

Let $T^n_k$ be the moment when the $k$th tail appears, and let $X_n = \#\{k\colon T^n_k\le n\}$ be the number of tails in the first $n$ flips. It has the binomial $(n,\gamma/n)$ distribution, and it is well known that $X_n$ converges in distribution, as $n\to\infty$, to a random variable $X$ with the Poisson distribution with parameter $\gamma$.
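The binomial-to-Poisson convergence is easy to check numerically; the values $\gamma = 2$, $n = 100{,}000$ below are illustrative.

```python
from math import comb, exp, factorial

# Compare Binomial(n, gamma/n) and Poisson(gamma) probability mass functions.
gamma, n = 2.0, 100_000  # illustrative values
p = gamma / n
binom_pmf = [comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(8)]
poisson_pmf = [gamma**k * exp(-gamma) / factorial(k) for k in range(8)]
for k in range(8):
    print(k, round(binom_pmf[k], 6), round(poisson_pmf[k], 6))
```

For $n$ this large the two columns agree to several decimal places, consistent with a total-variation distance of order $\gamma^2/n$.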

Further, it is easy to see that, given $X_n = k$, the set $\{T^n_1,T^n_2,\dots,T^n_k\}$ is uniformly distributed over the $k$-element subsets of $\{1,\dots,n\}$. Therefore, as $n\to\infty$, the distribution of $(T^n_1/n,T^n_2/n,\dots,T^n_k/n)$ given $X_n=k$ converges to the uniform distribution on $$\mathbb{S}_k = \{(x_1,\dots,x_k): 0\le x_1\le \dots\le x_k\le 1\}.$$ The lengths of the head runs in the first $n$ flips are just $T^n_i-T^n_{i-1}-1$, $i\ge 1$ (with $T^n_0 = 0$), and $n-T^n_{X_n}$. Therefore, given $X_n=k$, these lengths, divided by $n$, converge in distribution to $(S_1,\dots,S_{k+1}) = (\tau_1, \tau_2 - \tau_1,\tau_3-\tau_2,\dots,\tau_k - \tau_{k-1},1-\tau_k)$, where the vector $(\tau_1,\dots,\tau_k)$ is uniformly distributed over $\mathbb S_k$. As a result, given $X_n=k$, $L_n/n$ converges in distribution, as $n\to\infty$, to $$ M_k = \max\{S_1,\dots,S_{k+1}\}, $$ where the vector $(S_1,\dots,S_{k+1})$ is uniformly distributed over the simplex $\mathbb T_{k+1} = \{(x_1,\dots,x_{k+1}) : x_i\ge 0,\ x_1+\dots + x_{k+1} = 1\}$.
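The whole limiting picture can be tested by simulation: sample $K\sim\mathrm{Poisson}(\gamma)$, then $M_K$ as the maximal spacing cut from $[0,1]$ by $K$ uniform points (with $M_0 = 1$), and compare with a direct simulation of $L_n/n$ for large $n$. All parameter values below are illustrative.

```python
import math
import random

def max_spacing(k, rng):
    """Max of the k+1 spacings cut from [0,1] by k i.i.d. uniform points;
    the spacings vector is uniform on the simplex T_{k+1} (M_0 = 1)."""
    pts = sorted(rng.random() for _ in range(k))
    cuts = [0.0] + pts + [1.0]
    return max(b - a for a, b in zip(cuts, cuts[1:]))

def poisson(lam, rng):
    """Poisson(lam) sample via Knuth's multiplication method."""
    threshold, k, prod = math.exp(-lam), 0, rng.random()
    while prod > threshold:
        k += 1
        prod *= rng.random()
    return k

def longest_head_run(n, p, rng):
    """Longest run of heads in n flips with P(head) = p."""
    best = cur = 0
    for _ in range(n):
        if rng.random() < p:
            cur += 1
            best = max(best, cur)
        else:
            cur = 0
    return best

# Illustrative parameters: gamma = 2, n = 1000, 4000 repetitions.
rng = random.Random(7)
gamma, n, trials = 2.0, 1000, 4000
direct = [longest_head_run(n, 1 - gamma / n, rng) / n for _ in range(trials)]
limit = [max_spacing(poisson(gamma, rng), rng) for _ in range(trials)]
print(sum(direct) / trials, sum(limit) / trials)  # means should be close
```

Beyond the means, the two empirical distribution functions lie on top of each other for $n$ this large, up to a discretization error of order $1/n$.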

Hence we can find the distribution of the limit $L$ of $L_n/n$. It has an atom at $1$ (corresponding to the case $X = 0$) of size $e^{-\gamma}$ and a density on $(0,1)$ of the form $$ f(x) = \sum_{k=1}^\infty \mathsf{P}(X = k) f_k (x) = \sum_{k=1}^\infty\frac{\gamma^k e^{-\gamma}}{k!} f_k(x), $$ where $f_k$ is the density of $M_k$. Inclusion-exclusion over the events $\{S_j > x\}$ gives $$ \mathsf{P}(M_k\le x) = \sum_{i=0}^{\lfloor 1/x\rfloor} (-1)^i {k+1\choose i}(1-ix)^{k}, $$ and differentiating (using $i{k+1\choose i} = (k+1){k\choose i-1}$) yields $$ f_k(x) = (k+1)k\sum_{i=0}^{\lfloor 1/x\rfloor-1} (-1)^{i}{k\choose i}\big(1-(i+1)x\big)^{k-1}. $$ (Note that $f_k(x) = 0$ for $x<1/(k+1)$.) Finally we get $$ f(x) = \sum_{k=1}^\infty\frac{\gamma^k e^{-\gamma}(k+1)}{(k-1)!}\sum_{i=0}^{\lfloor 1/x\rfloor-1} (-1)^{i}{k\choose i}\big(1-(i+1)x\big)^{k-1}. $$
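As a sanity check, the inclusion-exclusion density of $M_k$ (written so that the $i$-th term carries $(1-(i+1)x)^{k-1}$) can be validated numerically: it should be nonnegative, vanish below $1/(k+1)$, and integrate to one. The values of $k$ and the grid size below are arbitrary.

```python
from math import comb, floor

def f_k(x, k):
    """Density of M_k, the max coordinate of a uniform point on the simplex
    T_{k+1}; the i-th inclusion-exclusion term carries (1-(i+1)x)^(k-1)."""
    if x <= 1 / (k + 1) or x >= 1:
        return 0.0
    return (k + 1) * k * sum(
        (-1) ** i * comb(k, i) * (1 - (i + 1) * x) ** (k - 1)
        for i in range(floor(1 / x))
    )

# Midpoint rule: each density should integrate to 1 over [0, 1].
steps = 100_000
totals = {}
for k in (1, 2, 3, 5):
    totals[k] = sum(f_k((j + 0.5) / steps, k) for j in range(steps)) / steps
    print(k, round(totals[k], 4))  # each value should be close to 1
```

Spot checks agree with hand computation: e.g. for $k=2$ and $x\in(1/2,1]$ the formula reduces to $6(1-x)$.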

It is possible to rearrange the sum and to write it somewhat shorter. I'll write the ultimate expression for the cdf $F(x) = \mathsf{P}(L<x)$ rather than that for the density, as it looks nicer and includes both continuous part and the jump part: $$ F(x) = e^{-\gamma}\left(\sum_{i=0}^{\lfloor 1/x\rfloor} \frac{\big(\gamma(ix-1)\big)^i}{i!}e^{\gamma(1-ix)} - \sum_{i=0}^{\lfloor 1/x\rfloor - 1} \frac{\big(\gamma((i+1)x-1)\big)^i}{i!}e^{\gamma(1-(i+1)x)}\right),\quad x\ge 0. $$ Here $\sum_{i=0}^{-1} = 0$ and $0^0 = 1$.
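For what it's worth, the closed-form cdf is easy to transcribe and sanity-check numerically (the value of $\gamma$ below is illustrative): $F$ should be nondecreasing, equal $1$ above the support, and have a jump of size $e^{-\gamma}$ at $x = 1$.

```python
from math import exp, factorial, floor

def cdf_limit(x, gamma):
    """F(x) = P(L < x) for the distributional limit L of L_n / n.
    Direct transcription of the closed form; Python's 0.0**0 == 1.0
    matches the stated convention, and floor(1/x) = 0 for x > 1
    makes the second sum empty there."""
    if x <= 0:
        return 0.0
    m = floor(1 / x)
    s1 = sum((gamma * (i * x - 1)) ** i / factorial(i)
             * exp(gamma * (1 - i * x)) for i in range(m + 1))
    s2 = sum((gamma * ((i + 1) * x - 1)) ** i / factorial(i)
             * exp(gamma * (1 - (i + 1) * x)) for i in range(m))
    return exp(-gamma) * (s1 - s2)

gamma = 1.5  # illustrative value
grid = [j / 100 for j in range(1, 131)]
vals = [cdf_limit(x, gamma) for x in grid]
print(cdf_limit(1.0, gamma), 1 - exp(-gamma))  # jump of e^{-gamma} at x = 1
print(cdf_limit(1.2, gamma))                   # equals 1 above the support
```

On $(1/2,1)$ the formula collapses to $1-(1+\gamma(1-x))e^{-\gamma x}$, which matches the direct computation via $F_k(x) = 1-(k+1)(1-x)^k$ on that interval.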

zhoraster
  • I guess $Z^n(t)$ is $Z^n(t) = X^n_{\lfloor tn\rfloor}$ instead of $Z^n(t) = X^n_{\lfloor rn\rfloor}$, right? Secondly, I do not understand the meaning of $\Pi_{\gamma}\left(1\right)$ in $L = \max\left\{\max_{1\le i\le\Pi_\gamma(1)} E_i,\ 1 - \tau_i\right\}$. – AlmostSureUser Oct 20 '16 at 09:04
  • Third, I do not get what you mean with "the joint distribution of $E_1,\dots, E_{k}$ and $\tau_k$ is uniform over the simplex $\mathbb{T}_{k+1}$". – AlmostSureUser Oct 20 '16 at 09:10
  • @AlmostSureUser, I have rewritten this part. Please check if it is clearer now. – zhoraster Oct 20 '16 at 11:24