
I have the following problem. I toss coins until I get $n$ heads and then stop. The complication is that the probability of getting a head is itself a uniform random variable in the range $[0,1]$. The probability of getting a head is however the same for all coins.

How can you get the probability distribution function of the time to get $n$ heads under these circumstances?

3 Answers


Suppose $X_1,X_2,X_3,\ldots$ are conditionally independent given $R$, and $R$ is uniformly distributed on $[0,1]$, and $\Pr(X_k=1\mid R)= R$ and $\Pr(X_k=0\mid R)=1-R$.

Let $K=\min\{k : X_1+\cdots+X_k=n\}$. Then $$ \begin{align} & \Pr(K=k\mid R) \\[10pt] = {} & \Pr(\text{exactly $n-1$ successes in the first $k-1$ trials}\mid R) \\ & {} \times\Pr(\text{success on the $k$th trial}\mid R) \\[10pt] = {} & \binom{k-1}{n-1} R^{n-1} (1-R)^{k-n} \cdot R = \binom{k-1}{n-1} R^n (1-R)^{k-n}. \end{align} $$ Then $$ \Pr(K=k)=\mathbb E(\Pr(K=k\mid R)) = \int_0^1 \binom{k-1}{n-1} r^n(1-r)^{k-n}\,dr. $$ By the standard Beta-integral identity, $$ \int_0^1 r^n(1-r)^{k-n}\,dr = \frac{1}{(k+1)\dbinom k n}. $$

After simplifying, using $\binom{k-1}{n-1}\Big/\dbinom{k}{n} = \dfrac{n}{k}$, we have $$ \Pr(K=k) = \frac{n}{k(k+1)} \text{ for }k=n,n+1,n+2,\ldots. $$
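As a sanity check, this pmf can be compared against a Monte Carlo simulation (a minimal sketch; the function and variable names are mine, not from the answer): draw the shared head probability $R$ once per experiment, toss until $n$ heads appear, and compare the empirical distribution of $K$ with $n/(k(k+1))$.

```python
import random

def sample_K(n, rng):
    """One experiment: draw R ~ Uniform(0,1), toss until n heads, return the toss count."""
    r = rng.random()                  # the common, unknown head probability
    heads = tosses = 0
    while heads < n:
        tosses += 1
        if rng.random() < r:
            heads += 1
    return tosses

rng = random.Random(0)
n, trials = 3, 200_000
counts = {}
for _ in range(trials):
    k = sample_K(n, rng)
    counts[k] = counts.get(k, 0) + 1

# Compare empirical frequencies with the exact pmf n/(k(k+1)).
for k in range(n, n + 5):
    print(k, round(counts.get(k, 0) / trials, 4), round(n / (k * (k + 1)), 4))
```

With 200,000 trials the empirical frequencies agree with $n/(k(k+1))$ to a few decimal places.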


Let $X$ be the probability of getting a head; it is a uniform random variable. Let $T$ be the time to get $n$ heads.

If you knew what $X$ was, say $X=x$, then $T$ would be the sum of $n$ independent geometric random variables, each with success probability $x$; that is, $T \mid X=x$ has a negative binomial distribution.

So, by the law of total probability, the probability distribution for $T$ is $f_T(t) = \int_0^1 f_{T \mid X=x}(t)\, f_X(x)\,dx$, where $f_{T \mid X=x}(t)$ is the pmf of the sum of the $n$ geometric random variables and $f_X(x)=1$ is the uniform density.
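Carrying out this integral exactly (a sketch; the notation is mine): given $X=x$, the pmf of $T$ is the negative binomial pmf $\binom{t-1}{n-1}x^n(1-x)^{t-n}$, and integrating it against $f_X(x)=1$ uses the Beta integral $\int_0^1 x^a(1-x)^b\,dx = \frac{a!\,b!}{(a+b+1)!}$, which recovers the first answer's $n/(k(k+1))$.

```python
from fractions import Fraction
from math import comb, factorial

def pmf_T(n, t):
    """Exact value of ∫_0^1 C(t-1, n-1) x^n (1-x)^(t-n) dx via the Beta integral."""
    return comb(t - 1, n - 1) * Fraction(factorial(n) * factorial(t - n),
                                         factorial(t + 1))

# The marginal pmf collapses to n / (t (t + 1)).
n = 4
for t in range(n, n + 6):
    assert pmf_T(n, t) == Fraction(n, t * (t + 1))
print("f_T(t) = n/(t(t+1)) verified for n = 4, t = 4..9")
```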

angryavian

Oops.

It seems I misread the question. You can skip the old answer and go straight to the new one.

Let $H$ be the probability of getting a head for each new coin tossed. For simplicity we'll assume an unlimited supply of coins, with a different one used for each toss, so that the bias of each coin is determined independently with probability density function $f_H(p)$.

The usual geometric distribution of $n-1$ tails before the first head is modified to:

$$\Pr(N=n) = \int_0^1 p\;f_H(p) \operatorname{d}p \times \prod_{k=1}^{n-1} \left(1-\int_0^1 p\;f_H(p)\operatorname{d}p\right) $$

Or equivalently $$\Pr(N=n) = E[H]\,(1-E[H])^{n-1}$$

And when $H$ is uniformly distributed this becomes:

$f_H(p) = \operatorname{\bf 1}_{[0,1]}(p), \quad E[H]=\tfrac 12$

$$\therefore \Pr(N=n) = (\frac 12)^n$$
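A quick simulation of this fresh-coin-per-toss model (an illustrative sketch; none of the code comes from the answer itself): each toss draws a new bias $p \sim \mathrm{Uniform}(0,1)$, so marginally every toss is a fair flip and the first head arrives at toss $n$ with probability $(1/2)^n$.

```python
import random

rng = random.Random(1)
trials = 200_000
counts = {}
for _ in range(trials):
    n = 0
    while True:
        n += 1
        p = rng.random()              # fresh bias for this coin
        if rng.random() < p:          # head with probability p
            break
    counts[n] = counts.get(n, 0) + 1

# Compare empirical frequencies with (1/2)^n.
for n in range(1, 5):
    print(n, round(counts.get(n, 0) / trials, 4), 0.5 ** n)
```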


If "the probability of getting a head is however the same for all coins," then you may as well select one coin and use it for all tosses.

So in this case, with $f_H(p)=1$ and the substitution $q=1-p$: $$\begin{align}\Pr(N=n) & = \int_0^1 (1-p)^{n-1}\;p\;f_H(p)\operatorname{d}p \\ & = \int_0^1 q^{n-1}(1-q)\operatorname{d} q \\ & = \int_0^1 q^{n-1}-q^n\operatorname{d} q \\ & = \left[\frac {q^n}{n}-\frac{q^{n+1}}{n+1}\right]_{q=0}^{q=1} \\ & = \frac {1}{n}-\frac{1}{n+1} \\ & = \frac{1}{n(n+1)} \end{align}$$
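As a small exact check of this pmf (a sketch using Python's `fractions`; nothing here is from the answer itself), the telescoping form $\frac1n - \frac1{n+1}$ confirms the probabilities sum to 1 over $n = 1, 2, 3, \ldots$:

```python
from fractions import Fraction

# Pr(N=n) = 1/(n(n+1)) = 1/n - 1/(n+1), so partial sums telescope toward 1.
probs = [Fraction(1, n * (n + 1)) for n in range(1, 1001)]
assert all(p == Fraction(1, n) - Fraction(1, n + 1)
           for n, p in zip(range(1, 1001), probs))
print(sum(probs))                     # 1 - 1/1001 by telescoping
```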

Graham Kemp