
Suppose I flip two fair coins at the same time repeatedly until I have seen at least one head on each of the coins. What is the expected number of times I would have to perform these flips until that condition is met?

So recursively, I've realized that the probability of meeting the condition on flip $k$ is: $$\mathsf P(\text{a head on exactly one coin in the previous flips})\\ \times\mathsf P(\text{getting a head on the other coin on this flip})\\ +\mathsf P(\text{no heads in the previous flips})\\ \times\mathsf P(\text{getting heads on both coins on this flip})$$ which leads me to: $$P(k) = 2\left(1-\tfrac{1}{2}\right)^{k-1}\left(1-\left(1-\tfrac{1}{2}\right)^{k-1}\right)\cdot\tfrac{1}{2} + \left(1-\tfrac{1}{2}\right)^{2k-2}\left(\tfrac{1}{2}\right)^{2}$$ Am I on the right track?

Parcly Taxel
  • 103,344
  • Any thoughts? What is the probability that it takes one try? Two? Three? – lulu Oct 24 '16 at 21:14
  • I tried that approach and it seemed to me I'd have to use Markov chains to obtain an answer somehow. – Brent Daniels Oct 24 '16 at 21:17
  • Markov chains? Definitely overkill. Though you can do the problem recursively without much fuss. Edit your question to include your calculation of the probabilities. – lulu Oct 24 '16 at 21:19
  • Hint: to do it recursively, let $E$ be the answer and consider the first trial. It either works or it leaves you back in the starting state. – lulu Oct 24 '16 at 21:20
  • @lulu Doesn't this question have an intermediate state also? (one coin has a head seen) – Erick Wong Oct 24 '16 at 22:13
  • You will find the answer here: http://math.stackexchange.com/questions/971214/expectation-of-maximum-of-two-geometric-random-variables –  Oct 24 '16 at 22:54
  • @ErickWong You are correct, I read too hastily. – lulu Oct 24 '16 at 22:59
  • @bof Indefinite. If you flip the coins multiple times you can see multiple heads on each coin. (Well, okay, the same head may make multiple showings.) – Graham Kemp Oct 24 '16 at 23:45
  • @bof Yes. The title is misleading. The body asked for the expected count of parallel trials until one success is viewed in each sequence of independent geometric random variables. (The expected max). – Graham Kemp Oct 25 '16 at 00:23
  • @BrentDaniels your latex is leaking off the side of the page – Human Oct 25 '16 at 02:50

4 Answers


$P(n)$ is the probability that the goal is reached exactly on the $n$-th toss: at least one coin sees its first head on the $n$-th toss, while the other coin has at least one head somewhere in those $n$ tosses.   By the principle of inclusion and exclusion (the subtracted term removes the double count of both coins getting their first head on toss $n$)

$$\begin{align}P(n) ~=~& 2(1-p)^{n-1}p\cdot (1-(1-p)^n)- (1-p)^{2n-2}p^2 \\[1ex] ~=~& \frac{2^{n+1}-3}{4^n}\end{align}$$ where the last step substitutes $p=\tfrac12$ for a fair coin.

Then you want $$\sum_{n=1}^\infty \frac{n\,(2^{n+1}-3)}{4^n}$$
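A quick numerical check of that series (a Python sketch added for verification; not part of the original answer):

```python
# Numerically evaluate sum_{n>=1} n * (2^(n+1) - 3) / 4^n using exact
# rational arithmetic, truncating once the terms are negligible.
from fractions import Fraction

total = Fraction(0)
for n in range(1, 200):  # terms decay like n/2^n, so 200 terms is plenty
    total += Fraction(n * (2**(n + 1) - 3), 4**n)

print(float(total))  # agrees with 8/3 ≈ 2.6667
```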


But that's the hard way.

It's the expected time until at least one of the coins shows heads, plus the conditional probability that only one shows heads (given that at least one does) times the expected additional time until the other shows heads.

The expected time until at least one coin shows heads is: $x=\bbox[white]{\color{white}{4/3}}$ throws.

Given that at least one coin shows heads, either both do with (conditioned) probability $p_2=\bbox[white]{\color{white}{1/3}}$, or only one does with probability $p_1=\bbox[white]{\color{white}{2/3}}$.

If only one coin shows heads, the expected time until the other does is: $y=\bbox[white]{\color{white}{~~2~~}}$ throws.

Then the expected time until both coins have shown heads at least once is: $x+p_1 y=\bbox[white]{\color{white}{8/3}}$
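Either way, the result is easy to verify empirically. A minimal Monte Carlo sketch (the helper name is my own, not from the answer):

```python
import random

def flips_until_both_heads(rng: random.Random) -> int:
    """Flip two fair coins together until each has shown heads at least once."""
    seen1 = seen2 = False
    flips = 0
    while not (seen1 and seen2):
        flips += 1
        seen1 = seen1 or rng.random() < 0.5
        seen2 = seen2 or rng.random() < 0.5
    return flips

rng = random.Random(0)
trials = 200_000
avg = sum(flips_until_both_heads(rng) for _ in range(trials)) / trials
print(avg)  # compare with the hidden value above
```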

Graham Kemp
  • 129,094

Answer to this Question

The probability of getting a head on one coin or the other is $\frac34$, so the average duration until the first head is $\frac43$ flips. The probability is $\frac13$ that it ends there with heads on both. Otherwise, the probability of getting a head on the other coin is $\frac12$, so the average duration is $2$ more flips to get a head on the second coin.

Therefore, the average number of flips is $\frac43+\frac23\cdot2=\frac83$.


Generalization

Let's consider the case where one coin has probability $p$ for heads and the other has a probability $q$ for heads.

The probability that one coin or the other comes up heads is $$ 1-(1-p)(1-q)=p+q-pq\tag{1} $$ So the average duration for the first head is $$ \frac1{p+q-pq}\tag{2} $$ In the case that one or the other comes up heads, the probability that it is the first is $\frac{p(1-q)}{p+q-pq}$, then the average duration until the second comes up heads is $\frac1q$; the probability that it is the second is $\frac{q(1-p)}{p+q-pq}$, then the average duration until the first comes up heads is $\frac1p$; the probability that it is both is $\frac{pq}{p+q-pq}$, but then we're done.

Therefore, the average duration is $$ \begin{align} &\frac1{p+q-pq}+\overbrace{\frac{p(1-q)}{p+q-pq}}^{\substack{\text{probability that}\\\text{coin $p$ is first}\\\text{without coin $q$}}}\frac1q+\overbrace{\frac{q(1-p)}{p+q-pq}}^{\substack{\text{probability that}\\\text{coin $q$ is first}\\\text{without coin $p$}}}\frac1p\tag{3}\\ &=-\frac1{p+q-pq}+\frac{p+q-pq}{p+q-pq}\frac1q+\frac{p+q-pq}{p+q-pq}\frac1p\tag{4}\\ &=\bbox[5px,border:2px solid #C0A000]{\frac1p+\frac1q-\frac1{p+q-pq}}\tag{5}\\ \end{align} $$ In $(4)$, we add $\frac1{p+q-pq}$ to the second and third terms while subtracting $\frac2{p+q-pq}$ from the first.
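A hedged Python sketch to check the boxed formula against simulation for unequal coins (the values $p=0.3$, $q=0.7$ are illustrative choices of mine, not from the answer):

```python
import random

def expected_flips(p: float, q: float) -> float:
    # The boxed result: 1/p + 1/q - 1/(p + q - pq)
    return 1/p + 1/q - 1/(p + q - p*q)

def simulate(p: float, q: float, trials: int, seed: int = 1) -> float:
    # Flip both coins together until each has shown heads at least once.
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        seen_p = seen_q = False
        flips = 0
        while not (seen_p and seen_q):
            flips += 1
            seen_p = seen_p or rng.random() < p
            seen_q = seen_q or rng.random() < q
        total += flips
    return total / trials

print(expected_flips(0.5, 0.5))   # 8/3 for two fair coins
exact = expected_flips(0.3, 0.7)
sim = simulate(0.3, 0.7, 100_000)
print(exact, sim)  # the two should agree to roughly two decimal places
```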

robjohn
  • 345,667

Generalizing slightly, let's say that a single flip of the $i^{\text{th}}$ coin ($i=1,2$) comes up heads with probability $p_i\gt0,$ tails with probability $q_i=1-p_i.$ Let $X_i$ be the number of times the $i^{\text{th}}$ coin is flipped, up to and including the first time it comes up heads. We want to find the expected value of the random variable $X=\max(X_1,X_2).$

Since $$X=\max(X_1,X_2)=X_1+X_2-\min(X_1,X_2),$$ by linearity we have $$E(X)=E(X_1)+E(X_2)-E(\min(X_1,X_2))$$ where, using the tail-sum formula $E(Y)=\sum_{n=0}^\infty P(Y>n)$, $$E(X_i)=\sum_{n=0}^\infty q_i^n=\frac1{1-q_i}=\frac1{p_i}$$ and $$E(\min(X_1,X_2))=\sum_{n=0}^\infty(q_1q_2)^n=\frac1{1-q_1q_2},$$ so that $$E(X)=\frac1{p_1}+\frac1{p_2}-\frac1{1-q_1q_2}=\boxed{\frac1{p_1}+\frac1{p_2}-\frac1{p_1+p_2-p_1p_2}}\ .$$

Likewise, for three coins, you could use the identity $$\max(X_1,X_2,X_3)=X_1+X_2+X_3-\min(X_1,X_2)-\min(X_1,X_3)-\min(X_2,X_3)+\min(X_1,X_2,X_3),$$ and so on.
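The inclusion-exclusion pattern extends mechanically to any number of coins, since $E(\min \text{ over } S)=1/(1-\prod_{i\in S}q_i)$. A Python sketch of that extension (the specific probabilities are illustrative, not from the answer):

```python
import random
from itertools import combinations

def e_max(ps):
    # E[max] = sum over nonempty subsets S of (-1)^(|S|+1) * E[min over S],
    # where E[min over S] = 1 / (1 - product of tail probabilities in S).
    qs = [1 - p for p in ps]
    total = 0.0
    for r in range(1, len(ps) + 1):
        for subset in combinations(qs, r):
            tails = 1.0
            for q in subset:
                tails *= q
            total += (-1) ** (r + 1) / (1 - tails)
    return total

def simulate(ps, trials=100_000, seed=2):
    # Flip all coins together until each has shown heads at least once.
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        seen = [False] * len(ps)
        flips = 0
        while not all(seen):
            flips += 1
            seen = [s or rng.random() < p for s, p in zip(seen, ps)]
        total += flips
    return total / trials

print(e_max([0.5, 0.5]))            # 8/3 for the two fair coins
exact3 = e_max([0.5, 0.3, 0.7])
sim3 = simulate([0.5, 0.3, 0.7])
print(exact3, sim3)                 # should agree closely
```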

bof
  • 78,265
  • (+1) Nice observation about $E(\max(X_1,X_2))$! Another approach is by inclusion-exclusion: the probability that at least one occurs is $p_1+p_2-p_1p_2$, so the expected duration until the first happens, $E(\min(X_1,X_2))=\frac1{p_1+p_2-p_1p_2}$. – robjohn Oct 25 '16 at 10:32

There are two states that keep you short of the goal:

A: You haven't seen any heads

B: You've seen a head on one of the coins

In the first $k$ flips you may be in state A, in the next $n-k-1$ flips you will be in state B, and on the $n$-th flip you achieve the goal; here $k$ ranges from $0$ to $n-1$.

To stay in state A you must get no heads, so each such flip has probability $\frac14$; to move into state B you need exactly one head, probability $2\cdot\frac14=\frac12$; to stay in state B the probability is also $\frac12$; and the final $n$-th flip succeeds with probability $\frac14$ if $k=n-1$ (both heads at once, straight from A) and $\frac12$ otherwise. $$P(n)=\sum_{k=0}^{n-2}\left(\frac 1 4\right)^k\left(\frac 1 2\right)^{n-k}+\left(\frac 1 4\right)^n=\sum_{k=0}^{n-2}\left(\frac 1 2\right)^{n+k}+\left(\frac 1 4\right)^n=\frac{2^{n-1}-1}{4^{n-1}}+\left(\frac 1 4\right)^n=\frac{2^{n+1}-3}{4^n}$$ $$E=\sum_{n=1}^{\infty}nP(n)=2\sum_{n=1}^{\infty}\frac n {2^n}-3\sum_{n=1}^{\infty}\frac n {4^n}=2\cdot 2-3\cdot\frac 4 9=\frac 8 3$$
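As a check (a Python sketch of mine, using exact rational arithmetic), the state-decomposition sum matches the closed form for the first several $n$:

```python
from fractions import Fraction

def p_first_success_at(n: int) -> Fraction:
    # sum_{k=0}^{n-2} (1/4)^k (1/2)^(n-k)  +  (1/4)^n
    s = sum(Fraction(1, 4)**k * Fraction(1, 2)**(n - k) for k in range(n - 1))
    return s + Fraction(1, 4)**n

for n in range(1, 10):
    assert p_first_success_at(n) == Fraction(2**(n + 1) - 3, 4**n)
print("state sum matches (2^(n+1) - 3) / 4^n for n = 1..9")
```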