12

100 participants each have a fair coin. On a given round, every participant not already discarded flips their coin; those who flip tails are discarded from the game. The remaining participants continue to play until nobody is left (everyone has been discarded).

  1. What would be the average number of trials (where each trial consists of one round of tossing the coins and removing the tails) one would expect from this experiment?

  2. Does conditional expectation work for something like this?

I know that each individual coin follows a Geometric distribution, but I am trying to figure out the sum of them to determine the average number of trials for a game like this.

My Logic/Thought Process: I started out by thinking about the probability that a particular coin makes it to round $r$, which is $\frac{1}{2^r}$. I then realized that each coin's outcome can be modeled by a Geometric random variable with $p = 0.5$. I am just unsure how to take the leap from this single case to a case with 100 coins. I presume it has to do with summing the geometric random variables, but I am not sure.
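The game itself is straightforward to simulate, which gives a sanity check on any formula; here is a minimal Monte Carlo sketch (the function name and trial count are my own choices, not from the thread):

```python
import random

def simulate_game(n=100, rng=random.Random(0)):
    """Play one game: flip every surviving coin each round, discard the tails."""
    rounds = 0
    while n > 0:
        # Each surviving coin stays in play with probability 1/2 (heads).
        n = sum(rng.random() < 0.5 for _ in range(n))
        rounds += 1
    return rounds

# Average over many games; for n = 100 this lands near 7.98.
trials = 10000
print(sum(simulate_game(100) for _ in range(trials)) / trials)
```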

  • 1
    What have you tried? What is the number of trials if you started with 1 coin? 2 coins? 3 coins? – Calvin Lin Dec 11 '20 at 16:04
  • @CalvinLin just added my thought process above –  Dec 11 '20 at 16:13
  • What is the expected number of trials if you started with 2 coins? – Calvin Lin Dec 11 '20 at 16:13
  • Hint: per Calvin, work out smaller cases (1, 2, 3 coins) with conditional expectation. – Neat Math Dec 11 '20 at 16:16
  • So you expect it to take 1 trial to remove both coins? Please show your work. – Calvin Lin Dec 11 '20 at 16:19
  • @CalvinLin $\sum^2_{x=1} \frac{2!}{1! * 1!} (.5)^x (.5)^{2-x}$ –  Dec 11 '20 at 16:22
  • You're doing a summation on x, but the RHS has no x terms. So you expect it to take $2 \cdot 2 \cdot 0.5 \cdot 0.5 = 1$ trial? Are you sure? Please show your work and explain what you are doing. – Calvin Lin Dec 11 '20 at 16:24
  • @CalvinLin Just fixed it. I tried using a binomial, but now that I think about it, the sum of geometric random variables yields a negative binomial, so should I be doing something with that instead? –  Dec 11 '20 at 16:26
  • @user839334 A) Are you sure of your claim? B) Even if your claim is true, how does it apply to this setup? Please show your work / reasoning. C) You seem to be mixing up several concepts, like expected value. Without a sufficiently detailed explanation, I don't know what you're thinking of and where the error is. – Calvin Lin Dec 11 '20 at 16:35
  • Just to spell out the connection with the proposed duplicate, rather than taking a sum of the random variables (outcomes for individual coins), you want the (expected value of the) maximum of those variables. – hardmath Dec 12 '20 at 17:52

4 Answers

7

This is essentially equivalent to computing the expected value of the maximum of $n=100$ iid geometric random variables with $p=\frac12$.

(BTW: the linked question includes the recursion given in @saulspatz's answer.)

There is no closed-form solution, but the following approximation for large $n$ (with bounds) is given:

$$E_n \approx \frac{1}{2} + \frac{1}{\lambda} H_n$$

where $\lambda = -\log(1-p) = \log 2 = 0.69314718\cdots$ and $H_n$ is the $n$-th harmonic number.

For example, for $n=3$ this gives $E_3 \approx 3.14494$, very near the exact value $E_3 = 22/7 \approx 3.14286$.

For $n=100$ this gives $E_{100} \approx 7.98380382$.
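In code, the approximation above is a few lines (a sketch; `approx_expected_rounds` is my own name for it):

```python
import math

def approx_expected_rounds(n):
    """Approximate E_n ~ 1/2 + H_n / lambda, with lambda = -log(1-p) = log 2."""
    h_n = sum(1 / k for k in range(1, n + 1))  # n-th harmonic number
    return 0.5 + h_n / math.log(2)

print(approx_expected_rounds(3))    # ≈ 3.14494
print(approx_expected_rounds(100))  # ≈ 7.98380
```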

More in W. Szpankowski and V. Rego, "Yet another application of a binomial recurrence. Order statistics", Computing 43(4), 401–410, 1990.

leonbloy
  • 63,430
6

I doubt that there's a simple expression for the expectation. Let $E_n$ be the expected number of trials when $n$ coins remain, so that we are asked to compute $E_{100}$. We know that $E_0=0$ and that $E_1=2$. Now $$E_2=1+\frac14E_2+\frac12E_1+\frac14E_0$$ because we have to make one trial, and with probability $\frac14$ we throw two heads and still have two coins, with probability $\frac12$ we throw a head and a tail, and with probability $\frac14$, we throw two tails, and the experiment ends. This gives $E_2=\frac83$.

We can continue in this manner: $$E_3=1+\frac18E_3+\frac38E_2+\frac38E_1+\frac18E_0$$ which gives $E_3=\frac{22}7$ if I'm not mistaken.

One could easily write a computer program to work up to $E_{100}$ with this recursion, but it would be easier to proceed by simulation.

EDIT

I wrote the script I suggested. The exact value is a fraction whose numerator has $894$ decimal digits and whose denominator has $893$. The approximate value is $7.98380153515692$.

saulspatz
  • 53,131
  • I got those initial values too. I agree that this is likely a computing process (though OP doesn't seem to have a good enough grasp of statistics) – Calvin Lin Dec 11 '20 at 16:24
  • @saulspatz - could this be written in summation notation? –  Dec 11 '20 at 17:04
  • 1
    @user839334 $$E_n=\frac{1}{2^n-1}\left(2^n+\sum_{k=0}^{n-1}\binom nkE_k\right)$$ – saulspatz Dec 11 '20 at 17:08
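This recursion runs easily in exact rational arithmetic; a sketch (assuming Python 3.8+ for `math.comb`; the function name is mine):

```python
from fractions import Fraction
from math import comb

def exact_expected_rounds(n):
    """Exact E_n via the recursion E_n = (2^n + sum_{k<n} C(n,k) E_k) / (2^n - 1)."""
    E = [Fraction(0)]  # E_0 = 0
    for m in range(1, n + 1):
        s = sum(comb(m, k) * E[k] for k in range(m))
        E.append((2**m + s) / Fraction(2**m - 1))
    return E[n]

assert exact_expected_rounds(3) == Fraction(22, 7)
print(float(exact_expected_rounds(100)))  # ≈ 7.98380153515692
```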
2

Searching OEIS with @saulspatz's first values, we can find that:

$$E_n = \frac{a(n)}{b(n)}$$

where $a(n)$ is OEIS A158466 and $b(n)$ is OEIS A158467. At OEIS A158466 you can also find the following formulas:

$$E_n = -\sum_{k=1}^n (-1)^k \frac{{n \choose k}}{1-\frac{1}{2^k}}$$

$$E_n = \sum_{k=1}^{\infty} k \left(\left(1-\frac{1}{2^k}\right)^n - \left(1-\frac{1}{2^{k-1}}\right)^n\right)$$

and thus (see here):

$$E_{100} \approx 7.983801535$$
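The second (infinite-sum) formula converges quickly, which makes it convenient numerically; a sketch in Python (the truncation point is my own choice — in double precision the terms vanish beyond $k \approx 53$ anyway):

```python
def series_expected_rounds(n, terms=200):
    """Evaluate E_n = sum_{k>=1} k((1 - 2^-k)^n - (1 - 2^-(k-1))^n), truncated."""
    return sum(
        k * ((1 - 0.5**k) ** n - (1 - 0.5 ** (k - 1)) ** n)
        for k in range(1, terms + 1)
    )

print(series_expected_rounds(100))  # ≈ 7.983801535
```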

1

Set $N_0=100$ and take $N_k$ to be the number of coins that remain after the $k^\text{th}$ trial in this process. So we can say something like $$P(N_1=81|N_0=100)={100 \choose 19}\Big(\frac{1}{2}\Big)^{100}$$

Now for $i\in \{0,1,\ldots, 100\}$ and $j\in \{0,1,\ldots ,i\}$ we have $$P(N_{k+1}=j|N_{k}=i)={i \choose j}\Big(\frac{1}{2}\Big)^i$$ Notice that $\{N_k\}_{k=0}^{\infty}$ is an absorbing Markov chain with $0$ as an absorbing state. You're looking to compute the expected number of trials in this random process before being absorbed in state $0$, starting from state $100$. There are many ways to compute this expected value; the most efficient is probably to make use of the fundamental matrix, which you can learn about here.
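Because the chain only moves downward (from state $i$ to some $j \le i$), its transition matrix is triangular, so the expected absorption times can be obtained by forward substitution rather than by explicitly inverting the fundamental matrix. A sketch under that observation (the function name is mine):

```python
from math import comb

def expected_absorption_time(n=100):
    """Expected trials to reach state 0, solving t_i = 1 + sum_j P(i->j) t_j.

    With t_0 = 0 and P(i->i) = 2^-i, each t_i depends only on t_1, ..., t_{i-1}.
    """
    t = [0.0]  # t_0 = 0: already absorbed
    for i in range(1, n + 1):
        # Contribution of transitions to intermediate states 1..i-1.
        s = sum(comb(i, j) * t[j] for j in range(1, i)) / 2**i
        t.append((1 + s) / (1 - 0.5**i))
    return t[n]

print(expected_absorption_time(100))  # ≈ 7.98380154
```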

Matthew H.
  • 9,191