
Suppose we throw $n$ independent dice. After each throw we set aside the dice showing $6$ and throw the remaining dice again, repeating the process until all dice have shown $6$. Let $N_n$ be the number of throws needed until all dice have shown $6$. Set $m_n := E[N_n]$ and let $Y$ denote the number of dice not showing $6$ after the first throw.

I've shown that $Y \sim b(n,\frac 5 6)$, because we can view the throw as $n$ independent trials with success probability $\frac 5 6$.

Now I must write a recursive formula for $m_n$ and determine $m_1, \ldots, m_5$.

Using the definition of expectation of a discrete RV:

$E[N_n] = 1 \cdot P(N_n = 1) + 2 \cdot P(N_n = 2) + \ldots$

But this is not recursive. Any suggestions?

BCLC
Shuzheng
  • See also http://math.stackexchange.com/questions/26167/expectation-of-the-maximum-of-iid-geometric-random-variables – Apr 10 '14 at 13:42

3 Answers


Two closed formulas, not based on the recursion you suggest: $$ E(N_n)=\sum_{k\geqslant0}\left(1-\left(1-\left(\frac56\right)^k\right)^n\right) $$ $$ E(N_n)=\sum_{i=1}^n{n\choose i}\frac{(-1)^{i+1}}{1-\left(\frac56\right)^i} $$
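The two formulas can be checked against each other numerically. The sketch below (Python; the function names and the truncation point for the infinite sum are my own choices, not part of the answer) evaluates both for small $n$:

```python
from math import comb

def m_tail(n, terms=2000):
    # First formula: E(N_n) = sum_{k>=0} [1 - (1 - (5/6)^k)^n], truncated.
    # The tail decays geometrically, so 2000 terms is far more than enough.
    return sum(1 - (1 - (5/6)**k)**n for k in range(terms))

def m_incl_excl(n):
    # Second formula: inclusion-exclusion over the n dice.
    return sum(comb(n, i) * (-1)**(i + 1) / (1 - (5/6)**i)
               for i in range(1, n + 1))

for n in range(1, 6):
    assert abs(m_tail(n) - m_incl_excl(n)) < 1e-9
# For n = 1 both give 6, the mean of a geometric(1/6) variable.
```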

Did
  • How did you do that? You made me curious, and a link would be highly appreciated. – drhab Apr 11 '14 at 09:50
  • @drhab The dice are independent and each produces a six at time $k$ or before, with probability $1-(5/6)^k$, hence the $k$th term of the first formula above is $P(N_n\gt k)$. To deduce the second formula, expand each binomial $(1-(5/6)^k)^n$ into powers of $5/6$ and interchange the order of the summations in the double sum this yields. – Did Apr 11 '14 at 10:36
  • I see (tibi gratias ago). It encourages me to investigate earlier whether the relation between $E(X)$ and the probabilities $P(X\geq n)$ can be exploited. It never crossed my mind in this situation. Big focus on recurrence of course, but that is not an excuse. – drhab Apr 11 '14 at 10:45

Consider the first throw. Given that we get no six, the expectation is $1+m_{n}$. Given that we get exactly $1$ six, the expectation is $1+m_{n-1}$. Given that we get $2$ sixes, the expectation is $1+m_{n-2}$, and so on. Thus $$m_n=\sum_{k=0}^n (1+m_{n-k})\binom{n}{k}\left(\frac{1}{6}\right)^{k}\left(\frac{5}{6}\right)^{n-k} .$$ We can simplify this a bit by solving for $m_n$ and combining the "$1$" terms. There may be further simplification.

Remark: We assumed that for example the first throw of the $n$ dice counts as $1$ throw. If throwing $k$ dice counts as $k$ throws, then the problem is simpler.
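Solving for $m_n$ as suggested (bring the $k=0$ term to the left side), the recursion is easy to evaluate. A Python sketch (the function name and the memoization are my own, not part of the answer):

```python
from functools import lru_cache
from math import comb

@lru_cache(maxsize=None)
def m(n):
    # Expected number of throws with n dice; m(0) = 0 by convention.
    if n == 0:
        return 0.0
    # After moving the k = 0 term to the left-hand side:
    # m_n * (1 - (5/6)^n) = 1 + sum_{k=1}^n m_{n-k} C(n,k) (1/6)^k (5/6)^{n-k}
    s = sum(m(n - k) * comb(n, k) * (1/6)**k * (5/6)**(n - k)
            for k in range(1, n + 1))
    return (1 + s) / (1 - (5/6)**n)

# m(1) = 6 (geometric with success probability 1/6); m(2) = 96/11 ≈ 8.7273.
```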

André Nicolas
  • Thank you. You are right, but I had intended $(1/6)^k(5/6)^{n-k}$, so I have altered that part instead of the "$m$" part. – André Nicolas Apr 10 '14 at 14:24
  • Thank you @André Nicolas. However "Consider the first throw. Given that we get no six, the expectation is $1+m_n$. Given that we get exactly 1 six, the expectation is $1+m_{n−1}$.", how do I show this is true in terms of the expectation definition and not just intuitively ? – Shuzheng Apr 10 '14 at 14:39
  • To be more specific how does one prove equality between your formula and $E[N_n] = 1 \cdot P(N_n = 1) + 2 \cdot P(N_n = 2) + \ldots$ ? – Shuzheng Apr 10 '14 at 14:47
  • As you can see, we are giving a conditional expectation argument, $E(N_n)=E((N_n|Y))$. I think the intuitive calculation of $E(N_n|Y=k)$ is enough. We throw the dice. If we get $k$ sixes, we have used up a throw, and the additional expectation is by definition $m_{n-k}$. – André Nicolas Apr 10 '14 at 14:50
  • Can one prove $E(N_n) = E((N_n \mid Y))$ or is this also intuition ? What is this rule called ? – Shuzheng Apr 10 '14 at 14:57
  • It even has a name, the Law of Total Expectation (see, e.g., Wikipedia, or many probability books.) Except that I did not type enough $E$'s above, it is $E(E(N_n|Y))$. – André Nicolas Apr 10 '14 at 15:05
  • Also, your formula is circular: $m_n$ is defined in terms of $m_n$ ? – Shuzheng Apr 10 '14 at 15:06
  • Sort of. That's why I mentioned solving for $m_n$. Bring the $k=0$ term to the left side. We get $\left(1-\left(\frac{5}{6}\right)^n\right)m_n=\sum_{k=1}^n(\text{the stuff})$. – André Nicolas Apr 10 '14 at 15:11

$$m_{n}=\mathbb{E}N_{n}=\sum_{y=0}^{n}\mathbb{E}\left(N_{n}\mid Y=y\right)P\left(Y=y\right)$$ Here $\mathbb{E}\left(N_{n}\mid Y=y\right)=m_{y}+1$, $m_{0}=0$ and applying the distribution of $Y$ this gives: $$m_{n}=\sum_{y=0}^{n}\left(m_{y}+1\right)\binom{n}{y}\left(\frac{5}{6}\right)^{y}\left(\frac{1}{6}\right)^{n-y}$$

and leads to: $$m_{n}=\sum_{y=0}^{n}m_{y}\binom{n}{y}\left(\frac{5}{6}\right)^{y}\left(\frac{1}{6}\right)^{n-y}+1=\sum_{y=1}^{n-1}m_{y}\binom{n}{y}\left(\frac{5}{6}\right)^{y}\left(\frac{1}{6}\right)^{n-y}+m_{n}\left(\frac{5}{6}\right)^{n}+1$$

resulting in: $$m_{n}=\frac{6^{n}+\sum_{y=1}^{n-1}m_{y}\binom{n}{y}5^{y}}{6^{n}-5^{n}}$$
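This form iterates directly and answers the question's request for $m_1,\ldots,m_5$. A sketch in Python (my own code, not part of the answer; using `Fraction` for exact arithmetic is a choice, not a requirement):

```python
from fractions import Fraction
from math import comb

def m_values(n_max):
    # Iterate m_n = (6^n + sum_{y=1}^{n-1} m_y C(n,y) 5^y) / (6^n - 5^n) exactly.
    m = {0: Fraction(0)}
    for n in range(1, n_max + 1):
        num = Fraction(6**n) + sum(m[y] * comb(n, y) * 5**y for y in range(1, n))
        m[n] = num / (6**n - 5**n)
    return m

vals = m_values(5)
# vals[1] == 6, the geometric-distribution check mentioned in the comments;
# vals[2] == Fraction(96, 11).
```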


Addendum (in order to say something about the distribution of $N_n$)

Give the dice the numbers $1,2,\dots,n$ and let $D_{i}$ denote the number of throws that are needed for the die with number $i$ to produce a $6$.

Note that $N_{n}\leq k$ if and only if $D_{i}\leq k$ for $i=1,\dots,n$. That means that: $$P\left(N_{n}\leq k\right)=P\left(D_{1}\leq k\right)\times\cdots\times P\left(D_{n}\leq k\right)$$ Here $P\left(D_{i}\leq k\right)=1-P\left(D_{i}>k\right)=1-\left(\frac{5}{6}\right)^{k}$ for every $i$, leading to: $$P\left(N_{n}\leq k\right)=\left(1-\left(\frac{5}{6}\right)^{k}\right)^{n}$$ In one of my comments I stated that it would be 'quite a job to find the distribution of $N_{n}$', but I was wrong there. In fact Did made use of this in a very elegant way in his answer. In general, if $X$ is an rv taking values in the nonnegative integers, then: $$\mathbb{E}X=\sum_{k=0}^{\infty}P\left(X>k\right)$$ Applying that here leads to: $$\mathbb{E}N_{n}=\sum_{k=0}^{\infty}P\left(N_{n}>k\right)=\sum_{k=0}^{\infty}\left[1-\left(1-\left(\frac{5}{6}\right)^{k}\right)^{n}\right]$$
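As a sanity check on these formulas, the process itself is easy to simulate. The Monte Carlo sketch below (Python; the trial count and seed are arbitrary choices of mine) estimates $\mathbb{E}N_3$ empirically:

```python
import random

def simulate(n, trials=20000, seed=0):
    # Empirical mean of N_n: throw n dice, set aside the sixes, repeat.
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        remaining, throws = n, 0
        while remaining:
            # Each remaining die fails to show a six with probability 5/6.
            remaining = sum(rng.randrange(6) != 0 for _ in range(remaining))
            throws += 1
        total += throws
    return total / trials

# By the recursion above, m_3 = 10566/1001 ≈ 10.56; simulate(3) should land
# nearby, up to the usual Monte Carlo noise.
```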

drhab
  • Amazing @drhab. However can you clarify why this is true ? $$m_{n}=\mathbb{E}N_{n}=\sum_{y=0}^{n}\mathbb{E}\left(N_{n}\mid Y=y\right)P\left(Y=y\right)$$ - the last equality. – Shuzheng Apr 10 '14 at 14:35
  • If $X$ and $Y$ are rv's with a joint distribution and $Y$ only takes values in a countable set $C$, then $\mathbb{E}X=\sum_{c\in C}\mathbb{E}\left(X\mid Y=c\right)P\left(Y=c\right)$, where $\mathbb{E}\left(X\mid Y=c\right)$ denotes the expectation of $X$ under the condition $Y=c$. This falls under the 'chapter' of conditional expectation. Explaining it here goes too far, but there are lots of places where you can find more information about this issue. Start for instance at: http://en.wikipedia.org/wiki/Conditional_expectation. – drhab Apr 10 '14 at 16:15
  • Thanks so much @drhab. Btw, how could one find the joint distribution function of $X,Y$ with the information given here ? Another question: How does one prove $E(N_n \mid Y = y) = 1 + m_y$ without using "intuition" only ? – Shuzheng Apr 10 '14 at 20:11
  • In general the joint distribution of $X$ and $Y$ cannot be determined from this information. In the case mentioned in your question it would be quite a job to find the distribution of $N_n$, and it becomes even more complicated if you go for the joint distribution of $N_n$ and $Y$. Be satisfied with $EN_n$, I would say. Set $N_{n}=1+M_{n}$, where $M_{n}$ denotes the number of throws taking place after the first. Under the condition $Y=y$ we have the original situation with $y$ dice to be thrown. Under this condition $M_{n}$ and $N_{y}$ must have the same distribution. Then $EN_{n}=1+EM_{n}=1+m_{y}$. – drhab Apr 10 '14 at 21:43
  • Thanks once again @drhab, your answer and comments really clarified things for me. – Shuzheng Apr 11 '14 at 07:36
  • Don't you miss $y = 0$ in your sum formula ? You start from index $1$ ? – Shuzheng Apr 11 '14 at 08:11
  • I could do that because $m_0=0$ as was mentioned. – drhab Apr 11 '14 at 08:13
  • Yes but you still have $1 \cdot (1/6)^n$ corresponding to $y = 0$ then ? – Shuzheng Apr 11 '14 at 08:22
  • I get the formula, calculated by hand, $$m_n = \frac {\sum^{n-1}_{y=0} (m_y + 1) \binom {n} {y} 5^y} {6^n - 5^n}$$ – Shuzheng Apr 11 '14 at 08:28
  • You were right about missing the case $y=0$ after all. I repaired and added. The recursion expression is not changed however. Have a look. Also note that it gives $m_1=6$ as it should (geometric distribution). – drhab Apr 11 '14 at 09:40
  • "And this leads to:" - Now $m_n$ should be replaced by $(m_n + 1)$ ? You are not multiplying $1$ by the factors ? – Shuzheng Apr 11 '14 at 09:47
  • Ahh, thanks. Is my formula also correct in the comment above ? – Shuzheng Apr 11 '14 at 09:57
  • $\sum_{y=0}^{n}\binom{n}{y}(\frac{5}{6})^{y}(\frac{1}{6})^{n-y}=1$. Your formula is not correct, because the term $\binom{n}{y}5^{y}$ must be summed from $0$ to $n$ (not to $n-1$). – drhab Apr 11 '14 at 09:59
  • Yours gives for instance $m_1=0$ which is not correct. – drhab Apr 11 '14 at 10:12
  • Ah, I now get $$m_n= \frac {\left(\sum^{n-1}_{y=0}(m_y+1)\binom n y 5^y\right) + 5^n} {6^n-5^n}$$ – Shuzheng Apr 11 '14 at 10:19
  • and notice that $\sum_{y=0}^{n-1}\binom{n}{y}5^{y}=\sum_{y=0}^{n}\binom{n}{y}5^{y}-5^{n}=6^{n}-5^{n}$ leading exactly to my formula! – drhab Apr 11 '14 at 10:22
  • May I ask how I could find the distribution for $N_n$ ? How can I verify that $P(N_n < \infty) = 1$ ? I've tried to write $P(N_n \le k)$ for some small $k \in \mathbb N$ - again, how can I see that this sum equals $1$ ? (I'm curious). – Shuzheng Apr 25 '14 at 17:34
  • I have added something. The answer of @Did was very inspiring. – drhab Apr 25 '14 at 18:25