14

I have a fair coin. What is the expected number of tosses to get three Heads in a row?

I have looked at similar past questions such as Expected Number of Coin Tosses to Get Five Consecutive Heads, but I find the proof there intuitive rather than rigorous: the use of the "recursive" element is not justified. The expectation $\mathbb E[X]$ is a number, not a random variable, yet it is treated there as if it were one. Please make this clear.

RandomGuy
  • 1,387
  • that is precisely one of those not-completely-clear kinds of proofs I was referring to. – RandomGuy Jun 25 '16 at 18:22
  • I understand how to apply the argument there to the case $n=3$; that part is trivial... :D What is unclear to me is the justification of this "taking the expectation recursively", and I want to see a rigorous justification of that step. It is not given in the solution you linked. – RandomGuy Jun 25 '16 at 18:26
  • I think you didn't understand my point – RandomGuy Jun 25 '16 at 18:31

2 Answers

38

Consider the outcomes $T,HT,HHT,HHH$. All but the last put you back to the start. So if $x$ is the expected number of tosses to get HHH, we have $$x=\frac{1}{2}(x+1)+\frac{1}{4}(x+2)+\frac{1}{8}(x+3)+\frac{1}{8}3\ \ (*)$$ That is easy to solve for $x$, giving $$x=14$$
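As a quick sanity check on that value (not part of the original argument), here is a minimal Python sketch that estimates the expectation by simulation; the function names and the number of trials are my own illustrative choices.

```python
import random

def tosses_until_hhh():
    """Toss a fair coin until three heads appear in a row; return the toss count."""
    run, count = 0, 0
    while run < 3:
        count += 1
        if random.random() < 0.5:   # heads: extend the current run
            run += 1
        else:                       # tails: the run resets
            run = 0
    return count

def expected_tosses_mc(trials=200_000):
    """Monte Carlo estimate of the expected number of tosses to get HHH."""
    return sum(tosses_until_hhh() for _ in range(trials)) / trials

print(expected_tosses_mc())  # typically prints a value close to 14
```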

---------- Added 26 June 2016 ----------

Now let us consider this solution more carefully. Note first that the events $T,HT,HHT,HHH$ are disjoint and exhaustive. They have probabilities $\frac{1}{2},\frac{1}{4},\frac{1}{8},\frac{1}{8}$ respectively.

Let $3H$ be the random variable "the number of tosses until the first run of three $H$ is achieved". Now $(*)$ states: $$E(3H)=E(3H|T)p(T)+E(3H|HT)p(HT)+E(3H|HHT)p(HHT)+E(3H|HHH)p(HHH)$$ This is sometimes known as "computing expectations by conditioning" (see, for example, Sheldon Ross, Introduction to Probability Models 3.4 p100). It is often written more concisely as $$E(E(X|Y))=E(X)$$ where the outer expectation on the LHS is $E_Y(\cdot)$.
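To spell out how the concise form specializes here (the labelling variable $Y$ below is my own notation, not part of the original answer): let $Y$ record which of the four outcomes $T,HT,HHT,HHH$ occurs first. Then $$E\big(E(3H\mid Y)\big)=\sum_{y\in\{T,HT,HHT,HHH\}}E(3H\mid Y=y)\,p(Y=y),$$ which is exactly the displayed sum above.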

---------- Added 5-2-2022 ----------

The expected number of tosses to obtain three consecutive heads given that the first toss is a tail equals one plus the expected number of tosses to obtain three consecutive heads (starting from that point). $$\mathsf E(3H\mid T) = 1+\mathsf E(3H)$$

This is the $(x+1)$ factor in the first term of $(*)$ above; the other terms are evaluated in the same way, as spelled out below.
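For completeness (this is only an expansion of the sentence above, in the same notation), the remaining conditional expectations are $$\mathsf E(3H\mid HT)=2+\mathsf E(3H),\qquad \mathsf E(3H\mid HHT)=3+\mathsf E(3H),\qquad \mathsf E(3H\mid HHH)=3.$$ Substituting these, together with the probabilities $\frac{1}{2},\frac{1}{4},\frac{1}{8},\frac{1}{8}$, into the conditioning identity and writing $x=\mathsf E(3H)$ recovers $(*)$.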


This identity follows directly from the definitions of conditional probability and expectation; the discrete case is particularly straightforward.

It just comes down to what is sometimes called the "partition theorem": if $\{B_n\}$ is a partition of the sample space, then $E(X)=\sum_nE(X|B_n)p(B_n)$. Note that we have $$E(X|Y=y)=\sum_xx\,p(X=x|Y=y)=\sum_xx\frac{p(X=x\cap Y=y)}{p(Y=y)}$$ where the last equality is just the definition $$p(A|B)=\frac{p(A\cap B)}{p(B)}$$ Having written all that, I see that Wikipedia calls it the "Law of total expectation" and has an excellent article on it.
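For completeness, the discrete computation behind the partition theorem can be written out in a single chain (assuming $X$ takes countably many values and all sums converge absolutely): $$\sum_nE(X|B_n)p(B_n)=\sum_n\sum_xx\,p(X=x|B_n)p(B_n)=\sum_xx\sum_np(\{X=x\}\cap B_n)=\sum_xx\,p(X=x)=E(X),$$ using in the last step that the $B_n$ are disjoint and exhaustive.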

Graham Kemp
  • 129,094
almagest
  • 18,380
  • 4
    I know this isn't quite related, but congrats on 10k reputation! – Noble Mushtak Jun 25 '16 at 18:24
  • 1
    Thanks! I am now absolutely mystified as to how people have managed to get to 200k in the few years MSE has been around! – almagest Jun 25 '16 at 18:25
  • Please read my comment above. Yes, I understand how to apply that method to this situation. The unclear thing is the justification of the recursive relation on the expectation; that is not provided in the link. – RandomGuy Jun 25 '16 at 18:28
  • I will try to explain myself better. I understand your argument, but it remains an intuitive explanation as long as you do not use the properties of the expectation which allow you to PROVE that you can indeed use the recursive equation. That is my point: intuitively what you say is clear, but it is not a rigorous proof until you show that it depends on the properties of the expectation of the random variable considered here. – RandomGuy Jun 25 '16 at 18:34
  • @RandomGuy How rigorous do you want it? You can take this answer and phrase it in terms of conditioning on the time of the first T (toss 1, 2, 3, or >3) and then apply the law of total probability. You will need to invoke the iid-ness of the RVs but I don't see why that is a problem: "it is not a rigorous proof as long as you show that it depends on the properties of the expectation of the random variable considered here." – snarfblaat Jun 25 '16 at 19:26
  • Thanks. So you see, my question was not trivial after all. EVERY STEP IN A PROOF MUST BE RIGOROUSLY JUSTIFIED. The point that I was making, and that many here did not understand, is that in the argument of the linked question you cannot use the expectation as if it were a random variable: the expectation is a number, not a function. So I was definitely right to say that the argument there was at the intuitive level, not the rigorous one. PS I also suspected that the conditional expectation was the key here. – RandomGuy Jun 26 '16 at 09:24
  • @RandomGuy Thanks. I would only add that "intuitive" is not really the correct term. Any working mathematician would regard my original argument as rigorous, because filling in the detailed steps is essentially trivial. Few people except logicians are interested in the details once they have seen them a few times. – almagest Jun 26 '16 at 10:38
  • @RandomGuy I guess I should add that one of the delights and drawbacks of maths is that things tend to seem impossibly difficult until they seem trivial :) – almagest Jun 26 '16 at 11:01
  • @almagest Absolutely not. The use of the conditional expectation is one thing, the expectation is another. Using one instead of the other is just a source of confusion. There is a theorem to use in order to make sure that the equation you wrote is correct. Mathematicians of not very high level typically do not understand the difference between a rigorous proof and an intuitive argument, and don't really understand why in Mathematics every single step must be justified. You should read what A. Grothendieck wrote on this topic. – RandomGuy Jun 26 '16 at 14:10
  • @RandomGuy Can you give me a reference for Grothendieck? I just read this TaoBlog – almagest Jul 04 '16 at 09:35
  • @almagest with great pleasure. There's tons of material about AG's mathematical methodology; the very best would be to read his own autobiography, Récoltes et Semailles, but it is hardly available. Can you read Italian? If so, in my opinion the best article about AG's immense progress in mathematical methodology is Luca Barbieri Viale's paper: http://users.unimi.it/barbieri/grothendieck.pdf . – RandomGuy Jul 04 '16 at 09:52
  • @almagest You can also find a lot of very good papers on the subject in the Grothendieck circle: http://www.grothendieckcircle.org/ : visit the section "Surveys and overviews of Grothendieck's work" and the papers of Deligne (student of AG, Fields Medal 1978) and Cartier. This is a very good website dedicated to the 20th Century's greatest mathematician. – RandomGuy Jul 04 '16 at 09:53
  • @RandomGuy Many thanks for the refs, which I will follow up. I am an admirer of Grothendieck, but I never learnt that approach whilst I was young enough to find it easy/ier ... I tend to stick to analysis and combinatorics these days :( BTW my Italian is limited - I am better at Latin and much prefer English! Must make an effort to get this Q reopened sometime. It is not as difficult as is made out ... – almagest Jul 04 '16 at 09:57
  • Grothendieck really "said something new" in Mathematics. His work is immensely important because, apparently for the first time, such a profound methodological approach to Mathematics bore so many great results. His importance is comparable to that of Einstein in modern Physics, but in some sense even greater, because he introduced genuinely new ways of thinking that up to then had been confined to the background of mathematicians' minds. – RandomGuy Jul 04 '16 at 10:04
  • Einstein is currently a menace. His gravity theory is wrong in the weak-field domain, but no one can make themselves believe it. Can you help with the ABC guy - I am finding his stuff really hard to follow. All the pros have apparently gone on strike! – almagest Jul 04 '16 at 10:47
  • @almagest sorry, I didn't understand your last comment. Who is the ABC guy? And what does "the pros have gone on strike" mean? – RandomGuy Jul 05 '16 at 09:36
  • The ABC conjecture is one of the most important problems in maths. It is hugely important. Shinichi Mochizuki has published a proof. It is roughly 5000 pages long. Almost everyone in the field has balked at reading it! I am outraged. These guys are supported on public funds, they have no right to run away pleading not enough theorem credit is in it for them. – almagest Jul 05 '16 at 14:39
  • 3
    Why is TT not considered? – Jaydev Jul 24 '17 at 01:22
  • @almagest Why is TT not considered? Why is that list exhaustive? Which space is it partitioning? – Kerry Sep 19 '17 at 19:22
  • @Jaydev. $\rm TT$ is the event of "throwing a tail, starting over and throwing a tail, and starting over again". It is a subset of "throwing a tail, and starting over". The partition is over the tosses before starting over, or the case of immediately terminating: $\{\mathrm{T\ldots HHH}\},\ \{\mathrm{HT\ldots HHH}\},\ \{\mathrm{HHT\ldots HHH}\},\ \{\mathrm{HHH}\}$ – Graham Kemp Feb 03 '22 at 02:58
22

Although the question has already been answered, I would like to offer a very similar solution, but with a different mindset behind it.

[State diagram: states $1,2,3,4$, where state $k$ represents having $k-1$ consecutive heads so far; from each of states $1,2,3$, a head moves to the next state and a tail returns to state $1$, each with probability $\frac{1}{2}$.]

It is a crude image, but it illustrates the answer above very nicely.

At the beginning, state $1$, no coins have been tossed, so we have no consecutive heads. Each toss comes up $H$ or $T$, each with probability $\frac{1}{2}$. Thus we move from state $1$ to the next state, $2$, with probability $\frac{1}{2}$, and similarly from $2$ to $3$ and from $3$ to $4$.

Let $g(x)$ be the expected time until we reach state $4$, $HHH$, from state $x\in\{1,2,3,4\}$. Obviously $g(4)=0$ since we are already at state $4$!

\begin{align} g(1)&=\frac{1}{2}(g(2)+g(1))+1\\ g(2)&=\frac{1}{2}(g(3)+g(1))+1\\ g(3)&=\frac{1}{2}(g(4)+g(1))+1\\ \end{align}

Whenever we move from one state to another, we take (or "waste") one step. However, we only take a step in the correct direction with probability $\frac{1}{2}$; otherwise, we fall back to state $1$ and need $g(1)$ again.

Thus, by substitution (or recursion), \begin{align} g(1)&=1+\frac{1}{2}g(1)+\frac{1}{2}\left(\frac{1}{2}g(3)+\frac{1}{2}g(1)+1\right)\\ &=1+\frac{1}{2}g(1)+\frac{1}{4}g(1)+\frac{1}{2}+\frac{1}{4}\left(\frac{1}{2}g(1)+1\right)\\ &=1+\frac{1}{2}+\frac{1}{4}+g(1)\left(\frac{1}{2}+\frac{1}{4}+\frac{1}{8}\right) \end{align}

so $\frac{1}{8}g(1)=1+\frac{1}{2}+\frac{1}{4}=\frac{7}{4}$, and therefore $$g(1)=14$$
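If one wants a quick machine check of the linear system (my own addition, not part of the answer), the three equations can be solved directly; here is a minimal sketch using NumPy, with $g(4)=0$ substituted in.

```python
import numpy as np

# Rearranged from g(k) = 1 + (1/2)*(g(k+1) + g(1)) for k = 1, 2, 3, with g(4) = 0:
# each row collects the coefficients of g(1), g(2), g(3) on the left-hand side.
A = np.array([
    [ 0.5, -0.5,  0.0],   # g(1) - 0.5*g(1) - 0.5*g(2) = 1
    [-0.5,  1.0, -0.5],   # g(2) - 0.5*g(1) - 0.5*g(3) = 1
    [-0.5,  0.0,  1.0],   # g(3) - 0.5*g(1) - 0.5*g(4) = 1, with g(4) = 0
])
b = np.array([1.0, 1.0, 1.0])

g1, g2, g3 = np.linalg.solve(A, b)
print(g1, g2, g3)   # 14.0 12.0 8.0
```

The value $g(1)=14$ agrees with both derivations above, and the solver also returns the intermediate values $g(2)=12$ and $g(3)=8$ implicit in the substitution step.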

Jake
  • 348