
I know this question has been asked here several times before, but I think I have a slightly different take on it, which is why I wanted some help.

Let me first state a few numbers here:

E(coin tosses until $HTH$ appears) = $10$

E(coin tosses until $HHT$ appears) = $8$

E(coin tosses until $HT$ appears) = $4$

E(coin tosses until $HH$ appears) = $6$

E(coin tosses until $H$ appears) = $2$

E(coin tosses until $T$ appears) = $2$
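These numbers are easy to sanity-check by simulation. Here is a quick Monte Carlo sketch for a fair coin (the function name is my own):

```python
import random

def average_wait(pattern, trials=100_000, seed=0):
    """Estimate E[tosses until `pattern` first appears] by simulation."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        window = ""          # last len(pattern) tosses seen so far
        tosses = 0
        while window != pattern:
            window = (window + rng.choice("HT"))[-len(pattern):]
            tosses += 1
        total += tosses
    return total / trials

for p, exact in [("HTH", 10), ("HHT", 8), ("HT", 4), ("HH", 6), ("H", 2), ("T", 2)]:
    print(f"{p}: simulated {average_wait(p):.2f}, exact {exact}")
```

The simulated averages agree with the exact values above to within sampling error.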

What I have so far: given any such pattern, I can construct a Markov chain whose absorbing state corresponds to the pattern sought. The expected hitting time of the absorbing state then gives me the expected number of coin tosses.
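For reference, that Markov-chain computation can be sketched as follows: the transient states are the proper prefixes of the pattern, each toss moves to the longest suffix of the current string that is still a prefix of the pattern, and the expected absorption times solve a small linear system. (The helper names are my own; exact arithmetic via `Fraction` keeps the answers integral.)

```python
from fractions import Fraction

def expected_wait(pattern):
    """Expected fair-coin tosses until `pattern` appears, via the
    absorbing Markov chain whose states are proper prefixes of it."""
    states = [pattern[:i] for i in range(len(pattern))]  # proper prefixes

    def step(state, toss):
        s = state + toss
        while not pattern.startswith(s):
            s = s[1:]            # longest suffix that is still a prefix
        return s

    n = len(states)
    idx = {s: i for i, s in enumerate(states)}
    half = Fraction(1, 2)
    # Build (I - Q) e = 1, where e[i] = expected tosses from state i.
    A = [[Fraction(0)] * n for _ in range(n)]
    b = [Fraction(1) for _ in range(n)]
    for i, s in enumerate(states):
        A[i][i] += 1
        for toss in "HT":
            t = step(s, toss)
            if t != pattern:     # the absorbing state contributes 0
                A[i][idx[t]] -= half
    # Gauss-Jordan elimination in exact rational arithmetic.
    for col in range(n):
        piv = next(r for r in range(col, n) if A[r][col] != 0)
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        inv = 1 / A[col][col]
        A[col] = [x * inv for x in A[col]]
        b[col] *= inv
        for r in range(n):
            if r != col and A[r][col] != 0:
                f = A[r][col]
                A[r] = [x - f * y for x, y in zip(A[r], A[col])]
                b[r] -= f * b[col]
    return b[idx[""]]            # expected wait from the empty prefix

print(expected_wait("HTH"), expected_wait("HHT"))
```

This reproduces the table above exactly, e.g. `expected_wait("HTH")` gives $10$ and `expected_wait("HHT")` gives $8$.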

But I wondered whether I could build up these expectations recursively. More precisely, can I express the expectation for a pattern of length $n$ in terms of the expectations for patterns of length $k$ $(k < n)$?

So to attack this problem, I first thought about how to express $HTH$ in terms of $HT + H$. I don't know how to present the following argument formally, but here is a rough sketch:

In the first 4 tosses, I expect to see an $HT$. If the toss following $HT$ is an $H$, then I have won ($HTH$); but I could fail the first time and get a $T$.

In the next 4 coin tosses, I again expect to see an $HT$, and this time I expect an $H$ to follow it.

Hence I expect to see $HTH$ in 10 coin tosses.
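This vague argument can be made precise by conditioning on the toss that follows the first $HT$ (a sketch, using the values listed above). Let $x = E(\text{tosses until } HTH)$. The first $HT$ takes $4$ tosses in expectation, and the next toss is $H$ with probability $\frac12$ (done) or $T$ with probability $\frac12$, in which case the last two tosses are $TT$, no prefix of $HTH$ survives, and the wait starts from scratch:

$$x = 4 + 1 + \frac12 \cdot 0 + \frac12\,x \implies x = 10.$$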

Now, if I apply the same kind of reasoning to the $HHT$ case, I end up with something like this:

I expect to see $HH$ in the first 6 coin tosses, and the first time around I could fail to see a following $T$. So in the next 6 tosses I would again see an $HH$, and this time I would expect to win: a total of 14 coin tosses.

But that line of reasoning is clearly wrong, since the table above gives $8$.
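One way to see why 14 overcounts (a sketch of the correct conditioning, to show where the analogy breaks): failing after $HH$ does not send us back to the start. If the toss after $HH$ is an $H$, the string still ends in $HH$, so the progress is never lost, and after the first $HH$ the number of additional tosses until the first $T$ is geometric with mean $2$. Writing $y = E(\text{tosses until } HHT)$,

$$y = E(\text{tosses until } HH) + E(\text{additional tosses until } T) = 6 + 2 = 8.$$

Contrast this with $HTH$: failing there produces $HTT$, which destroys all progress, so a full restart is the right picture.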

Is there some way I can condition on previous tosses and get the required expectation?

kishlaya
  • " So in the next 6 tosses I would again see a HH and this time I would expect to win." This line of reasoning is incorrect. We are no more likely to "win" with the second occurrence of HH than with the first occurrence. – Math1000 Oct 11 '19 at 14:48
  • @Math1000 Why is that so? – kishlaya Oct 11 '19 at 17:47
  • Because the coin tosses are independent. Every time we see HH, there is the same chance of seeing T in the next toss. – Math1000 Oct 11 '19 at 17:48
  • @Math1000 Agreed. But on average, wouldn't I expect to see an H the next time, if a T occurred the first time? – kishlaya Oct 13 '19 at 05:36
  • No, because the coin tosses are independent. – Math1000 Oct 13 '19 at 16:14
  • Ahh ok. But otherwise, do you have any suggestions for the main question i.e. if I can compute these expectations in a better way (other than using a new Markov model every time)? – kishlaya Oct 13 '19 at 17:11
