3

Suppose a computer can generate either 0's or 1's, and I want to get two consecutive 1's. The computer generates 0's with probability $1/2$ and 1's with probability $1/2$. Let $K$ be the number of trials required to get two consecutive 1's. I need to find the moment generating function of $K$. I know that $M_K(t)=E[e^{tK}]$, and I'm thinking that if I can express $K$ in terms of each individual trial, I would be able to find the mgf. I know each individual trial is an independent Bernoulli. I think that if I call $X_i$ the outcome of each trial, then $K=X_1+X_2+\cdots+X_n$ where $n$ is the number of trials until the sequence 1, 1 occurs. Is this correct?

mmm
  • 1,849
  • You can think of the process as a Markov chain with three states (0, 1 and 11) where 11 is an absorbing state. The time $K$ is the time till absorption into 11. Just consider the fact that before reaching 11, what happens next depends on whether your current observation is 0 or 1; a sketch along these lines follows the comments. – passerby51 Oct 21 '17 at 17:11
  • The hint given above is fully correct, but note that by putting an absorbing barrier at "$11$", the relevant counting will provide the cumulative probability over $n$, that is, the probability that a string of length $n$ contains at least one appearance of the substring "$11$". – G Cab Oct 25 '17 at 21:26
  • I believe you meant the moment generating function of $K$? – Math1000 Oct 26 '17 at 01:31
  • Also, your last sentence is redundant and confusing because it defines $n$ as what $K$ already is (and the equality $K=X_1+X_2+\cdots+X_n$ does not hold). – Math1000 Oct 26 '17 at 01:35
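One way to flesh out the Markov-chain hint into the mgf itself is first-step analysis. The following is only a sketch; the symbols $p$ (probability of a 1), $q=1-p$, $\phi_0$, $\phi_1$ and $K_1$ are notation introduced here for illustration. Writing $\phi_0(t)=E[e^{tK}]$ for the mgf starting from scratch and $\phi_1(t)=E[e^{tK_1}]$ for the mgf of the remaining time $K_1$ when the last observation was a 1, conditioning on the next trial gives
$$
\phi_0(t) = e^t\bigl(q\,\phi_0(t) + p\,\phi_1(t)\bigr),
\qquad
\phi_1(t) = e^t\bigl(q\,\phi_0(t) + p\bigr).
$$
Solving this $2\times 2$ linear system yields
$$
M_K(t) = \phi_0(t) = \frac{p^2 e^{2t}}{1 - q e^t - pq\,e^{2t}},
$$
which for $p=q=\tfrac12$ becomes $M_K(t)=\dfrac{e^{2t}}{4-2e^t-e^{2t}}$, valid for $t$ small enough that the denominator stays positive. Differentiating at $t=0$ recovers $E[K]=\dfrac{1+p}{p^2}=6$.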

1 Answer

0

Let $\tau_0=\inf\{n>0: X_n=X_{n-1}=1\}$ be the number of trials until two consecutive 1's (the quantity we are interested in), and let $\tau_1$ be the number of additional trials needed when the most recent trial was a 1. Writing $p$ for the probability of a 1 (here $p=\tfrac12$) and conditioning on the next trial,
\begin{align}
\mathbb E[\tau_0] &= 1 + (1-p)\mathbb E[\tau_0] + p\,\mathbb E[\tau_1],\\
\mathbb E[\tau_1] &= 1 + (1-p)\mathbb E[\tau_0],
\end{align}
and hence
$$ \mathbb E[\tau_0] = \frac{1+p}{p^2},\qquad \mathbb E[\tau_1] = \frac1{p^2}. $$
With $p=\tfrac12$ this gives $\mathbb E[\tau_0]=6$.
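A minimal Monte Carlo sketch of this result (assuming $p=\tfrac12$ as in the question; the function name time_to_two_ones is introduced here just for illustration):

```python
import random

def time_to_two_ones(p=0.5):
    """Number of trials until two consecutive 1's appear; p is the probability of a 1."""
    count = 0
    prev = 0
    while True:
        count += 1
        x = 1 if random.random() < p else 0
        if x == 1 and prev == 1:
            return count
        prev = x

random.seed(0)
p = 0.5
n = 200_000
samples = [time_to_two_ones(p) for _ in range(n)]
print("simulated E[K] :", sum(samples) / n)  # close to 6 for p = 1/2
print("(1 + p) / p**2 :", (1 + p) / p**2)    # exact value: 6.0
```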

Math1000
  • 36,983