
For a geometric random variable $X$ with parameter $p$, $P_X(k) = (1-p)^{k-1}p$, so $$E[X] = \sum_{k=1}^{\infty}k(1-p)^{k-1}p\;.$$ That is (writing $q = 1-p$), $$E[X] = p + 2qp + 3q^2p + 4q^3p + \dotso$$

But $$P_{X|X>1}(k) = \begin{cases}(1-p)^{k-2}p&\text{if }k > 1\;,\\ 0&\text{if }k = 1\;,\end{cases}$$ so $$E[X|X>1] = \sum_{k=2}^{\infty}k(1-p)^{k-2}p\;,$$ that is, $$E[X|X>1] = 2p + 3qp + 4q^2p + \dotso$$

So how could $E[X|X>1] = E[X] + 1$?

Thanks.
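(A quick numerical sanity check of the claim in question; this is just a sketch, and the success probability $p = 0.3$ and sample size are arbitrary choices:)

```python
import random

random.seed(0)
p = 0.3  # arbitrary success probability for this check


def geometric(p):
    """Number of Bernoulli(p) trials up to and including the first success."""
    k = 1
    while random.random() >= p:
        k += 1
    return k


samples = [geometric(p) for _ in range(200_000)]
e_x = sum(samples) / len(samples)

conditioned = [x for x in samples if x > 1]
e_x_given = sum(conditioned) / len(conditioned)

print(f"E[X]     ~ {e_x:.3f}  (exact value: {1/p:.3f})")
print(f"E[X|X>1] ~ {e_x_given:.3f}  (should be close to E[X] + 1)")
```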

Jichao
  • I fixed some things in the formatting. Punctuation after a displayed equation should go inside the dollar signs; else it will appear on the beginning of the next line. And ellipsis (...) in math mode is best produced using \dotso (after/between operators) or \dotsc (after/between commas). – joriki Aug 30 '11 at 05:45
  • We have $E(X)=p+2qp+3q^2p+\cdots$. Also, $E(X|X>1)=2p+3qp+4q^2p+\cdots$. Both of these use your calculations. Find the second minus the first. You will get an infinite geometric series that has sum $1$. (There are more conceptual ways of seeing the answer, without any calculation.) – André Nicolas Aug 30 '11 at 05:51
  • This is the memoryless property of geometric (and exponential) random variables, and can be extended to $E[X|X>n] = E[X|X>0] + n$ for suitable $n$ – Henry Aug 30 '11 at 07:05
  • Like @Henry said. Furthermore, for every function $u$ and every nonnegative $n$, $E(u(X)\mid X>n)=E(u(X+n))$. – Did Sep 09 '11 at 13:35

3 Answers


Since $P_X(k)$ is normalized, we have

$$\sum_{k=1}^\infty(1-p)^{k-1}p=\sum_{k=2}^\infty(1-p)^{k-2}p=1\;.$$

Thus

$$ \begin{eqnarray} E[X|X>1] &=& \sum_{k=2}^{\infty}k(1-p)^{k-2}p \\ &=& \sum_{k=2}^{\infty}(k-1)(1-p)^{k-2}p+\sum_{k=2}^{\infty}(1-p)^{k-2}p \\ &=& \sum_{k=1}^{\infty}k(1-p)^{k-1}p+1 \\ &=& E[X] + 1 \end{eqnarray} $$

All this is really saying is that since the conditional probability for $k+1$ is the same as the unconditional probability for $k$, the conditional expectation value of $k$ must be the unconditional expectation value of $k+1$.
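(The pmf shift behind that last remark can be spot-checked with exact rational arithmetic; a small sketch, with the value $p = 1/3$ an arbitrary choice:)

```python
from fractions import Fraction

p = Fraction(1, 3)
q = 1 - p


def pmf(k):
    """Unconditional pmf: P_X(k) = q^(k-1) p."""
    return q ** (k - 1) * p


def pmf_given(k):
    """Conditional pmf: P_{X|X>1}(k) = q^(k-2) p for k > 1, else 0."""
    return q ** (k - 2) * p if k > 1 else Fraction(0)


# The conditional probability at k+1 equals the unconditional probability at k,
# which is exactly why E[X|X>1] equals the unconditional mean of X + 1.
assert all(pmf_given(k + 1) == pmf(k) for k in range(1, 100))
```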

joriki

A coin has probability $p$ of landing heads, and $q=1-p$ of landing tails. Assume that $p\ne 0$.

Let $X$ be the total number of tosses until you get a head. Then $X$ has precisely the geometric distribution that you described. One can, as you did, get an expression for $E(X)$ as an infinite series. In fact, it turns out $E(X)=1/p$. But we need neither the series nor its sum to prove the result that is asked for.

Suppose that we are given that $X>1$. This means that our first toss was a tail. Let $Y$ be the additional number of tosses that we must wait for a head. The coin does not remember that the first toss was a tail, so $Y$ has the same distribution, and therefore the same mean, as $X$. In symbols, $E(Y)=E(X)$.

But the total number of tosses, given that $X>1$, is $1+Y$. The $1$ is for the "wasted" first toss. Thus $$E(X|X>1)=E(1+Y)=1+E(Y)=1+E(X).$$

Comment: If you prove the result using the infinite series, you know that the result is true. If you do it more conceptually, you know why the result is true.

André Nicolas
  • Great answer, which in addition gives $E[X]$ without needing to sum a series, for we have that $E[X\mid X = 1] = 1$, $E[X \mid X > 1] = 1 + E[X]$, and since the events $\{X = 1\}$ and $\{X > 1\}$ have probabilities $p$ and $1-p$, we get $E[X] = p + (1 + E[X])(1-p) = 1 + (1-p)E[X]$, that is, $E[X] = 1/p$. – Dilip Sarwate Sep 24 '11 at 02:12
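(The first-step decomposition in the comment above can be checked against a long partial sum of the series; an illustrative sketch in exact rationals, where $p = 1/4$ is an arbitrary choice:)

```python
from fractions import Fraction

p = Fraction(1, 4)
q = 1 - p

# Solving E[X] = p*1 + (1-p)*(1 + E[X]) gives the closed form E[X] = 1/p.
expectation = 1 / p

# Compare with a long partial sum of E[X] = sum_{k>=1} k q^(k-1) p;
# the tail beyond k = 300 is astronomically small for q = 3/4.
partial = sum(k * q ** (k - 1) * p for k in range(1, 300))
assert abs(partial - expectation) < Fraction(1, 10**20)
```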

The "memorylessness" of the geometric distribution implies that the conditional probability distribution of $X$ given that $X\ge\text{any particular integer}$ is the same as the probability distribution of $X+\text{that same integer}$.

  • You could have a look at @Henry's comment to the main question. (Additionally, what you write is false since $X\ge1$ almost surely.) – Did Aug 30 '11 at 14:24
  • Alright, you're starting at 1 rather than 0. So $X+\text{that same integer} - 1$. – Michael Hardy Aug 30 '11 at 17:21
  • Me? I am not doing anything except reading what the OP wrote. (But what about the first sentence of my previous comment?) – Did Aug 30 '11 at 17:25
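(Henry's generalization from the comments under the question, $E[X\mid X>n] = E[X] + n$, can also be spot-checked by simulation; a hedged sketch, with $p = 0.4$ and the sample size arbitrary:)

```python
import random

random.seed(2)
p = 0.4  # arbitrary success probability for this check


def geometric(p):
    """Number of Bernoulli(p) trials up to and including the first success."""
    k = 1
    while random.random() >= p:
        k += 1
    return k


samples = [geometric(p) for _ in range(300_000)]
e_x = sum(samples) / len(samples)

for n in (1, 2, 3):
    tail = [x for x in samples if x > n]
    e_tail = sum(tail) / len(tail)
    # Memorylessness: the conditional mean should sit n above the unconditional one.
    print(n, round(e_tail - e_x, 3))  # each difference should be close to n
```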