2

I'm aware that this concerns the same question as "Find the Mean for Non-Negative Integer-Valued Random Variable" and "Expected value equals sum of probabilities".

My issue is with understanding, and unfortunately neither of those two helps my understanding.

Let $T$ be a non-negative integer-valued random variable with $E(T) < \infty$. Prove that $$E(T) = \sum_{k=1}^{\infty} P(T \geq k)$$

So my understanding of this question is shocking; in fact, it wasn't until I had read the above that I realised that $P(T \geq k) = \sum_{i=k}^{\infty}P(T=i)$ and why this makes sense (it had completely slipped my mind).

I've tried to follow one of the above answers, and I need to make sure my understanding of what's going on is concrete and correct.

From my understanding, the key to this is remembering that an infinite double series can be written as a single series, which is going to be the expectation. In all honesty, it's the reworking of the lower and upper limits that has me concerned.

We start with $$E(T) = \sum_{k=1}^{\infty} P(T \geq k) = \sum_{k=1}^{\infty}\sum_{i=k}^{\infty}P(T=i)= \sum_{k=0}^{\infty}\sum_{i=k}^{\infty}P(T=i)$$

This is just a straight substitution of the above identity plus a shift of the lower index by $1$, which, if I remember rightly, we can do because this is a convergent series. From reading the above, I believe the next step is

$$\sum_{k=0}^{\infty}\sum_{i=k}^{\infty}P(T=i) = \sum_{i=0}^{\infty}\sum_{k=0}^{i}P(T=i)$$ which looks like a rearrangement of the series. $i$ will tend to infinity anyway, so this looks like we're just taking particular values at a time in order to end up with $i$ inside the summation. But this is the step I'm unsure about, so any explanation would be great. From here it looks like

$$\sum_{i=0}^{\infty}\sum_{k=0}^{i}P(T=i) = \sum_{i=0}^{\infty} iP(T=i) = E[T]$$

which answers the set question but not my question as to why this is allowed.
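
For what it's worth, the final identity does check out numerically; my confusion is purely about why the index manipulations are allowed. Here is a quick sanity check I put together (the distribution below is just an arbitrary example of my own, not anything from the linked posts):

```python
# Numerically check E(T) = sum_{k>=1} P(T >= k) for an arbitrary example
# distribution supported on {0, 1, 2, 3, 4}.
pmf = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.25, 4: 0.15}  # example P(T = i), sums to 1

expectation = sum(i * p for i, p in pmf.items())

# Tail sum: sum over k >= 1 of P(T >= k). The support is finite, so the
# "infinite" sum truncates at the largest value in the support.
tail_sum = sum(
    sum(p for i, p in pmf.items() if i >= k)
    for k in range(1, max(pmf) + 1)
)

print(expectation, tail_sum)  # both give 2.15, up to floating-point rounding
```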

Again, any and all help would be great. Thanks in advance.

johnnyB
  • On your first line of equations, I don't think $\displaystyle \sum_{k=1}^{\infty}\sum_{i=k}^{\infty}P(T=i)= \sum_{k=0}^{\infty}\sum_{i=k}^{\infty}P(T=i)$ is correct. If you subtract the LHS from the RHS, you get $\displaystyle \sum_{i=0}^{\infty}P(T=i)$, which equals one, not zero. –  Oct 04 '17 at 17:07
  • Similarly, on your last line of equations, $\displaystyle \sum_{i=0}^{\infty}\sum_{k=0}^{i}P(T=i)$ would equal $\displaystyle \sum_{i=0}^{\infty}(i+1)P(T=i)$, not $\displaystyle \sum_{i=0}^{\infty}iP(T=i)$, because there are $i+1$ terms in the sum $\displaystyle \sum_{k=0}^{i}$. –  Oct 04 '17 at 17:13

4 Answers

2

$E(T)=\sum\limits_{k=1}^\infty kP(T=k)=\sum\limits_{k=1}^\infty\sum\limits_{i=1}^k P(T=k)=\sum\limits_{k=1}^\infty\sum\limits_{i=1}^\infty I(i,k)P(T=k)$

where $I(i,k)$ is $1$ if $i\leq k$ and $0$ otherwise. Since the summands are non-negative, we can "commute" the sums (justified, for non-negative terms, by Tonelli's theorem for double series) to get:

$\sum\limits_{i=1}^\infty \sum\limits_{k=1}^\infty I(i,k)P(T=k)=\sum\limits_{i=1}^\infty \sum\limits_{k=i}^\infty P(T=k)=\sum\limits_{i=1}^\infty P(T\geq i)$
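
If a concrete check helps, here is a small numerical sketch of the swap on a finite-support example (the distribution is an arbitrary assumption chosen only for the demo; in the finite case the swap is trivially legal, and Tonelli is what extends it to infinite sums):

```python
# Illustrate the double sum with the indicator I(i, k) = [i <= k] on an
# arbitrary example distribution with finite support.
pmf = {1: 0.4, 2: 0.35, 3: 0.25}   # example P(T = k)
N = max(pmf)

# E(T) written directly as sum_k k * P(T = k).
expectation = sum(k * p for k, p in pmf.items())

# Same quantity as the double sum, summing over i on the inside:
# for fixed k the indicator contributes k ones, recovering k * P(T = k).
k_outer = sum(sum(1 for i in range(1, N + 1) if i <= k) * pmf[k] for k in pmf)

# With the order swapped: for fixed i the indicator keeps k >= i,
# so the inner sum is P(T >= i).
i_outer = sum(sum(pmf[k] for k in pmf if k >= i) for i in range(1, N + 1))

print(expectation, k_outer, i_outer)  # all three agree (1.85), up to rounding
```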

Asinomás
2

In the step $$ \sum_{k=1}^{\infty}\sum_{i=k}^{\infty}P(T=i)= \sum_{k=0}^{\infty}\sum_{i=k}^{\infty}P(T=i) $$ the shifting of the lower index by $1$ is not allowed: it incorrectly adds an extra $1$ to the sum, so you won't end up with $E(T)$. If you don't shift the lower index, you can continue to rearrange the order of summation as before, and your final sum is $$ \sum_{i=1}^{\infty}\sum_{k=1}^{i}P(T=i) = \sum_{i=1}^{\infty} iP(T=i). $$ At this point you can shift the lower index to $i=0$, because doing so adds zero to the sum. The result is then $$\sum_{i=1}^{\infty} iP(T=i)=\sum_{i=0}^{\infty} iP(T=i)=E(T).$$
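
To see the off-by-one concretely, here is a quick numerical comparison (the distribution is an arbitrary example assumed just for illustration):

```python
# Compare the correct tail sum (k starting at 1) with the shifted one (k starting at 0)
# on an arbitrary example distribution supported on {0, ..., 4}.
pmf = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.25, 4: 0.15}  # example P(T = i)
N = max(pmf)

def tail(k):
    """P(T >= k) for this finite-support example."""
    return sum(p for i, p in pmf.items() if i >= k)

expectation = sum(i * p for i, p in pmf.items())
correct = sum(tail(k) for k in range(1, N + 1))   # sum_{k>=1} P(T >= k)
shifted = sum(tail(k) for k in range(0, N + 1))   # sum_{k>=0} P(T >= k)

print(expectation, correct, shifted)  # the shifted sum overshoots by exactly P(T >= 0) = 1
```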

grand_chat
2

Another explanation of this. Let $X$ be a non-negative integer-valued random variable, as in the question, and let us play a two-player game. I draw $X$, and then you ask me the following questions.

  1. Is $X \ge 1$? I say YES or NO.
  2. Is $X \ge 2$? I say YES or NO.
  3. Is $X \ge 3$? I say YES or NO.
  4. ...

    (and so on)

Note that if $X = 3$, I say YES 3 times. In general, if $X = i$, I say YES $i$ times. Thus we arrive at the identity $$X = I\{X \ge 1\} + I\{X \ge 2\} + I\{X \ge 3\} + \cdots , $$ where $I\{\cdot\}$ is the indicator function. Apply expectation to both sides (the expectation can be pushed through the infinite sum because the terms are non-negative, e.g. by monotone convergence). You get $$\mathbb{E}[X] = \mathbb{P}[X \ge 1] + \mathbb{P}[X \ge 2] + \mathbb{P}[X \ge 3] + \cdots.$$
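
A tiny simulation of this game (the particular random variable and sample size are arbitrary choices for the sketch):

```python
import random

# Simulate the question game: for each draw of X, the number of YES answers to
# "Is X >= k?" (k = 1, 2, 3, ...) is exactly X, so the average YES count and the
# summed empirical tail probabilities both estimate E[X].
random.seed(0)

def draw_x():
    """An arbitrary non-negative integer-valued X for the demo (a die roll minus 1)."""
    return random.randint(1, 6) - 1

n = 100_000
samples = [draw_x() for _ in range(n)]
max_x = max(samples)

# Count YES answers per draw (this is literally sum_k I{X >= k} = X).
avg_yes = sum(sum(1 for k in range(1, x + 1)) for x in samples) / n

# Sum of empirical tail probabilities P(X >= k).
tail_sum = sum(sum(1 for x in samples if x >= k) / n for k in range(1, max_x + 1))

print(avg_yes, tail_sum)  # both are close to E[X] = 2.5
```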

1

What really helps me is to see that

$$ E[X] = \sum_{k=1}^\infty P(X \geq k) = \sum_{k=1}^\infty [P(X = k) +P(X = k+1) + \cdots] = \sum_{k=1}^\infty\sum_{j=k}^\infty P(X = j). $$

Now, listing the values that $j$ can take for every value of $k$ (each horizontal row below is the inner summation for a fixed $k$, and each vertical column shows which terms contain a given $P(X=j)$):

$k=1$, $j$ = 1 2 3 4 ...

$k=2$, $j$ = x 2 3 4 ...

$k=3$, $j$ = x x 3 4 ...

so you can see how $P(X=1)$ appears once in the sum, $P(X=2)$ appears twice, and the generic $P(X=j)$ has to appear $j$ times, which justifies rewriting it as

$$ \sum_{k=1}^\infty\sum_{j=k}^\infty P(X = j)=\sum_{j=1}^\infty\sum_{k=1}^j P(X = j)=\sum_{j=1}^\infty j P(X = j) = E[X]. $$
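
To make the multiplicity count explicit, here is a short enumeration of the index pairs in the double sum (the cutoff $N$ is an arbitrary choice to keep the triangle finite):

```python
from collections import Counter

# Enumerate the index pairs (k, j) with 1 <= k <= j <= N appearing in the double sum
# and count how often each j occurs: P(X = j) should appear exactly j times.
N = 5  # arbitrary cutoff for the illustration

multiplicity = Counter(j for k in range(1, N + 1) for j in range(k, N + 1))

print(multiplicity)  # Counter({5: 5, 4: 4, 3: 3, 2: 2, 1: 1})
```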