
I'm reading Chapter 2 of Introduction to Stochastic Processes by Erhan Çinlar and I cannot understand the proof of Theorem 1.9. Before stating the theorem, the book gives the following definition:

Def. 1.2: The expected value of a discrete random variable $X$ taking values in the set $E \subset \mathbb{R}_+$ is

$$E[X] = \sum_{a\in E}aP\{X=a\}$$
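(Just to check that I'm reading the definition correctly — this example is mine, not the book's: if $X$ is uniform on $\{1,2,3\}$, then

$$E[X] = 1\cdot\tfrac{1}{3} + 2\cdot\tfrac{1}{3} + 3\cdot\tfrac{1}{3} = 2.)$$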

Theorem 1.9 states that for any non-negative random variable $X$,

$$E[X] = \int_0^\infty P\{X > t\}dt$$
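(A small sanity check of my own, again not from the book: for the uniform $X$ on $\{1,2,3\}$ above, $P\{X>t\}$ equals $1$ on $[0,1)$, $\tfrac{2}{3}$ on $[1,2)$, $\tfrac{1}{3}$ on $[2,3)$, and $0$ for $t\ge 3$, so

$$\int_0^\infty P\{X>t\}\,dt = 1 + \tfrac{2}{3} + \tfrac{1}{3} = 2 = E[X],$$

which agrees with the theorem.)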

Proof: First suppose $X$ is discrete with values in $E$. Then using Definition (1.2) and changing the order of the summation and integration, we get

$$E[X]=\sum_{a\in E}a\,P\{X=a\}$$
$$=\sum_{a\in E}\int_0^a dt\,P\{X=a\}$$
$$=\int_0^\infty dt\sum_{a>t}P\{X=a\} = \int_0^\infty P\{X>t\}\,dt$$

I just can't understand the third line. Why are we able to switch the summation sign and the integral sign? Why is the third line equivalent to the second line?
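For what it's worth, here is as far as I got on my own (the indicator notation $\mathbf{1}\{t<a\}$ is mine, not the book's). Since $\int_0^a dt = \int_0^\infty \mathbf{1}\{t<a\}\,dt$ for $a \ge 0$, the second line can be written as

$$\sum_{a\in E}\int_0^\infty \mathbf{1}\{t<a\}\,P\{X=a\}\,dt,$$

and the step I can't justify is

$$\sum_{a\in E}\int_0^\infty \mathbf{1}\{t<a\}\,P\{X=a\}\,dt \overset{?}{=} \int_0^\infty \sum_{a\in E}\mathbf{1}\{t<a\}\,P\{X=a\}\,dt = \int_0^\infty P\{X>t\}\,dt.$$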

  • The author doesn’t appear to justify it themselves, but there are common “regularity” theorems that justify interchanging summation and integration in many situations. Here is a question asked here about when the two can be interchanged: https://math.stackexchange.com/questions/83721/when-can-a-sum-and-integral-be-interchanged – Nap D. Lover Nov 02 '19 at 22:22
  • Thank you so much! – KureKotake Nov 02 '19 at 22:46
  • I agree that the author does not explain this proof very well. This question has been asked many times on math.stackexchange, though, so I'm sure a quick search would turn up a better proof. – Math1000 Nov 03 '19 at 00:15

0 Answers