
How can I prove that $$E(X) =\int_{0}^{\infty} [1 - F_X(x)]\, dx$$ holds for a continuous random variable $X$, and that for a discrete random variable $X$ the following holds:

$$E(X) = \sum_{k=0}^{\infty} [1 - F_X(k)]$$

My idea is to use the fact that $$E(X) = \int_{-\infty}^{\infty} x f_X(x)\, dx, $$ but how can I continue from there? Do I need to "break the integral" into two?

2 Answers


Hint

You need $X$ positive almost surely. For the continuous case, you have that $$1-F_X(x)=1-\mathbb P\{X\leq x\}=\mathbb P\{X>x\}=\int_x^\infty f_X(y)\,\mathrm d y.$$

Then Fubini's theorem (Tonelli, since the integrand is nonnegative) allows you to conclude. The discrete case works in exactly the same way, with sums in place of integrals.
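Writing out the hinted computation for nonnegative continuous $X$ (a sketch; the indicator trick is the whole point):

```latex
\begin{align*}
\int_0^\infty [1 - F_X(x)]\,\mathrm dx
  &= \int_0^\infty \int_x^\infty f_X(y)\,\mathrm dy\,\mathrm dx
   = \int_0^\infty \int_0^\infty \mathbf 1_{\{x < y\}}\, f_X(y)\,\mathrm dy\,\mathrm dx \\
  &= \int_0^\infty f_X(y) \left( \int_0^\infty \mathbf 1_{\{x < y\}}\,\mathrm dx \right) \mathrm dy
     \qquad \text{(Tonelli: the integrand is nonnegative)} \\
  &= \int_0^\infty y\, f_X(y)\,\mathrm dy = E(X).
\end{align*}
```

For discrete nonnegative integer-valued $X$, the same swap applies to the double sum: $\sum_{k=0}^\infty \mathbb P\{X>k\} = \sum_{k=0}^\infty \sum_{j=k+1}^\infty \mathbb P\{X=j\} = \sum_{j=1}^\infty j\,\mathbb P\{X=j\} = E(X)$, since for each fixed $j$ the index $k$ ranges over $0,\dots,j-1$, contributing $j$ copies of $\mathbb P\{X=j\}$.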

Surb

The first equality is true for nonnegative $X$. More generally, for any integrable $X$ one has $$ EX = \int _0^\infty \mathbb P\{X>x\}dx - \int _{-\infty}^0 \mathbb P\{X\leqslant x\}dx. $$
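Not part of the proof, but a quick numerical sanity check of the two-tail formula. The example distribution is my own choice: $X = Y - 1$ with $Y \sim \mathrm{Exp}(1)$, so $X$ takes negative values and $E(X) = E(Y) - 1 = 0$; both tail integrals then equal $e^{-1}$ and should cancel.

```python
import math

# X = Y - 1 with Y ~ Exp(1), so E[X] = E[Y] - 1 = 0 (example choice).

def tail(x):
    # P(X > x) for x >= 0: P(Y > x + 1) = exp(-(x + 1))
    return math.exp(-(x + 1.0))

def lower(x):
    # P(X <= x) for x < 0: P(Y <= x + 1), which vanishes below x = -1
    return max(0.0, 1.0 - math.exp(-(x + 1.0)))

def integrate(f, a, b, n=100_000):
    # simple midpoint rule
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

pos = integrate(tail, 0.0, 50.0)    # upper tail, truncated at 50 (error ~ e^-51)
neg = integrate(lower, -1.0, 0.0)   # lower tail; integrand is 0 below -1
ex = pos - neg
print("E[X] =", round(ex, 6))       # should be close to the true value 0
```

Both one-sided integrals come out near $e^{-1} \approx 0.3679$ and their difference near $0$, matching $EX = \int_0^\infty \mathbb P\{X>x\}\,dx - \int_{-\infty}^0 \mathbb P\{X\leqslant x\}\,dx$.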


Proof for $X\geqslant 0$ a.s. using the standard machine:

  1. Prove equality for nonnegative simple functions.
  2. Recall that any nonnegative measurable function is a pointwise monotone limit of simple functions.
  3. Apply monotone convergence theorem.

If $X\in\mathcal L_1$, write $X=X^+-X^-$ and apply the nonnegative case to each part.
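Step 1, written out for a nonnegative simple random variable (a sketch; here $X=\sum_{i=1}^n a_i\,\mathbf 1_{A_i}$ with $a_i\geqslant 0$ and the $A_i$ disjoint):

```latex
\begin{align*}
\int_0^\infty \mathbb P\{X > x\}\,\mathrm dx
  &= \int_0^\infty \sum_{i=1}^n \mathbb P(A_i)\,\mathbf 1_{\{x < a_i\}}\,\mathrm dx \\
  &= \sum_{i=1}^n \mathbb P(A_i) \int_0^{a_i} \mathrm dx
   = \sum_{i=1}^n a_i\,\mathbb P(A_i) = EX.
\end{align*}
```

Steps 2 and 3 then upgrade this to all nonnegative $X$: take simple $X_n \uparrow X$ pointwise and apply monotone convergence on both sides.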

AlvinL