
I came across the following question:

If $ X $ is a continuous random variable $ (-\infty<X<\infty) $ having distribution function $ F(x) $, show that $$ E(X)=\int_{0}^{\infty}[1-F(x)-F(-x)]\,\mathrm{d}x, $$ provided $ x[1-F(x)-F(-x)]\to0 $ as $ x\to\infty $.

I went through a similar problem posted here, but cannot seem to generalise the result as required. Any hint will be appreciated.

1 Answer


The key is to write $X = X^+ - X^-$, where $X^+ = \max(X, 0)$ and $X^- = \max(-X, 0)$. Notice that for $x > 0$ we have $P(X^+ > x) = P(X > x)$ and $P(X^- > x) = P(X < -x)$. Now apply the formula from the linked answer separately to $X^+$ and $X^-$ to get the required expression. Note that the condition $x[1-F(x)-F(-x)] \to 0$ as $x\to \infty$ is not needed for this approach.
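Spelling out that hint: since $F$ is continuous, $P(X > x) = 1 - F(x)$ and $P(X < -x) = F(-x)$, so applying the tail-integral formula $E(Y) = \int_0^\infty P(Y > x)\,\mathrm{d}x$ to each nonnegative part gives
$$\begin{aligned}
E(X) &= E(X^+) - E(X^-) \\
&= \int_0^\infty P(X^+ > x)\,\mathrm{d}x - \int_0^\infty P(X^- > x)\,\mathrm{d}x \\
&= \int_0^\infty [1 - F(x)]\,\mathrm{d}x - \int_0^\infty F(-x)\,\mathrm{d}x \\
&= \int_0^\infty [1 - F(x) - F(-x)]\,\mathrm{d}x.
\end{aligned}$$
(Splitting the difference of integrals into a single integral is justified when $E|X| < \infty$, i.e. when both tail integrals are finite.)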

Alternatively, if you assume $X$ has a density, you can write $$\int_0^\infty (1-F(x)-F(-x))\,dx = \int_0^\infty x'(1-F(x)-F(-x))\,dx$$ and integrate by parts to get $E(X)$, but then you do need the additional condition.
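Sketching that computation: with density $f$, we have $\frac{d}{dx}\,[1-F(x)-F(-x)] = -f(x)+f(-x)$, so integrating by parts with $u = x$ gives
$$\int_0^\infty (1-F(x)-F(-x))\,dx = \Big[x\,(1-F(x)-F(-x))\Big]_0^\infty + \int_0^\infty x\,\big(f(x)-f(-x)\big)\,dx.$$
The boundary term vanishes precisely by the stated condition $x[1-F(x)-F(-x)] \to 0$. Substituting $y = -x$ in $\int_0^\infty x f(-x)\,dx$ turns it into $-\int_{-\infty}^0 y f(y)\,dy$, so the remaining integral equals $\int_{-\infty}^\infty y f(y)\,dy = E(X)$.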