
I have seen this definition of the expectation of a nonnegative random variable in many places, but I do not quite understand how it is derived.

$$E[X]=\int_0^{\infty}P(X>t) dt $$

I know that $E[X]=\int_0^{\infty}x f(x)\, dx$.

This identity is used to prove the basic version of Markov's inequality.

  • The easiest way to get that equality though, which is likely shown on prior stackexchange pages, is Fubini–Tonelli [spelled out below the comments]: $$X = \int_{0}^{\infty} 1_{\{X >t\}}\,dt \implies E[X] = \int_0^{\infty} E[1_{\{X>t\}}]\,dt$$ where $1_{\{X>t\}}$ is an indicator function that is $1$ if $\{X>t\}$ occurs, and $0$ otherwise. – Michael Jun 19 '21 at 03:39
  • Another way to get there using your formula $E[X] = \int_0^{\infty} xf_X(x)dx$ is to use integration by parts with $u=-x, du=-dx$, $dv = -f_X(x)dx$, $v=1-F_X(x)$. – Michael Jun 19 '21 at 03:45
  • This is a consequence of Fubini's theorem. – AspiringMat Jun 19 '21 at 03:59
  • From now on, please see 'Related' on the right pane. – Kavi Rama Murthy Jun 19 '21 at 04:54
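
To spell out the Fubini–Tonelli route from the first comment (a sketch assuming $X \ge 0$ has a density $f$; the general case needs only nonnegativity):

$$\int_0^{\infty} P(X>t)\,dt = \int_0^{\infty}\!\int_t^{\infty} f(x)\,dx\,dt = \int_0^{\infty}\!\int_0^{x} f(x)\,dt\,dx = \int_0^{\infty} x f(x)\,dx = E[X],$$

where swapping the order of integration is justified by Tonelli's theorem because the integrand $f(x)\,1_{\{x>t\}}$ is nonnegative.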

0 Answers
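
As a quick numerical sanity check (not from the thread; the exponential distribution with rate $2$ is an arbitrary illustrative choice), both sides of the identity can be computed and compared:

    import numpy as np
    from scipy import integrate, stats

    # Illustrative check with X ~ Exponential(rate = 2), so E[X] = 1/2.
    dist = stats.expon(scale=0.5)

    # E[X] via the density: integral of x * f(x) over [0, infinity)
    lhs, _ = integrate.quad(lambda x: x * dist.pdf(x), 0, np.inf)

    # E[X] via tail probabilities: integral of P(X > t) over [0, infinity)
    rhs, _ = integrate.quad(dist.sf, 0, np.inf)

    print(lhs, rhs)  # both are 0.5 up to quadrature error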