15

In Pitman's *Probability*, the tail sum formula for expectation is introduced for a discrete random variable $X$ taking values in $\{0, 1, 2, \dots\}$:

$$E(X) = \sum_{i=0}^\infty P(X > i).$$

  1. I wonder if there is a similar formula for a nonnegative continuous random variable $X$:

    $$E(X) = \int_0^\infty P(X > x) dx?$$

    If not in general, under what conditions does it hold, and how can it be proved?

    Here is my thought:

    If the cdf $F$ of $X$ is bijective, then $X$ has the same distribution as $F^{-1}(U)$ for a random variable $U$ uniformly distributed over $[0,1)$. So $$E(X) = \int_0^1 F^{-1}(u) du.$$

    To prove the tail sum formula, it suffices to prove $$\int_0^1 F^{-1}(u) du = \int_0^\infty P(X > x) dx.$$ But I am stuck here.

    What's more, is the condition that the cdf $F$ of $X$ is bijective really necessary for the tail sum formula to hold?

  2. Can the tail sum formula be generalized to a random variable that is not necessarily nonnegative?
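
As a sanity check on the conjectured identity, here is a minimal numerical sketch (Python with scipy; the Exponential(1) example is an arbitrary choice) comparing a Monte Carlo estimate of $E(X)$, the tail integral, and the quantile integral $\int_0^1 F^{-1}(u)\,du$:

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Exponential(1), for which E(X) = 1 exactly (an arbitrary test case).
X = stats.expon()

# Monte Carlo estimate of E(X).
samples = X.rvs(size=10**6, random_state=np.random.default_rng(0))
mc = samples.mean()

# Tail integral: integral over [0, inf) of P(X > x) = X.sf(x).
tail, _ = quad(X.sf, 0, np.inf)

# Quantile integral: integral over [0, 1) of F^{-1}(u) = X.ppf(u).
# The singularity of F^{-1} at u = 1 is integrable here.
quantile, _ = quad(X.ppf, 0, 1)

print(mc, tail, quantile)  # all three close to 1.0
```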

Thanks!

Tim
  • 47,382
  • 2
    I've seen it written as a Riemann-Stieltjes integral $\int_0^\infty x\,dF(x)$, where $F(x) = \Pr(X \le x)$ is the cumulative distribution function. This works regardless of whether you have a continuous distribution, a discrete distribution, a singular distribution (which is neither continuous nor discrete nor a mixture of the two), or a mixture of any or all of the three kinds. Only for continuous distributions is this the same as $\int_0^\infty x f(x)\,dx$, where $f(x)=F'(x)$ is the probability density function. – Michael Hardy Sep 12 '11 at 01:26
  • @Michael: Thanks! The tail sum formula is another way to compute the expectation defined as in your comment. – Tim Sep 12 '11 at 02:45
  • 1
    Similar question: http://stats.stackexchange.com/questions/18438/does-a-univariate-random-variables-mean-always-equal-the-integral-of-its-quanti/ – Henry Nov 26 '11 at 13:10
  • In case others stumble upon this question and are also using Pitman's textbook: the continuous case is covered in Exercise 9 of Section 4.5. – angryavian Aug 01 '16 at 23:45
  • https://math.stackexchange.com/q/172841/321264 – StubbornAtom Feb 09 '20 at 15:00

4 Answers

16

This is Fubini's theorem for nonnegative functions/sums. If $x$ is a nonnegative integer,

$$ x=\sum_{i=0}^{+\infty}[x\gt i]. $$

Likewise, if $x$ is a nonnegative real number,

$$ x=\int_{0}^{+\infty}[x\gt t]\mathrm dt,\qquad x=\int_{0}^{+\infty}[x\geqslant t]\mathrm dt. $$

Then one integrates both sides of the relevant identity with respect to the distribution $\mathrm P_X$ of $X$ and uses Fubini's theorem to interchange the summation/integral and the expectation.

For example, the second identity yields $$ \mathbb E(X)=\int_\Omega X\ \mathrm d\mathbb P=\int_\Omega \int_{0}^{+\infty}[X\gt t]\mathrm dt\ \mathrm d\mathbb P=\int_{0}^{+\infty}\int_\Omega [X\gt t]\mathrm d\mathbb P\ \mathrm dt $$ that is,

$$\mathbb E(X)=\int_{0}^{+\infty}\mathbb P(X\gt t)\ \mathrm dt. $$

Likewise, the first identity yields $$ \mathbb E(X)=\int_\Omega X\ \mathrm d\mathbb P=\int_\Omega\ \sum\limits_{i=0}^{+\infty}[X\gt i]\ \mathrm d\mathbb P=\sum\limits_{i=0}^{+\infty}\ \int_\Omega[X\gt i]\ \mathrm d\mathbb P $$ that is,

$$\mathbb E(X)=\sum\limits_{i=0}^{+\infty}\mathbb P(X\gt i). $$
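
As a sanity check, here is a minimal numerical sketch of both identities (Python with numpy/scipy; Poisson(3.5) and $x = 2.7$ are arbitrary test values):

```python
import numpy as np
from scipy import stats

# Discrete identity: E(X) = sum over i >= 0 of P(X > i), for Poisson(3.5).
X = stats.poisson(3.5)
tail_sum = sum(X.sf(i) for i in range(200))  # terms are negligible past i ~ 200
print(tail_sum)  # ~3.5 = E(X)

# Layer-cake identity: x = integral over t >= 0 of [x > t] dt,
# approximated by a Riemann sum of the indicator on a fine grid.
x = 2.7
t = np.arange(0, 10, 1e-5)
print(np.sum(x > t) * 1e-5)  # ~2.7
```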

Did
  • 279,727
  • What does the notation dP mean? I am not familiar with this definition of expectation. – Mong H. Ng Jan 27 '19 at 02:31
  • @MongH.Ng Really? This is rather surprising. How do you define the expectation of a random variable then? – Did Jan 27 '19 at 08:39
  • In my entire college career, I have only ever seen $E[X] = \int_{-\infty}^{\infty} x f(x)\,dx$. I am guessing they are equivalent, but I am not sure how to parse this. – Mong H. Ng Jan 27 '19 at 17:31
  • @MongH.Ng Well, so you cannot compute $E(X)$ when $X$ has no PDF? Anyway, start from your formula, note that $f(x)=-\frac d{dx}P(X\geqslant x)$ and $1=\frac d{dx}x$ hence, by an integration by parts, $$\int_0^\infty xf(x)dx=\left[-xP(X\geqslant x)\right]_0^\infty-\int_0^\infty1\cdot(-P(X\geqslant x))dx$$ hence you are done. – Did Jan 27 '19 at 19:28
  • @MongH.Ng https://en.wikipedia.org/wiki/Expected_value#Alternative_formula_for_expected_value – Did Jan 27 '19 at 19:30
10

This is integration by parts. Take $X\geq 0$, and call its distribution function $F$. Let $g$ be an increasing differentiable function with $g(0)=0$.

$$\begin{align*} \mathbb{E}[g(X)]&=\int_0^\infty g(t)\, dF(t) \\ &= \int_0^\infty -g(t)\, d(1-F(t)) \\ &= \left[-g(t)(1-F(t))\right]^\infty_0 - \int_0^\infty (1-F(t))\,d(-g(t)) \\ &= \int_0^\infty g'(t)\,\mathbb{P}[X>t]\,dt \end{align*}$$

This reduces to the formula you want when $g(x)=x$.

One way we could compute $\mathbb{E}[X]$ for general $X$ would be to compute $\mathbb{E}[X^+]$ and $\mathbb{E}[X^-]$ in this way and then take the difference, where $X^+=\max(X,0)$ and $X^-=\max(-X,0)$.
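
As a sketch of this decomposition (Python with scipy; the Normal(0.5, 2) test case is an arbitrary choice):

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# X ~ Normal(mean 0.5, sd 2): not nonnegative, with E[X] = 0.5.
X = stats.norm(loc=0.5, scale=2)

# E[X^+] = integral of P(X > t) dt over t >= 0.
e_plus, _ = quad(X.sf, 0, np.inf)
# E[X^-] = integral of P(X^- > t) dt = integral of P(X < -t) dt over t >= 0.
e_minus, _ = quad(lambda t: X.cdf(-t), 0, np.inf)

print(e_plus - e_minus)  # ~0.5 = E[X]
```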

Justification of integration by parts: The integration by parts works when $g(X)$ is integrable, since, by dominated convergence, $$\begin{align} \limsup_x g(x)\mathbb{P}[X>x] &\le \limsup_x\mathbb{E}[g(X)\mathbf{1}(X>x)]\\ &= \mathbb{E}[\limsup_x g(X)\mathbf{1}(X>x)]\\ &= 0. \end{align}$$

Ben Derrett
  • 4,592
  • Why is $g(+\infty)(1-F(+\infty))=0$? Ok, $1-F(+\infty)=0$ but what if $g(t)\to +\infty$? Does $g$ have to be bounded? Or does $1-F(t)$ converge so fast to $0$, that this works for any $g$? Thanks. – Jimmy R. Jan 04 '16 at 14:24
  • @Stef I've edited the answer in response to your question. – Ben Derrett Jan 04 '16 at 20:42
  • Ok, thanks. Yes, of course, if $g$ is integrable this works fine. – Jimmy R. Jan 04 '16 at 21:09
4

First we prove that for $X \geq 0$, $$EX= \int_{0}^{\infty}P(X > t)\,dt.$$ We apply Fubini's theorem, taking as our product measure the product of the distribution $\nu_X$ of $X$ and Lebesgue measure. So $$\int_{0}^{\infty}P(X>t)\,dt = \int_0^{\infty}\left( \int_{(t,\infty)} \nu_X(ds)\right)dt=\int_0^{\infty}\left( \int_{\mathbb{R}}\mathbf{1}_{[0,\infty)}(t)\,\mathbf{1}_{(t,\infty)}(s)\,\nu_X(ds)\right)dt$$ $$=\int_{0}^{\infty}\left(\int_0^s dt \right) \nu_X(ds)= \int_{0}^{\infty}s\, \nu_X(ds)=\int_{\mathbb{R}}s\,\nu_X(ds)=EX,$$ where the last equality uses that $\nu_X$ is supported on $[0,\infty)$.

But you want to have $EX=\int_{0}^{\infty}P(X \geq t)dt.$

Applying what we have already proved, it is enough to show that $\int_{0}^{\infty}P(X=t)dt=0$.
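
For a concrete illustration that the atoms of a discrete variable form a countable, hence Lebesgue-null, set, so that $P(X>t)$ and $P(X\geq t)$ integrate to the same value, here is a minimal numerical sketch (Python with scipy; Poisson(2) is an arbitrary choice):

```python
import numpy as np
from scipy import stats

# Poisson(2): atoms at 0, 1, 2, ..., and E(X) = 2.
X = stats.poisson(2)

# Midpoint Riemann sum on a grid that avoids the integer atoms;
# off the atoms, P(X > t) and P(X >= t) coincide.
dt = 1e-3
t = np.arange(0, 60, dt) + dt / 2
print(np.sum(X.sf(t)) * dt)  # ~2.0 = E(X), for either tail integral
```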

  • 1
    We can also show that for $X \geq 0$ and for an increasing differentiable function $g$, such that $g(0)=0$ we have $$E(g(X))= \int_{0}^{\infty} g^{\prime}(t) P(X>t) dt.$$ – Edvin Goey Sep 12 '11 at 01:16
  • Thanks! Don't worry about $EX=\int_{0}^{\infty}P(X \geq t)dt$. I misunderstood Pitman's book for the discrete case. – Tim Sep 12 '11 at 01:31
  • No problem, but $EX= \int_{0}^{\infty}P(X \geq t)dt$. I proved the main part for this, to get $\geq$ instead of $>$ it is sufficient to show that $\int_0^{\infty}P(X=t)dt=0$ which is obvious. – Edvin Goey Sep 12 '11 at 01:35
  • 3
    Edvin: This is obvious when using the right approach but maybe not so obvious to every MSE reader, hence I wonder why you skip the proof of this part... – Did Nov 25 '11 at 17:28
3

@Did gave a great answer, and while I'm thinking about it I want to record a similar answer using slightly different notation in the case where the cumulative distribution function $F$ is differentiable.

In this case, \begin{align} \mathbb E(X) &= \int_0^\infty x F'(x) \, dx \\ &= \int_0^\infty \int_0^\infty [x \geq t] \, dt \,F'(x) \, dx \\ &= \int_0^\infty \int_0^\infty [x \geq t] F'(x) \, dt \, dx \\ &= \int_0^\infty \int_0^\infty [x \geq t] F'(x) \, dx \, dt \quad \text{(Fubini's theorem)}\\ &= \int_0^\infty \int_t^\infty F'(x) \, dx \, dt \\ &= \int_0^\infty \big(1 - F(t)\big) \, dt. \end{align}

(The quantity $[x \geq t]$ is equal to $1$ if $x \geq t$ and $0$ otherwise.)
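
A symbolic sanity check of this chain of equalities for one concrete differentiable $F$ (sympy and the Exponential(1) distribution are arbitrary choices here):

```python
import sympy as sp

x, t = sp.symbols('x t', positive=True)

# Exponential(1): F(x) = 1 - exp(-x), differentiable on (0, inf).
F = 1 - sp.exp(-x)

lhs = sp.integrate(x * sp.diff(F, x), (x, 0, sp.oo))  # E(X) = int x F'(x) dx
rhs = sp.integrate(1 - F.subs(x, t), (t, 0, sp.oo))   # int (1 - F(t)) dt

print(lhs, rhs)  # both equal 1
```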

littleO
  • 51,938