
Suppose $G$ is an increasing, right-continuous function such that $G(x)\ge 0$ for all $x\ge 0$ and $G(0)=0$. If $X$ is an arbitrary non-negative random variable, then I am trying to show that

$$E\left[G(X)\right]=\int_0^\infty P(X\ge t)\,dG(t) \tag{1}$$

Since $G(X)\ge 0$ almost surely, comparing with this popular question, I think I should have

$$E\left[G(X)\right]=\int_0^\infty P(G(X)>t)\,dt \tag{2}$$

But does $(2)$ reduce to $(1)$ from a change of variables?

Starting from scratch, if $F$ is the distribution function of $X$, then I have $$E\left[G(X)\right]=\int_0^\infty G(x)\,dF(x)=\int_0^\infty \left\{\int_0^{G(x)}\,dy\right\}dF(x)$$

Here I thought of using Fubini's theorem but I am not sure how to proceed with that.
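Sketching that swap (Tonelli applies, since everything in sight is non-negative), this $dy$-version only leads back to $(2)$, not $(1)$:

$$\int_0^\infty \left\{\int_0^{G(x)}\,dy\right\}dF(x)=\int_0^\infty\int_0^\infty \mathbf{1}_{\{y<G(x)\}}\,dF(x)\,dy=\int_0^\infty P\left(G(X)>y\right)\,dy$$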

Does $G^{-1}$ exist? If $G^{-}$ is some sort of generalized inverse of $G$, then I could perhaps say $$0< y< G(x),0<x<\infty\implies G^{-}(y)< x<\infty,0<y<G(\infty)$$

Using integration by parts I think it is true that $$\int_0^\infty G(x)\,dF(x)=\int_0^\infty P(X\ge t)\,dG(t)$$

But then I am not sure how the conditions on $G$ come into play. Any hint would be great.
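For what it's worth, identity $(1)$ can at least be sanity-checked numerically. A minimal sketch, where $X\sim\mathrm{Exp}(1)$ and $G(x)=x^2$ are my own illustrative choices (not from the question above), so both sides should be close to $E[X^2]=2$:

```python
import math
import random

random.seed(0)

# G(x) = x^2 is increasing, right-continuous, with G(0) = 0.
G = lambda x: x * x

# Left side of (1): Monte Carlo estimate of E[G(X)] for X ~ Exp(1).
n = 200_000
lhs = sum(G(random.expovariate(1.0)) for _ in range(n)) / n

# Right side of (1): Riemann-Stieltjes sum of P(X >= t) dG(t),
# using P(X >= t) = e^{-t} and increments G((k+1)dt) - G(k dt).
dt, T = 1e-3, 30.0
rhs = sum(math.exp(-k * dt) * (G((k + 1) * dt) - G(k * dt))
          for k in range(int(T / dt)))

print(lhs, rhs)  # both should be close to 2
```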

StubbornAtom

1 Answer


This is a simple application of the Fubini/Tonelli theorem:
$$\int_0^{\infty} P(X \geq t)\,dG(t)=\int_0^{\infty}\int I_{\{t \leq X\}}\, dP\,dG(t)= \int \int_0^{\infty} I_{\{t \leq X\}}\, dG(t)\,dP=\int G(X)\, dP=E\left[G(X)\right].$$

  • Okay I should have used $dG(t)$ instead of $dy$ as my inner integral. I can see that $G\ge 0$ is used. But does right-continuity and increasing nature of $G$ come into play here? – StubbornAtom Nov 27 '20 at 09:56
  • 1
    There is a unique measure $\mu$ such that $\mu ([0,x])=G(x)$ for all $x \geq 0$ and $\int fdG$ is interpreted as integration with respect to $\mu$. You need right continuity and non-negativity to construct $\mu$. @StubbornAtom – Kavi Rama Murthy Nov 27 '20 at 10:00
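To make the role of the measure $\mu$ concrete, here is a hedged numerical sketch with a pure-jump $G$ (the unit jump at $t=1$ and $X\sim\mathrm{Exp}(1)$ are my own illustrative choices): $\mu$ is then a point mass at $1$, and $(1)$ reduces to $E[G(X)]=P(X\ge 1)=e^{-1}$.

```python
import math
import random

random.seed(1)

# A right-continuous G with a single unit jump at t = 1:
# G(x) = 0 for x < 1 and G(x) = 1 for x >= 1, so the measure mu
# with mu([0, x]) = G(x) is a unit point mass at 1.
G = lambda x: 1.0 if x >= 1.0 else 0.0

# E[G(X)] = P(X >= 1) for X ~ Exp(1), estimated by Monte Carlo.
n = 200_000
lhs = sum(G(random.expovariate(1.0)) for _ in range(n)) / n

# The Stieltjes integral collapses onto the jump:
# int_0^inf P(X >= t) dG(t) = P(X >= 1) * (G(1) - G(1-)) = e^{-1}.
rhs = math.exp(-1.0)

print(lhs, rhs)  # both close to e^{-1}
```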