Suppose $G$ is an increasing, right-continuous function with $G(x)\ge 0$ for all $x\ge 0$ and $G(0)=0$, and let $X$ be an arbitrary non-negative random variable. I am trying to show that
$$E\left[G(X)\right]=\int_0^\infty P(X\ge t)\,dG(t) \tag{1}$$
Since $G(X)\ge 0$ almost surely, comparing with this popular question, I think I should have
$$E\left[G(X)\right]=\int_0^\infty P(G(X)>t)\,dt \tag{2}$$
But does $(2)$ reduce to $(1)$ from a change of variables?
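At least when $G$ is continuous and strictly increasing, the formal substitution $t=G(s)$ seems to give
$$\int_0^\infty P(G(X)>t)\,dt=\int_0^\infty P(G(X)>G(s))\,dG(s)=\int_0^\infty P(X>s)\,dG(s),$$
and since $G$ is continuous, $dG$ puts no mass on the (at most countably many) atoms of $X$, so $P(X>s)$ and $P(X\ge s)$ integrate the same and this would be $(1)$. But I do not see how to justify the substitution when $G$ has jumps or flat stretches.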
Starting from scratch, if $F$ is the distribution function of $X$, then I have $$E\left[G(X)\right]=\int_0^\infty G(x)\,dF(x)=\int_0^\infty \left\{\int_0^{G(x)}\,dy\right\}dF(x)$$
Here I thought of using Fubini's theorem, but I am not sure how to proceed with that.
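Writing the inner integral with an indicator, everything is non-negative, so if Tonelli's theorem applies I would get
$$\int_0^\infty\left\{\int_0^{G(x)}\,dy\right\}dF(x)=\int_0^\infty\int_0^\infty \mathbf 1\{y<G(x)\}\,dy\,dF(x)=\int_0^\infty P(G(X)>y)\,dy,$$
which recovers $(2)$, but still not $(1)$.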
Does $G^{-1}$ exist? If $G^{-}$ is some sort of generalized inverse of $G$, then I could perhaps say $$0<y<G(x),\;0<x<\infty\implies G^{-}(y)\le x<\infty,\;0<y<G(\infty)$$
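For concreteness, I suppose one could take
$$G^{-}(y)=\inf\{x\ge 0: G(x)\ge y\},$$
for which right-continuity of $G$ gives the equivalence $G(x)\ge y\iff x\ge G^{-}(y)$.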
Using integration by parts, I think it is true that $$\int_0^\infty G(x)\,dF(x)=\int_0^\infty P(X\ge t)\,dG(t)$$
But then I am not sure how the conditions on $G$ come into play. Any hint would be great.
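Edit: as a sanity check, here is a quick Monte Carlo comparison of the two sides of $(1)$ in the arbitrarily chosen smooth case $G(t)=t^2$, $X\sim\operatorname{Exp}(1)$, where both sides equal $2$ in closed form (just my own scratch code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Left side: Monte Carlo estimate of E[G(X)] with G(t) = t^2, X ~ Exp(1).
# The exact value is E[X^2] = 2.
x = rng.exponential(scale=1.0, size=10**6)
lhs = np.mean(x ** 2)

# Right side: integral_0^inf P(X >= t) dG(t) = integral_0^inf e^{-t} * 2t dt,
# approximated by a Riemann sum on a truncated grid (the tail beyond 50 is negligible).
dt = 1e-4
t = np.arange(dt, 50.0, dt)
rhs = np.sum(np.exp(-t) * 2.0 * t) * dt

print(lhs, rhs)  # both should be close to 2
```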