
Let $\{X_n\}$ be a sequence of positive random variables with $X_n \rightarrow X$ in probability. Prove that if $E(X_n) \rightarrow E(X)$, then $X_n \rightarrow X$ in $L^1$.

My partial answer: Let $(\Omega,\mathcal{B},P)$ be the probability space. For any $\varepsilon>0$, we have \begin{align*} \int_{\Omega} |X_n-X|\, dP &= \int_{\{|X_n-X|<\varepsilon\}} |X_n-X|\, dP+\int_{\{|X_n-X|\geq \varepsilon\}} |X_n-X|\, dP \\ &\leq\varepsilon P(|X_n-X|<\varepsilon)+\int_{\{|X_n-X|\geq \varepsilon\}} |X_n-X|\, dP \\ &\leq\varepsilon +\int_{\{|X_n-X|\geq \varepsilon\}} |X_n-X|\, dP. \end{align*} The problem is that I don't know how to estimate the second term on the right-hand side using only the convergence in probability and the convergence of expectations.

Thanks for your help.

tes
  • I'm not sure your approach will easily work. But one way to prove your statement is to apply the Dominated Convergence Theorem to the sequence $(g_n)$, where $g_n=|X_n-X|+X-X_n$. Note $0\le g_n\le 2X$ for each $n$. – David Mitra May 02 '13 at 01:17
  • $X_n$ doesn't converge pointwise to $X$; it only converges in probability. Meanwhile, the hypothesis of the DCT requires that $X_n$ converge pointwise to $X$. – tes May 02 '13 at 01:40
    DCT holds also for convergence in measure. See here. – David Mitra May 02 '13 at 01:41
  • Also $E(X)$ need not be finite. – tes May 02 '13 at 01:51
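The version of dominated convergence for convergence in measure cited in the comments follows from a standard subsequence argument; a sketch (not part of the original thread) is:

```latex
\textbf{Claim.} If $Y_n\to Y$ in measure and $|Y_n|\le Z$ with
$E(Z)<\infty$, then $E(Y_n)\to E(Y)$.

\textbf{Sketch.} Fix any subsequence $(Y_{n_k})$. Convergence in measure
gives a further subsequence $(Y_{n_{k_j}})$ with $Y_{n_{k_j}}\to Y$
almost surely, and the classical DCT along it yields
$E(Y_{n_{k_j}})\to E(Y)$. Since every subsequence of
$\bigl(E(Y_n)\bigr)$ admits a further subsequence converging to
$E(Y)$, the full sequence converges: $E(Y_n)\to E(Y)$.
```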

1 Answer


Let $Y_n:=|X-X_n|+X-X_n$. Then $0\leqslant Y_n\leqslant 2X$. As $E(X_n)\to E(X)$, we just need to show that $E(Y_n)\to 0$. As $X_n\to X$ in measure, we have $Y_n\to 0$ in measure. We can conclude by dominated convergence.
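Spelling out why $E(Y_n)\to 0$ suffices (assuming, as the question implicitly requires, that $E(X)<\infty$ so the expectations below are finite):

```latex
\begin{align*}
E(Y_n) &= E\bigl(|X-X_n|\bigr) + E(X) - E(X_n), \qquad\text{so}\\
E\bigl(|X-X_n|\bigr) &= E(Y_n) + E(X_n) - E(X)
  \;\longrightarrow\; 0 + E(X) - E(X) = 0,
\end{align*}
```

where $E(Y_n)\to 0$ comes from dominated convergence and $E(X_n)\to E(X)$ is the hypothesis.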

Davide Giraudo