
I'm looking for an elementary way of showing the following. If $(X_n)$ and $X$ are random variables such that $X_n \to X$ in distribution and such that $\{X_n\mid n\geq 1\}$ are uniformly integrable, then $E[X_n]\to E[X]$.

I've seen another topic on this, but the solution given there is using Skorokhod's theorem stating that convergence in distribution is equivalent to almost-sure convergence of copies of the random variables in some abstract probability space. I would like to do without that if possible. Thanks in advance!

Stefan Hansen
  • Isn't the probability space for which almost-sure convergence holds pretty explicit and concrete here, i.e., $((0,1),\mathcal B, \mathbb P)$ where $\mathbb P$ is Lebesgue measure? – cardinal Feb 24 '12 at 08:05
  • To me it isn't obvious that this is the probability space to work on, and furthermore I would like to do this exercise without using that equivalence. – Stefan Hansen Feb 24 '12 at 08:42
  • That's fine, I understand (and appreciate!) that you are looking for an exercise not using this theorem. I just wanted to point out that the space is really not exotic. The construction is achieved just as you'd expect, by letting $Y_n$ be the generalized inverse of $F_n$ (the distribution of $X_n$). – cardinal Feb 24 '12 at 09:34

2 Answers


For any random variable $Z$ and any real number $x\geqslant0$, let $Z^x=\max\{-x,\min\{Z,x\}\}$. Let $\mathcal X=\{X_n\mid n\geqslant1\}$. Here are some steps of a proof:

  1. If $X_n\to X$ in distribution, then, for every $x\geqslant0$, $\mathrm E(X_n^x)\to \mathrm E(X^x)$.
  2. If $\mathcal X$ is uniformly integrable and $X_n\to X$ in distribution, then $X$ is integrable and $\mathcal X\cup\{X\}$ is uniformly integrable.
  3. If $\mathcal Y$ is uniformly integrable, then for every $\varepsilon\gt0$, there exists a finite $x\geqslant0$ such that $\mathrm E(|Y-Y^x|)\leqslant\varepsilon$ for every $Y$ in $\mathcal Y$.
  4. Prove the triangle inequality, valid for every $n\geqslant1$ and $x\geqslant0$: $$ |\mathrm E(X_n)-\mathrm E(X)|\leqslant\mathrm E(|X_n-X_n^x|)+|\mathrm E(X_n^x)-\mathrm E(X^x)|+\mathrm E(|X-X^x|). $$
  5. Conclude.
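As an illustration (not part of the proof), the truncation $Z^x$ and the bound in step 4 can be checked numerically on a hypothetical example, here $X_n = (1 + 1/n)Z$ with $Z$ standard normal — a uniformly integrable family converging in distribution to $Z$:

```python
import numpy as np

rng = np.random.default_rng(0)

def trunc(z, x):
    # Z^x = max(-x, min(Z, x)): truncation of Z to the interval [-x, x]
    return np.clip(z, -x, x)

# Hypothetical example: X_n = (1 + 1/n) Z with Z standard normal.
# The family is uniformly integrable and X_n -> Z in distribution.
z = rng.standard_normal(200_000)
n, x = 100, 5.0
xn = (1 + 1 / n) * z

lhs = abs(xn.mean() - z.mean())                     # |E X_n - E X|
t1 = np.abs(xn - trunc(xn, x)).mean()               # E|X_n - X_n^x|
t2 = abs(trunc(xn, x).mean() - trunc(z, x).mean())  # |E X_n^x - E X^x|
t3 = np.abs(z - trunc(z, x)).mean()                 # E|X - X^x|

# The step-4 bound is an algebraic identity plus Jensen, so it holds
# sample-wise, and the truncation-error terms are small at this level x.
assert lhs <= t1 + t2 + t3 + 1e-12
```

The truncation level $x = 5$ plays the role of the level supplied by step 3: for this family, both $\mathrm E(|X_n - X_n^x|)$ and $\mathrm E(|X - X^x|)$ are already far below any reasonable $\varepsilon$.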
Did
  • If any of these steps is still a problem, please say so. – Did Feb 24 '12 at 07:14
  • If 2. holds, why is the following not true? We have that $X_n-X \to 0$ in probability, and as $\{X_n-X\mid n\geq 1\}$ is uniformly integrable, we have that $X_n-X\to 0$ in $L^1$. – Stefan Hansen Feb 24 '12 at 07:20
  • Thanks a lot for the help. I think I got through everything except 2. Could you give a hint on showing that $X$ is integrable? – Stefan Hansen Feb 24 '12 at 09:11
  • Re 2.: assume without loss of generality that $X_n\geqslant0$ almost surely for every $n$. By step 1., for every $x$, $\mathrm E(X^x)\leqslant\sup_n\mathrm E(X_n^x)\leqslant K$ with $K=\sup_n\mathrm E(X_n)$, which is finite since $\mathcal X$ is UI. Hence $\mathrm E(X)\leqslant K$. (Re your other comment, it is quite possible that some shortcuts exist which allow one to bypass one step or another.) – Did Feb 24 '12 at 16:23

Another solution: Assume $X_n, X \geq 0$ (once we prove this case, the general case follows by writing $X_n = X_n^+ - X_n^-$ and $X = X^+ - X^-$: by the continuous mapping theorem, $X_n^+ \xrightarrow{\mathcal D} X^+$ and $X_n^- \xrightarrow{\mathcal D} X^-$, and both families inherit uniform integrability). We will show that $\liminf_{n \to \infty} \mathbb E\left[X_n - X\right] \geq 0$ and $\limsup_{n \to \infty} \mathbb E\left[X_n - X\right] \leq 0$.

Lemma: If $X_n \xrightarrow{\mathcal D} X$, then $\mathbb E\left[|X|\right] \leq \liminf_{n \to \infty} \mathbb E\left[|X_n|\right]$.

Proof: Convergence in distribution is equivalent to $\mathbb E\left[f(X_n)\right] \to \mathbb E\left[f(X)\right]$ for every continuous and bounded $f : \mathbb R\to\mathbb R$. Consider the functions $f_m(x) = \min\{|x|, m\}$; each $f_m$ is continuous and bounded, and $f_m(x) \uparrow |x|$ as $m \to \infty$. For each fixed $m$, $\mathbb E\left[f_m(X_n)\right] \leq \mathbb E\left[|X_n|\right]$, so $\lim_{n \to \infty} \mathbb E\left[f_m(X_n)\right] \leq \liminf_{n \to \infty} \mathbb E\left[|X_n|\right]$. Hence, by the monotone convergence theorem: $$ \mathbb E[|X|] = \lim_{m \to \infty} \mathbb E\left[f_m(X)\right] = \lim_{m \to \infty} \lim_{n \to \infty} \mathbb E\left[f_m\left(X_n\right)\right] \leq \liminf_{n \to \infty} \mathbb E\left[|X_n|\right]. $$ This proves the lemma. $\square$
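As a quick sanity check on the lemma (not part of the proof), one can approximate $\mathbb E[f_m(X)]$ by Monte Carlo and watch it increase toward $\mathbb E[|X|]$. This sketch assumes a hypothetical choice of $X$, standard normal, for which $\mathbb E[|X|] = \sqrt{2/\pi} \approx 0.7979$:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(500_000)  # hypothetical choice: X standard normal

def f(z, m):
    # f_m(z) = min(|z|, m): continuous, bounded by m, increases to |z| as m grows
    return np.minimum(np.abs(z), m)

# E[f_m(X)] is nondecreasing in m and approaches E|X| = sqrt(2/pi)
means = [f(x, m).mean() for m in (0.5, 1.0, 2.0, 4.0)]
print(means)
```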

We proceed; write $A \wedge B := \min\{A,B\}$. Let $\epsilon > 0$. Since $(X_n)$ is uniformly integrable, there is an $a \geq 0$ for which $\mathbb E\left[(X_n - a)^+\right] \leq \epsilon$ for all $n \in \mathbb N$. By the continuous mapping theorem, $(a - X_n)^+ \xrightarrow{\mathcal D} (a-X)^+$, so by the Lemma, \begin{align*} \limsup_{n \to \infty} \mathbb E\left[(X_n \wedge a)-X\right] &\leq \mathbb E\left[a-X\right] + \limsup_{n \to \infty} \mathbb E\left[(X_n \wedge a) - a\right] \\ &= \mathbb E\left[a-X\right] + \limsup_{n \to \infty} \mathbb E\left[-(a-X_n)^+\right] \\ &= \mathbb E[a-X] - \liminf_{n \to \infty} \mathbb E\left[(a-X_n)^+\right] \\ &\leq \mathbb E\left[a-X\right] - \mathbb E\left[(a-X)^+\right] \\ &\leq 0. \end{align*}

Since $X_n = (X_n - a)^+ + (X_n \wedge a)$, $$ \limsup_{n \to \infty} \mathbb E[X_n - X] \leq \limsup_{n \to \infty} \mathbb E\left[(X_n - a)^+\right] + \limsup_{n \to \infty} \mathbb E\left[(X_n \wedge a) - X\right] \leq \epsilon. $$ Since $\epsilon > 0$ was arbitrary, $\limsup_{n \to \infty} \mathbb E[X_n - X] \leq 0$. On the other hand, again by the Lemma, $$ \liminf_{n \to \infty} \mathbb E\left[X_n - X\right] = \liminf_{n \to \infty} \mathbb E[X_n] -\mathbb E[X] \geq 0. $$ The result now follows.
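For completeness, the pointwise identity $X_n = (X_n - a)^+ + (X_n \wedge a)$ used above, and the smallness of the tail term supplied by uniform integrability, can be checked numerically on a hypothetical nonnegative example, here $X_n = \frac{n}{n+1}|Z|$ with $Z$ standard normal:

```python
import numpy as np

rng = np.random.default_rng(1)
z = np.abs(rng.standard_normal(200_000))  # hypothetical: X = |Z|, Z standard normal

n, a = 50, 6.0
xn = z * n / (n + 1)           # X_n = n/(n+1) |Z|, nonnegative, -> |Z| in distribution

tail = np.maximum(xn - a, 0.0)  # (X_n - a)^+, the uniform-integrability tail term
capped = np.minimum(xn, a)      # X_n wedge a, the bounded part

# X_n = (X_n - a)^+ + (X_n wedge a) holds pointwise, for every sample
assert np.allclose(xn, tail + capped)
print(tail.mean())              # E[(X_n - a)^+] is tiny at this level a
```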

D Ford