9

I want to show that if $\{X_n\}$ is a sequence of random variables such that:

(1) $\exists X$ (measurable) such that $X_n \xrightarrow{P} X$

(2) $\exists Y$ with $E(|Y|) < \infty$ such that $|X_n| \le Y \quad \forall n \in \mathbb{N}$

Then $E(X_n) \rightarrow E(X)$ and $E(|X_n - X|) \rightarrow 0$

How do I go about showing this? This is of course the standard dominated convergence theorem with almost sure convergence to $X$ replaced by convergence in probability. The proof of the standard theorem starts by showing that $X$ is integrable and then bounds $E(|X_n - X|)$, but I don't see how to do even that first part under this new set of assumptions. Can someone help with this? Once that part is done, I believe the rest of the theorem follows? I'm aware this might be a duplicate, but I cannot understand these things very well and I'm very confused. Thanks for any help you can give!

qx123456
  • 575

2 Answers

6

Since $X_n$ converges in probability to $X$, it has a subsequence $X_{n_k}$ that converges almost surely to $X$. Letting $k\to \infty$ in $|X_{n_k}|\leq Y$ proves that $|X|\leq Y$ almost surely, hence $X$ is integrable.
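(In case it is helpful to see that limiting step written out: on the almost-sure event where both $X_{n_k}\to X$ and $|X_{n_k}|\leq Y$ for every $k$, we have $$|X(\omega)| = \lim_{k\to\infty} |X_{n_k}(\omega)| \leq Y(\omega),$$ and therefore $E(|X|)\leq E(Y)<\infty$.)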

As a result, $|X_n-X|\leq 2Y$, hence the sequence $E(|X_n-X|)$ is bounded. Let us prove that it has $0$ as its only accumulation point. Suppose $E(|X_{m_k}-X|)$ converges to some $\ell$. Since $(X_{m_k})_k$ still converges in probability to $X$, it has a further subsequence $(X_{m_{p_j}})_j$ that converges almost surely to $X$, so $|X_{m_{p_j}}-X|$ converges a.s. to $0$ while $|X_{m_{p_j}}-X|\leq 2Y$. The dominated convergence theorem applies and yields $$\lim_j E(|X_{m_{p_j}}-X|) = 0.$$

Hence $\ell=0$, and since the sequence $E(|X_n-X|)$ is bounded, we conclude $\lim_n E(|X_n-X|) = 0$. Finally, $|E(X_n)-E(X)| = |E(X_n-X)| \le E(|X_n-X|)$, hence $\lim_n E(X_n) = E(X)$.
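(For reference, the real-analysis fact being used in the last step is the following: if $(a_n)$ is a bounded real sequence and every convergent subsequence of $(a_n)$ has the same limit $\ell$, then $$\lim_{n\to\infty} a_n=\ell,$$ which follows from the Bolzano-Weierstrass theorem. Here $a_n=E(|X_n-X|)$ and $\ell=0$.)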

Gabriel Romon
  • 35,428
  • 5
  • 65
  • 157
3

One can show that $\mathbb E[|X_n - X|]$ is small by the usual analytical trick of breaking this up into two parts: one part where $|X_n - X|$ is small and one of small measure where the difference is only controlled by $Y$.

To be more precise, let's fix any $\varepsilon>0$ and define a set of bad points $B_{n,\varepsilon}=\{z:|X_n(z) - X(z)| > \varepsilon \}$ and a set of good points $G_{n,\varepsilon}$ to be its complement. Convergence in probability says exactly that $$\lim_{n\rightarrow\infty}P(B_{n,\varepsilon}) = 0$$ for each $\varepsilon>0$. Note that we can write, using indicator functions $1_S$: \begin{align*}\mathbb E[|X_n - X|] &= \mathbb E[1_{G_{n,\varepsilon}}\cdot |X_n - X|] + \mathbb E[1_{B_{n,\varepsilon}}\cdot |X_n - X|]\\&\leq \varepsilon + \mathbb E[1_{B_{n,\varepsilon}}\cdot |X_n - X|]\\&\leq \varepsilon + 2\mathbb E[1_{B_{n,\varepsilon}}\cdot Y]\end{align*} where we first use that $|X_n - X|\leq \varepsilon$ on $G_{n,\varepsilon}$ - so that term contributes at most $\varepsilon$ - and then that, in any case, $|X_n - X|$ is no more than $2Y$, since both $|X_n|$ and $|X|$ are (almost everywhere) bounded by $Y$.
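(Written out, the bound on the good set used in the second line is simply $$\mathbb E[1_{G_{n,\varepsilon}}\cdot |X_n - X|] \leq \varepsilon\,\mathbb E[1_{G_{n,\varepsilon}}] = \varepsilon\, P(G_{n,\varepsilon}) \leq \varepsilon,$$ since the integrand is at most $\varepsilon$ on $G_{n,\varepsilon}$.)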

Then, all we need is a lemma:

Define $$M_{\varepsilon,Y}=\sup_{S:\,P(S)\leq\varepsilon} \mathbb E[1_S\cdot Y],$$ the supremum running over measurable sets $S$ with $P(S)\leq\varepsilon$. It is true that $\lim_{\varepsilon\rightarrow 0}M_{\varepsilon,Y}=0$ for any $Y$ with finite expectation.

Essentially, this lemma says that the expectation can't concentrate too heavily on sets of low probability. We can prove it via the usual dominated convergence theorem*: if it were not true, we could produce a sequence of sets $S_1,S_2,\ldots$ with $P(S_n)\leq 1/2^n$ but with $\mathbb E[1_{S_n}\cdot Y]$ bounded away from zero - yet this violates the dominated convergence theorem, because the sequence $1_{S_n}\cdot Y$ converges pointwise almost everywhere to zero by the Borel-Cantelli lemma.
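(For completeness, the Borel-Cantelli step: since $$\sum_{n\geq 1} P(S_n) \leq \sum_{n\geq 1} 2^{-n} < \infty,$$ almost every point belongs to only finitely many of the sets $S_n$, so $1_{S_n}\to 0$ almost everywhere and hence $1_{S_n}\cdot Y \to 0$ almost everywhere, with $Y$ itself as the dominating function.)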

This in hand, we can take our inequality one step further: since $P(B_{n,\varepsilon})\rightarrow 0$, we have $P(B_{n,\varepsilon})\leq\varepsilon$ for all sufficiently large $n$, and hence $$\mathbb E[|X_n - X|] \leq \varepsilon + 2M_{\varepsilon,Y}$$ for all such $n$. Taking the lim sup with $n$ on both sides gives $$\limsup_{n\rightarrow\infty}\mathbb E[|X_n - X|] \leq \varepsilon + 2M_{\varepsilon,Y},$$ and since this holds for every $\varepsilon>0$ and the right-hand side tends to $0$ as $\varepsilon\rightarrow 0$ by the lemma, we get $$\lim_{n\rightarrow\infty}\mathbb E[|X_n - X|] = 0.$$


*Of course, this is a bit lazy - we need this lemma to prove dominated convergence too! You can also get the lemma out of the monotone convergence theorem, which pretty directly tells you that not too much of the area of an integrable function lies above a rising threshold. It's also possible to use the Radon-Nikodym theorem to show this, by constructing a measure-zero set on which $Y$ would have to have positive integral if the lemma failed (which is a contradiction).
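(One way to write out the monotone-convergence route, as a sketch and assuming, as in the present setting, that $Y\geq 0$: for any truncation level $K>0$ and any measurable $S$ with $P(S)\leq\varepsilon$, $$\mathbb E[1_S\cdot Y] \leq \mathbb E[1_S\cdot Y\,1_{\{Y\leq K\}}] + \mathbb E[Y\,1_{\{Y>K\}}] \leq K\varepsilon + \mathbb E[Y\,1_{\{Y>K\}}],$$ and $\mathbb E[Y\,1_{\{Y>K\}}]\to 0$ as $K\to\infty$ by monotone convergence applied to $Y\,1_{\{Y\leq K\}}\uparrow Y$, using $\mathbb E[Y]<\infty$. Choosing $K$ large first and then $\varepsilon$ small shows $M_{\varepsilon,Y}\to 0$.)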

Milo Brandt
  • 60,888
  • I'm fine with most of this but how do we know that $|X| \le Y$ a.s.? The assumptions are just that $|X_n| \le Y$ a.s. – qx123456 Sep 29 '19 at 22:59
  • @qx123456 You can get that by contradiction: If there were a set $S$ of positive probability on which $|X|>Y$, there would have to be some $\varepsilon>0$ such that the set $S'$ on which $|X|>Y+\varepsilon$ also had positive probability. However, it would not be the case for any $n$ that $|X_n-X| \leq \varepsilon$ on $S'$, which would prevent convergence of $X_n$ to $X$ in probability because the probability that $|X_n-X|>\varepsilon$ would be bounded below by the probability of $S'$. – Milo Brandt Sep 29 '19 at 23:02
  • Thanks!! We can also bound the term $E(Y\cdot 1_{B_{n,\epsilon}})$ using the fact that $\{Y\}$ is uniformly integrable (since it is a finite family of integrable functions, obviously), right? – qx123456 Sep 29 '19 at 23:04
  • @qx123456 Yes, that works too (although it relies on proving a similar lemma about uniformly integrable families) – Milo Brandt Sep 29 '19 at 23:08
  • Good; this also works for non-measurable, non-absolutely Henstock-Kurzweil integrable vector mappings with a vector measure – Anil Pedgaonkar Mar 10 '21 at 19:00