
Prove that a sequence of RVs convergent in $L^2$ has a subsequence convergent a.s.

Let $(\Omega,\mathcal{F}, P)$ be a probability space and let $(X_n)$ be a sequence of RVs such that $X_i \in L^2$ for each $i=1,2,\ldots$ and $\lim_{n\rightarrow\infty}\mathbb{E}|X-X_n|^2 = 0$ for some random variable $X\in L^2$.

Any hint on how to start, please?

luka5z
  • Note that $L^2$-convergence implies convergence in probability and have a look at this question: http://math.stackexchange.com/q/1006091/36150 – saz Mar 27 '16 at 18:04

2 Answers


Choose $(n_k)_{k\geq1}$ so that $$\mathbb{E}\left(\sum_{k=1}^\infty|X_{n_k}-X|^2\right)=\sum_{k=1}^\infty\mathbb{E}(|X_{n_k}-X|^2)<\infty$$
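One concrete way to make this choice (a sketch: the rate $2^{-k}$ is just one convenient option, and swapping the sum and the expectation is justified by Tonelli's theorem since the terms are non-negative): because $\mathbb{E}|X_n-X|^2\to0$, pick $n_1<n_2<\cdots$ with
$$\mathbb{E}|X_{n_k}-X|^2\leq 2^{-k},\qquad\text{so that}\qquad\sum_{k=1}^\infty\mathbb{E}(|X_{n_k}-X|^2)\leq\sum_{k=1}^\infty 2^{-k}=1<\infty.$$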

  • This is marvellous! My answer is based on the usual methods but your answer is the better one, definitely. Thank you! – Landon Carter Mar 27 '16 at 18:03
  • Could you explain what the point is here? I'm dumb :P – luka5z Mar 27 '16 at 18:06
  • @LandonCarter It is a trick worth knowing: if a non-negative random variable has finite expectation, then it is finite almost surely. It is so obvious that it doesn't seem like it could be useful. –  Mar 27 '16 at 18:06
  • Yes, I know that. I have seen examples of it while studying Markov chains (ref: positive recurrence, null recurrence). But what you have done here is remarkable! I will now ask my friends to try to prove this theorem without the conventional method!! – Landon Carter Mar 27 '16 at 18:09
  • @luka5z You said that you only wanted a hint. ;) If $\sum_{k=1}^\infty |X_{n_k}-X|^2<\infty,$ what do you conclude about the sequence $(X_{n_k})_{k\geq 1}$? –  Mar 27 '16 at 18:10
  • For a fixed $\omega$ it converges to $X$. – luka5z Mar 27 '16 at 18:11
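Assembling the hint and the comments (a sketch of the argument they point to): a non-negative random variable with finite expectation is finite almost surely, and the terms of a convergent series tend to zero, so
$$\sum_{k=1}^\infty|X_{n_k}(\omega)-X(\omega)|^2<\infty\ \text{ for a.e. }\omega\quad\Longrightarrow\quad|X_{n_k}(\omega)-X(\omega)|\to0\ \text{ for a.e. }\omega,$$
i.e. $X_{n_k}\to X$ almost surely.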

Let $X_n$ converge in $L^2$ to $X$, i.e. $E|X_n-X|^2\to0$.

By Markov's inequality (applied to $|X_n-X|^2$), $P(|X_n-X|>\epsilon)\leq\dfrac{E|X_n-X|^2}{\epsilon^2}\to0$ for any $\epsilon>0$, and thus $X_n\to X$ in probability.

Now, for each $k$, pick $n_k>n_{k-1}$ such that $P(|X_{n_k}-X|>2^{-k})\leq \dfrac{1}{2^k}$ (which you can do thanks to the convergence in probability of $X_n$). Note that the choice of $n_k$ must not depend on $\epsilon$, because we will need the subsequence to work for every $\epsilon>0$ simultaneously.

Then, for any fixed $\epsilon>0$, we have $2^{-k}<\epsilon$ for all sufficiently large $k$, hence $P(|X_{n_k}-X|>\epsilon)\leq P(|X_{n_k}-X|>2^{-k})\leq\dfrac{1}{2^k}$ eventually, so $\sum_{k\geq1}P(|X_{n_k}-X|>\epsilon)<\infty$; that is, $X_{n_k}$ converges completely to $X$.

Define $Y_k=X_{n_k}$; then the above result reads $\sum_kP(|Y_k-X|>\epsilon)<\infty$ for any $\epsilon>0$.

Thus, $P(Y_k\not\to X)=P\left[\bigcup_{\epsilon>0}\{|Y_k-X|>\epsilon\ \text{i.o.}\}\right]$ ***

So for any $\epsilon>0$, $P(|Y_k-X|>\epsilon\ \text{i.o.})=P\left(\bigcap_n\bigcup_{m>n}\{|Y_m-X|>\epsilon\}\right)=\lim_n P\left(\bigcup_{m>n}\{|Y_m-X|>\epsilon\}\right)\leq\lim_n\sum_{m>n}P(|Y_m-X|>\epsilon)=0$, where the second equality is continuity from above and the final limit vanishes because it is the tail of a convergent series.

Hence the sequence $Y_k$ converges a.s. to $X$. (Note that this last computation is just a proof of the first Borel–Cantelli lemma.)
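For reference, the first Borel–Cantelli lemma, which the computation above reproves in this special case: for any events $(A_n)$,
$$\sum_{n=1}^\infty P(A_n)<\infty\quad\Longrightarrow\quad P(A_n\ \text{i.o.})=P\Big(\bigcap_{n\geq1}\bigcup_{m>n}A_m\Big)=0.$$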

***I have abused the notion of a countable union by taking the union over all $\epsilon>0$, but it is a simple exercise to verify that it suffices to let $\epsilon$ range over the positive rationals (a countable set).
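Concretely, one standard way to carry out that exercise is to use the sequence $1/j$ instead of all positive rationals:
$$\{Y_k\not\to X\}=\bigcup_{j=1}^\infty\big\{|Y_k-X|>\tfrac{1}{j}\ \text{i.o.}\big\},$$
since $Y_k(\omega)\not\to X(\omega)$ precisely when $|Y_k(\omega)-X(\omega)|>\tfrac{1}{j}$ for infinitely many $k$, for some $j\geq1$; a countable union of null sets is null.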

Landon Carter