15

A sequence of random variables $\{X_n\}$ converges to $X$ in probability if for any $\varepsilon > 0$, $$P(|X_n-X| \geq \varepsilon) \rightarrow 0 \quad \text{as } n \to \infty.$$

The sequence converges to $X$ in distribution if $$F_{X_n}(x) \rightarrow F_X(x)$$ at every point $x$ where $F_X$ is continuous.

(There is another, equivalent definition of convergence in distribution in terms of weak convergence.)

I want to show that convergence in probability implies convergence in distribution. It seems like a very simple result, but I cannot think of a clever proof.

Davide Giraudo
Hawii
    Have you tried the wikipedia article: http://en.wikipedia.org/wiki/Proofs_of_convergence_of_random_variables#propA2 ? Most books on probability theory include a proof. – Gautam Shenoy Nov 14 '12 at 04:54
  • Oh, how come I didn't find it! It looks like something I have in mind. Thank you so much! – Hawii Nov 14 '12 at 05:34

3 Answers

31

A slicker proof (and, more importantly, one that generalizes) than the one in the Wikipedia article is to observe that $X_n \Longrightarrow X$ if and only if $E f(X_n) \to E f(X)$ for all bounded continuous functions $f$. If you have convergence in probability, then you can apply the dominated convergence theorem (recalling that $f$ is bounded and that, for continuous $f$, $X_n \to X$ in probability implies $f(X_n) \to f(X)$ in probability) to conclude that $E |f(X_n) - f(X)| \to 0$, which implies the result.
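A minimal sketch of the continuous-mapping step invoked above, assuming real-valued random variables: fix $\varepsilon, \delta > 0$ and choose $M$ with $P(|X| > M) \le \delta$. Since $f$ is uniformly continuous on the compact interval $[-M-1, M+1]$, there is an $\eta \in (0,1)$ such that $|f(x) - f(y)| \le \varepsilon$ whenever $x, y \in [-M-1, M+1]$ and $|x - y| \le \eta$. Then $$\{|f(X_n) - f(X)| > \varepsilon\} \subseteq \{|X_n - X| > \eta\} \cup \{|X| > M\},$$ so $$P(|f(X_n) - f(X)| > \varepsilon) \le P(|X_n - X| > \eta) + \delta,$$ and letting $n \to \infty$ and then $\delta \downarrow 0$ gives $f(X_n) \to f(X)$ in probability.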

7

Here is an answer that does not rely on dominated convergence.

To prove convergence in distribution, we only need to establish that $E[f(X_n)]$ converges to $E[f(X)]$ for all bounded continuous functions $f$. By the definition of the limit, we need to prove that for any $\epsilon>0$ there is some $n_0=n_0(\epsilon)$ such that the inequality $| E[f(X_n)] - E[f(X)]| \le \epsilon$ holds for all $n>n_0$.

  1. As suggested in another answer, the first step is to show that if $X_n$ converges to $X$ in probability, then $f(X_n)$ also converges in probability to $f(X)$ for any continuous $f$.
  2. Let $f$ be any continuous function bounded by $K$. Take any $\epsilon>0$ and show that $$| E[f(X_n)] - E[f(X)]| \le E[|f(X_n) - f(X)|] \le (\epsilon/2) \; P(A_n^c) + 2K \; P(A_n)$$ where $A_n$ is the event $\{ |f(X_n) - f(X)| > \epsilon /2 \}$; on $A_n$ one only has the crude bound $|f(X_n) - f(X)| \le 2K$, since $|f| \le K$.
  3. It remains to show that $P(A_n^c)\le 1$ (obvious) and that for $n$ large enough one has $P(A_n)\le \epsilon/(4 K)$, thanks to the convergence in probability established in step 1; the estimate is assembled below.
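Assembling steps 2 and 3 (a minimal worked version of the final estimate, assuming $|f| \le K$ as above): for all $n$ large enough, $$| E[f(X_n)] - E[f(X)]| \le \frac{\epsilon}{2} \cdot 1 + 2K \cdot \frac{\epsilon}{4K} = \epsilon.$$ Since $\epsilon>0$ was arbitrary, this proves $E[f(X_n)] \to E[f(X)]$.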
jlewk
5

Chris J.'s answer is more or less correct, but you need almost sure convergence to apply the dominated convergence theorem directly. Fortunately, convergence in probability implies almost sure convergence along a subsequence, and the proof can then proceed more or less as desired.
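In symbols, the argument combines two standard facts (a minimal sketch, for bounded continuous $f$): first, convergence in probability implies almost sure convergence along some subsequence; second, a real sequence $a_n \to a$ if and only if every subsequence of $(a_n)$ has a further subsequence converging to $a$. Every subsequence of $(X_n)$ still converges to $X$ in probability, hence admits a further subsequence $X_{n_{k_l}} \to X$ almost surely, along which bounded convergence gives $E[f(X_{n_{k_l}})] \to E[f(X)]$. The second fact, applied to $a_n = E[f(X_n)]$, then yields $$E[f(X_n)] \to E[f(X)].$$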

For more details, Lemma 3.7 of Kallenberg's Foundations of Modern Probability (first edition) is useful.

Roy D.
    Or you could apply the bounded convergence theorem. – Calculon Mar 15 '15 at 11:33
  • 9
    Dominated convergence theorem also applies with convergence in probability. – perlman Oct 29 '17 at 00:26
  • 1
    Why is it sufficient for a subsequence to converge in probability? – Martin Erhardt Apr 26 '20 at 18:34
  • 1
    @MartinErhardt Because you can go by a subsequential and a "further" subsequential argument. Suppose $f_{n}\to f$ in probability and suppose $|f_{n}|\leq g$ for some integrable $g$. Then given any subsequence $f_{n_{k}}$, you can find a further subsequence $f_{n_{k_{l}}}\xrightarrow{a.s.}f$. Thus along this subsequence $\int f_{n_{k_{l}}}\to \int f$ by the usual DCT, and hence $\int f_{n}\to\int f$. This is just by the theorem that, in a metric space, a sequence $x_{n}\to x$ if and only if every subsequence $x_{n_{k}}$ has a further subsequence $x_{n_{k_{l}}}\to x$. – Mr.Gandalf Sauron Jul 09 '23 at 07:31