
Consider a sequence of real-valued random variables $\{X_n\}_{n \in \mathbb{N}}$ and a real-valued random variable $X$.

All random variables are defined on the same probability space $(\Omega, \mathcal{F}, P)$.

Could someone explain the relation (equivalence, one implies the other, etc.) between

$$ (1) \hspace{1cm}X_n \rightarrow_{a.s.} X \text{ as $n\rightarrow \infty$} $$

and $$ (2) \hspace{1cm}X_n =X \text{ with probability approaching 1 as $n\rightarrow \infty$} $$


Some considerations:

Using the definitions,

(1) $P(\omega \in \Omega \text{ s.t. } \lim_{n\rightarrow \infty}X_n(\omega)=X(\omega))=1$

(2) $\lim_{n\rightarrow \infty} P(\omega \in \Omega \text{ s.t. } X_n(\omega)=X(\omega))=1$

So (1) has the limit inside, (2) has the limit outside.

(2) may look very similar to $X_n\rightarrow_p X$, where $X_n\rightarrow_p X$ means that for all $\epsilon>0$, $\lim_{n\rightarrow \infty} P(\omega \in \Omega \text{ s.t. } |X_n(\omega)-X(\omega)|\leq \epsilon)=1$.

What can we deduce from here?


1 Answer


In fact, $(2)$ is stronger than convergence in probability. (Assume $(2)$ and take $\epsilon > 0$. Then $\{X_n = X\} \subseteq \{|X_n - X| < \epsilon\}$.)
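Spelling this out (only monotonicity of $P$ is needed):

$$ P\big(|X_n - X| < \epsilon\big) \ \ge\ P(X_n = X) \xrightarrow[n\to\infty]{} 1, $$

so $(2)$ gives $X_n \rightarrow_p X$.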

Under the given conditions, neither statement implies the other. Take $(\Omega, \mathcal{F}, P) = ([0,1], \mathcal{B}([0,1]), \lambda)$, i.e. $[0,1]$ with its Borel $\sigma$-algebra and Lebesgue measure.

  1. When both statements hold: simply take $X_n = X = c$ for some constant $c \in \Bbb{R}$, i.e. a constant sequence of constant random variables.
  2. $(1)$ holds but $(2)$ does not: adapt a textbook example of almost sure convergence. Take $X_n(\omega) = \omega^n$ and $X \equiv 0$ (a numerical sketch of this and the next example follows the list).
    • $X_n \overset{a.s.}{\to} X$, since $X_n(\omega) = \omega^n \to 0$ for every $\omega \in [0,1)$ and $P(\{1\}) = 0$
    • $\forall n \in \Bbb{N},\ P(X_n = X) = P(\{0\}) = 0$, so $(2)$ fails
  3. $(2)$ holds but $(1)$ does not: adapt the classic "sliding bump" functions used to show that convergence in measure doesn't imply convergence almost everywhere. Take $Y_{m,n} = 1_{\left[\frac{m-1}{n}, \frac mn \right]}$ for $m \in \{1,\dots,n\}$ and $n \in \Bbb{N}$, i.e. $Y_{m,n}$ is a "bump" on $\left[\frac{m-1}{n}, \frac mn \right]$ of width $\frac1n$, sliding from left to right as $m$ runs from $1$ to $n$. Enumerate $(Y_{m,n})$ as $Y_{1,1}, Y_{1,2}, Y_{2,2}, Y_{1,3}, \dots$, denote this sequence of random variables by $(X_k)$, and take $X \equiv 0$.
    • $(1)$ fails: at every $\omega \in [0,1]$ the sequence $X_k(\omega)$ oscillates, because every point is "visited by infinitely many bumps". Indeed, fix $\omega \in [0,1]$. For each $n \in \Bbb{N}$ the bumps $\left[\frac{m-1}{n}, \frac mn \right]$, $m = 1,\dots,n$, cover $[0,1]$, so there is an $m$ with $Y_{m,n}(\omega) = 1$; hence $X_k(\omega) = 1$ for infinitely many $k$. Likewise, for every $n \geq 3$ the point $\omega$ lies in at most two of the $n$ bumps, so some $m$ gives $Y_{m,n}(\omega) = 0$, and $X_k(\omega) = 0$ infinitely often as well. Thus $X_k(\omega)$ converges at no point, and $(1)$ does not hold.
    • $(2)$ holds: with $X \equiv 0$, $P(X_k = X) = P(Y_{m,n} = 0) = \dfrac{n-1}{n}$, and $n \to \infty$ along the enumeration, so $P(X_k = X) \xrightarrow[k\to\infty]{} 1$.
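For readers who want to see this numerically, here is a minimal Monte Carlo sketch of both counterexamples (not part of the original answer; the sample size, the cut-off $n \le 60$ in the enumeration, and the probe point $\omega = 0.37$ are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
omega = rng.uniform(0.0, 1.0, size=100_000)  # draws from ([0,1], Lebesgue)

# --- Example 2: X_n(omega) = omega**n, X = 0 --------------------------------
# omega**n -> 0 for every omega in [0, 1), so X_n -> X almost surely,
# yet {X_n = X} = {0} has probability 0 for every fixed n, so (2) fails.
n = 50
print("P(|X_n - X| < 0.01) ≈", (omega**n < 0.01).mean())  # -> 1 as n grows
print("P(X_n = X)          ≈", (omega**n == 0.0).mean())  # 0

# --- Example 3: sliding bumps Y_{m,n} = 1_{[(m-1)/n, m/n]}, X = 0 ------------
def bump(m, n, w):
    """Indicator of the interval [(m-1)/n, m/n], evaluated at the points w."""
    return ((w >= (m - 1) / n) & (w <= m / n)).astype(float)

# Enumerate (Y_{1,1}, Y_{1,2}, Y_{2,2}, Y_{1,3}, ...) as X_1, X_2, X_3, ...
pairs = [(m, n) for n in range(1, 61) for m in range(1, n + 1)]

# (2) holds: P(X_k = 0) = (n-1)/n -> 1 along the enumeration.
for k in (0, 9, 99, len(pairs) - 1):
    m, n = pairs[k]
    p0 = (bump(m, n, omega) == 0.0).mean()
    print(f"k={k + 1:4d} (m={m:2d}, n={n:2d}):  P(X_k = 0) ≈ {p0:.3f}")

# (1) fails: at a fixed omega the trajectory k -> X_k(omega) keeps taking
# both values 0 and 1, so it cannot converge.
w0 = np.array([0.37])
traj = [float(bump(m, n, w0)[0]) for m, n in pairs]
print("values taken among the last 500 terms:", sorted(set(traj[-500:])))
```

The first two prints give a probability near $1$ for $\{|X_n| < 0.01\}$ but $0$ for $\{X_n = 0\}$; the loop shows $P(X_k = 0)$ climbing towards $1$, while the last line reports that the trajectory at the fixed point still takes both values $0$ and $1$.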

Remark: $(2) \implies (1)$ is "partially true", in the sense that $(2)$ implies convergence in probability, which in turn implies that some subsequence converges to $X$ almost surely.
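For the sliding-bump sequence above, such an almost surely convergent subsequence can be written down explicitly (a standard choice, not spelled out in the original answer): take the first bump of every level,

$$ X_{k_n} := Y_{1,n} = 1_{\left[0, \frac1n\right]} \xrightarrow[n\to\infty]{} 0 = X(\omega) \quad \text{for every } \omega \in (0,1], $$

so this subsequence converges to $X$ almost surely (the single exceptional point $\omega = 0$ is a null set), even though the full sequence converges at no point.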