
What are the conditions on the sequence $\{X_n\}$ (apart from the degenerate case), under which it can be claimed that $\|X_n-X\|_{L^2(\mathbb{R})}\rightarrow 0$ implies $X_n\rightarrow X$ almost surely?

I know that there always exists a subsequence along which the above implication holds (as in the second answer to this question). But I want to know about the convergence of the whole sequence, under some condition imposed on it.

Thanks in advance.

Janak
  • which kind of conditions are you looking for? For example I think that requiring $X_n$ to converge almost surely to something is enough. But it depends on what you care about – Ant Jul 06 '15 at 08:08
  • 2
    If there is a $d>0$ such that $\sum_{n=1}^{\infty} E[|X_n-X|^d]<\infty$, then $X_n\rightarrow X$ almost surely. In particular, if $E[(X_n-X)^2]$ goes to zero sufficiently fast. – Michael Jul 06 '15 at 08:13
  • @Ant. I am looking for condition: like what Michael has suggested. – Janak Jul 06 '15 at 08:18
  • @Michael. Are you suggesting the condition for any $d$? – Janak Jul 06 '15 at 08:22
  • 2
    Any constant $d>0$, since (for a given $\epsilon>0$) we have $Pr[|X_n-X|>\epsilon] = Pr[|X_n-X|^d>\epsilon^d] \leq \frac{E[|X_n-X|^d]}{\epsilon^d}$. – Michael Jul 06 '15 at 08:23
  • 1
    $\mathbb{E}[\sup_{k\ge n}|X_k-X|]\rightarrow 0$ as $n\rightarrow \infty$ implies a.s. convergence. However, this is related to the previous suggestion... –  Jul 06 '15 at 08:23
  • 1
    A related fact is that if a sequence converges in $L^p$ then a subsequence converges almost surely. This essentially follows from what Michael said, because if a sequence of real numbers converges to zero then we can choose a subsequence which goes to zero arbitrarily fast. This fact is used to prove that $L^p$ is complete (given a Cauchy sequence in $L^p$, choose a "rapidly Cauchy" subsequence which has an a.e. limit, and then argue that this a.e. limit is the $L^p$ limit of the original sequence). – Ian Jul 06 '15 at 15:58

1 Answer


Here are some conditions: Suppose $X$ is a random variable and $\{X_n\}_{n=1}^{\infty}$ is a sequence of random variables.

Claim 1: If for all $\epsilon>0$ we have $\sum_{n=1}^{\infty} Pr[|X_n-X|>\epsilon]<\infty$, then $X_n\rightarrow X$ with probability 1.

Claim 2: Suppose there is a constant $d>0$ such that: $$ \sum_{n=1}^{\infty} E[|X_n-X|^d] < \infty $$ Then the conditions of Claim 1 hold, and so $X_n\rightarrow X$ with probability 1.

Example: Suppose $E[(X_n-X)^2] \leq \frac{5}{n^{1.1}}$ for all $n \in \{1, 2, 3, \ldots\}$. Then $X_n\rightarrow X$ with probability 1, since $\sum_{n=1}^{\infty} \frac{5}{n^{1.1}} < \infty$.
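As a quick numerical sanity check of the example (a sketch, not part of the proof): take the hypothetical setup $X = 0$ and $X_n \sim \mathcal{N}(0, 5/n^{1.1})$, so that $E[(X_n-X)^2] = 5/n^{1.1}$. On almost every sample path the tail supremum $\sup_{k\geq M}|X_k - X|$ should shrink as $M$ grows:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200_000
n = np.arange(1, N + 1)
# X = 0 and X_n ~ Normal(0, 5 / n^1.1), so E[(X_n - X)^2] = 5 / n^1.1
x = rng.normal(0.0, np.sqrt(5.0 / n**1.1))

# Tail supremum sup_{k >= M} |X_k - X| for a few cutoffs M
# (a finite-horizon proxy, since we can only simulate finitely many terms)
tail_sup = lambda M: np.max(np.abs(x[M - 1:]))
for M in (1, 100, 10_000, 100_000):
    print(M, tail_sup(M))
```

The printed tail suprema are non-increasing by construction and become small for large $M$, consistent with almost sure convergence on this sample path.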


Proof (Claim 1): Fix $\epsilon>0$. We want to show that $\lim_{M\rightarrow\infty} Pr\left[\cup_{n\geq M} \{|X_n-X|>\epsilon\} \right]=0$. By the union bound, we have for each positive integer $M$:
$$ Pr[ \cup_{n \geq M} \{|X_n-X|>\epsilon\}] \leq \sum_{n=M}^{\infty} Pr[|X_n-X|>\epsilon] $$ It suffices to show the right-hand-side converges to $0$ as $M\rightarrow\infty$. But this is implied by the assumption $\sum_{n=1}^{\infty} Pr[|X_n-X|>\epsilon] < \infty$, since the limit of the tail-sum of a finitely-summable sequence is zero. $\Box$
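The last step above, that the tail sum of a summable sequence goes to zero, can be illustrated numerically (a toy sketch with the hypothetical choice $p_n = 1/n^2$):

```python
# Tail sums of a summable sequence vanish: p_n = 1/n^2 as a toy example.
# The full sum is pi^2/6, and the tail from M is roughly 1/M.
p = [1.0 / n**2 for n in range(1, 100_001)]

tail = lambda M: sum(p[M - 1:])
for M in (1, 10, 1000):
    print(M, tail(M))
```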

Proof (Claim 2): Fix $\epsilon>0$. For each positive integer $n$: $$ Pr[|X_n-X|>\epsilon] = Pr[|X_n-X|^d>\epsilon^d] \leq \frac{E[|X_n-X|^d]}{\epsilon^d} $$ where the last inequality is the Markov inequality. Hence: $$ \sum_{n=1}^{\infty} Pr[|X_n-X|>\epsilon] \leq \frac{1}{\epsilon^d}\sum_{n=1}^{\infty} E[|X_n-X|^d] < \infty $$ Then Claim 1 implies $X_n\rightarrow X$ with probability 1. $\Box$
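The Markov inequality step in the proof of Claim 2 can also be checked by Monte Carlo (a sketch under the hypothetical assumption $X = 0$, $X_n - X \sim \mathcal{N}(0, 1/n)$, with $d = 2$):

```python
import numpy as np

rng = np.random.default_rng(1)

# Check Markov's inequality: Pr[|X_n - X| > eps] <= E[|X_n - X|^d] / eps^d
# under the hypothetical model X = 0, X_n - X ~ Normal(0, 1/n), with d = 2.
eps, d, n = 0.5, 2, 4
samples = rng.normal(0.0, np.sqrt(1.0 / n), size=1_000_000)

lhs = np.mean(np.abs(samples) > eps)            # empirical Pr[|X_n - X| > eps]
rhs = np.mean(np.abs(samples) ** d) / eps ** d  # empirical E[|X_n - X|^d] / eps^d
print(lhs, rhs)
```

Here the bound is loose (as Markov's inequality often is), but the left-hand side never exceeds the right-hand side.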

Michael
  • Note: The proof of claim 1 is similar to the proof of the Borel-Cantelli Lemma, which says that if $\{\mathcal{A}_n\}_{n=1}^{\infty}$ is a sequence of events and $$\sum_{n=1}^{\infty} Pr[\mathcal{A}_n] < \infty$$ then, with prob 1, only a finite number of the events occur. – Michael Jul 06 '15 at 13:09
  • Thank you so much for the answer. – Janak Jul 06 '15 at 13:30