Let $(X_n)_{n\ge 1}$ be a sequence of pairwise uncorrelated random variables. If their second moments are uniformly bounded, then $$\frac{S_n-E(S_n)}{n}\overset{a.s.}{\to} 0.$$ I know how to prove convergence in $L^2$, but I have no clue for the almost sure case. I would appreciate a hint.
-
A first idea to try is to apply Borel–Cantelli. It "almost" works, but do you see how to start? – md5 Jun 02 '21 at 20:19
-
I know it is enough to prove that $\sum_{n\ge 1} P[\vert \frac{S_n-E(S_n)}{n}\vert > \varepsilon]<\infty$ for every positive $\varepsilon$, but I did not manage to prove it. – Curious Jun 02 '21 at 21:08
-
I thought about using $P[\vert \frac{S_n-E(S_n)}{n}\vert > \varepsilon]\le P[\vert \frac{S_n}{n}\vert > \frac{\varepsilon}{2}]+P[\vert \frac{E(S_n)}{n}\vert > \frac{\varepsilon}{2}]$, but I don't know if it helps. How can I apply the properties I know about expectations? – Curious Jun 02 '21 at 21:10
-
To simplify, we can try to prove it assuming, say, $\mathbb{E} X_i=0$ and $\mathbb{E} X_i^2\le 1$ (second moment uniformly bounded by one). In that setting you want to estimate $P(|S_n/n|>\varepsilon)$. Can you find some upper bound with your uncorrelation/second moment assumptions? – md5 Jun 02 '21 at 22:42
-
I tried to apply Chebyshev's inequality, but I could only prove that this probability is bounded above by $\frac{1}{n}$ (up to a constant), which is not summable. – Curious Jun 02 '21 at 23:59
-
Right, that's why it does not "exactly" work (but if we had something slightly smaller than $1/n$, the sum would converge). Someone has now given the full answer below, but you can see that the idea is to look instead at a subsequence of $S_n/n$ (say $S_{n^{1.01}}/n^{1.01}$), along which the Chebyshev upper bounds become summable, and then to bound the increments you did not take into account between $n^{1.01}$ and $(n+1)^{1.01}$. – md5 Jun 03 '21 at 00:03
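The gap between the two rates in this hint can be checked numerically: Chebyshev along the full sequence yields bounds proportional to $1/n$, whose partial sums grow like $\ln N$, while along the subsequence $n^2$ the bounds are proportional to $1/n^2$, whose partial sums converge (to $\pi^2/6$ when the constant is $1$). A minimal sketch of the two Borel–Cantelli sums:

```python
# Borel-Cantelli needs a summable bound. Chebyshev along the full sequence
# gives P(|S_n/n| > eps) <= C/n (harmonic terms: not summable); along the
# subsequence n^2 it gives C/n^2 (summable; partial sums -> pi^2/6 for C = 1).

N = 100_000
harmonic = sum(1.0 / n for n in range(1, N + 1))    # ~ ln(N) + 0.577..., diverges
subseq = sum(1.0 / n**2 for n in range(1, N + 1))   # -> pi^2 / 6 = 1.6449...

print(harmonic, subseq)
```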
-
I guess the person has deleted the answer, but I'll try to follow your path – Curious Jun 03 '21 at 00:14
1 Answer
I prove the following theorem:
Theorem
Let $(X_n)_{n\ge 1}$ be a sequence of pairwise uncorrelated random variables which are bounded from below by a constant. If their second moments are uniformly bounded, then $$\frac{S_n-E(S_n)}{n}\overset{a.s.}{\to} 0.$$
Without loss of generality, assume $$\mathbb{E}(X_n^2) \le 1$$ for all $n$ (second moments uniformly bounded by one). By Cauchy–Schwarz, $$\mathbb{E}(|X_n|) \le 1 \quad \forall n.$$
Also without loss of generality, the $X_n$ are nonnegative, i.e. $$X_n \ge 0$$ for all $n$: if $X_n \ge -c$ for a constant $c \ge 0$, replace $X_n$ by $X_n + c$; this shifts $S_n$ and $\mathbb{E}(S_n)$ by the same amount and preserves pairwise uncorrelatedness and the uniform bound on second moments (up to a constant).
For all $\epsilon >0$, we have:
For every $\epsilon >0$, we have
$$\sum_{n \ge 1} \mathbb{P} \left( \left| \frac{S_{n^2}-\mathbb{E}(S_{n^2})}{n^2} \right|> \epsilon \right) \stackrel{\text{Chebyshev}}{\le} \frac{1}{\epsilon^2} \sum_{n \ge 1} \mathbb{E} \left( \left| \frac{S_{n^2}-\mathbb{E}(S_{n^2})}{n^2} \right|^2 \right) \stackrel{\text{pairwise uncorr.}}{\le} \frac{1}{\epsilon^2}\sum_{n\ge 1}\frac{1}{n^2} < \infty,$$
where the second inequality holds because pairwise uncorrelatedness gives $\operatorname{Var}(S_{n^2})=\sum_{k=1}^{n^2}\operatorname{Var}(X_k)\le n^2$. Thus, by Borel–Cantelli,
$$\lim_{n} \frac{S_{n^2}-\mathbb{E}(S_{n^2})}{n^2} = 0 \quad \text{a.s.}$$

Now for any $m \in \mathbb{N}$, let $n$ be the positive integer such that $n^2 \le m <(n+1)^2$ (thus $n= \lfloor \sqrt{m}\rfloor$). Since $m-n^2 \le 2n$, we see that
$$S_m - \mathbb{E}(S_m)= S_{n^2}-\mathbb{E}(S_{n^2})+\sum_{k=n^2+1}^m \left( \underbrace{ X_k}_{ \ge 0} -\underbrace{ \mathbb{E}(X_k)}_{ \le 1}\right) \ge S_{n^2}-\mathbb{E}(S_{n^2})-(m-n^2) \ge \left(S_{n^2}-\mathbb{E}(S_{n^2})\right) -2n.$$
So, using $n^2/m \to 1$ as $m\to\infty$,
$$\liminf_{m\rightarrow \infty} \frac{S_m -\mathbb{E}(S_m)}{m} \ge \liminf_{m\rightarrow +\infty}\frac{n^2}{m}\left( \frac{S_{n^2} -\mathbb{E}(S_{n^2})}{n^2} -\frac{2}{n}\right) = 0 \quad \text{a.s.}$$
Similarly, we can show that
$$S_m - \mathbb{E}(S_m) \le S_{(n+1)^2}-\mathbb{E}(S_{(n+1)^2})+2n+1,$$
and then deduce that
$$\limsup_{m\rightarrow \infty} \frac{S_m -\mathbb{E}(S_m)}{m} \le 0 \quad \text{a.s.}$$
Hence $$\frac{S_n-\mathbb{E}(S_n)}{n} \xrightarrow[]{n \rightarrow \infty} 0 \quad \text{a.s}$$
$\square$
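As a quick sanity check of the theorem (not part of the proof), one can simulate a special case: i.i.d. variables uniform on $[0,1]$, which are in particular pairwise uncorrelated, nonnegative, and have second moments bounded by one. A minimal Monte Carlo sketch, assuming a fixed seed for reproducibility:

```python
import random

# Special case of the theorem: X_k i.i.d. uniform on [0, 1]
# (independent => pairwise uncorrelated; E[X_k] = 1/2, E[X_k^2] = 1/3 <= 1).
# We expect (S_n - E[S_n]) / n = S_n / n - 1/2 to be close to 0 for large n.
random.seed(0)

n = 200_000
s = sum(random.random() for _ in range(n))  # S_n
deviation = s / n - 0.5                     # (S_n - E[S_n]) / n

print(abs(deviation))  # small for large n
```

Of course a single run only illustrates convergence in probability-like behavior; the almost sure statement is what the proof above establishes.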
P.S.: I undeleted my answer because it seems like someone might need it.

-
Just to make sure I understand: $\sum_{n \ge 1} \mathbb{P} \left( \left| \frac{S_{n^2}-\mathbb{E}(S_{n^2})}{n^2} \right|> \epsilon \right)<\infty$ implies $ \mathbb{P} \left( \left| \frac{S_{n^2}-\mathbb{E}(S_{n^2})}{n^2} \right|> \epsilon \text{ for infinitely many } n \right)=0$ by Borel-Cantelli, and using the argument from https://math.stackexchange.com/a/2079377/522332 we get that $\lim_{n} \frac{S_{n^2}-\mathbb{E}(S_{n^2})}{n^2} = 0$ a.s. ? – Alphie Jun 09 '21 at 15:55
-
And for the $\limsup$ part we need to use the fact that $\frac{(n+1)^2}{m}\leq (1+\frac{1}{n})^2 \to 1$ as $m\to \infty$ so that $\frac{(n+1)^2}{m}$ is bounded right? – Alphie Jun 09 '21 at 16:08
-
Yeah, it is Borel-Cantelli and the limit you are talking about. – Paresseux Nguyen Jun 09 '21 at 16:37
-
Ok and we need the boundedness of $\frac{(n+1)^2}{m}$ right? For the $\limsup$ part I mean. – Alphie Jun 09 '21 at 16:44
-
I get $ \limsup_{m\rightarrow \infty} \frac{S_m -\mathbb{E}(S_m)}{m}\leq \limsup_{m\rightarrow \infty} \frac{(n+1)^2}{m} \bigg( \frac{ S_{(n+1)^2}-\mathbb{E}(S_{(n+1)^2})}{(n+1)^2}+\frac{2n+1}{(n+1)^2} \bigg)$ and from there I need to use that $\frac{(n+1)^2}{m}\leq (1+\frac{1}{n})^2 \to 1$ as $m\to\infty$ no? – Alphie Jun 09 '21 at 16:54
-
I didn't check your calculation but it looks generally correct. – Paresseux Nguyen Jun 09 '21 at 16:56
-
I don't know. I just came up with it, but I think it's a well-known result – Paresseux Nguyen Jun 09 '21 at 18:17