
I want to prove the following:

Consider a sequence $(Z_n)_{n \in \mathbb{N}}$ of independent random variables such that $Z_n \sim \mathcal{N}(0,\sigma_n^2)$. Now let $S_n = \sum_{j=1}^n Z_j$ and $\gamma_n^2 := \sum_{j=1}^n\sigma_j^2$, and assume that $\lim_{n \to \infty} \gamma_n^2 < \infty$. Then for any $p > 2$, $\lim_{n \to \infty} n^{-1/p}S_n=0$ a.s.

Now, my first observation was that $S_n \sim \mathcal{N}(0,\gamma_n^2)$, as a sum of independent Gaussian random variables, and thus $S_n' := n^{-1/p}S_n \sim \mathcal{N}(0,\frac{\gamma_n^2}{n^{2/p}})$. Hence, for any $\varepsilon > 0$, Markov's inequality gives for $q \in \mathbb{N}$ with $q \geq p$

$\mathbb{P}(\vert S_n' \vert > \varepsilon) \leq \frac{\mathbb{E}(\vert S_n'\vert^q)}{\varepsilon^q} \leq \frac{\mu_q}{\varepsilon^q} \frac{(\gamma_n^{2})^q}{n^{2q/p}} \leq \frac{\mu_q}{\varepsilon^q} \frac{(\gamma_n^{2})^q}{n^{2}} $

where $\mu_q$ is the $q$-th absolute moment of the standard Gaussian. Thus

$\sum_{n=1}^\infty \mathbb{P}(\vert S_n' \vert > \varepsilon) \leq \frac{\mu_q}{\varepsilon^q} \sum_{n=1}^\infty\frac{(\gamma_n^{2})^q}{n^{2}} < \infty$

since $\gamma_n^2$ is nondecreasing and convergent, hence bounded by some constant $c$, which implies $\lim_{n \to \infty} S_n' = 0$ a.s. by the Borel-Cantelli lemma.

First question: Does this look correct? And second: Is there a more elegant way to show that?

  • A maybe more general way is to use the Kolmogorov series theorem combined with Kronecker's lemma, as used in the proof of the strong law of large numbers. Another way is to use the normal tail bound: for $x > 0$, $$1 - \Phi(x) \leq \frac{1}{x}e^{-x^2/2}.$$ – Mason May 07 '22 at 02:24
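
For reference, the tail-bound route from the comment above can be sketched as follows (a sketch, assuming $\gamma_n > 0$; indices with $\gamma_n = 0$ give $S_n = 0$ a.s. and contribute nothing). With $c := \sup_{k \geqslant 1} \gamma_k < \infty$,

$$\mathbb{P}(\vert S_n' \vert > \varepsilon) = 2\left(1-\Phi\left(\frac{\varepsilon n^{1/p}}{\gamma_n}\right)\right) \leq \frac{2\gamma_n}{\varepsilon n^{1/p}}\exp\left(-\frac{\varepsilon^2 n^{2/p}}{2\gamma_n^2}\right) \leq \frac{2c}{\varepsilon n^{1/p}}\exp\left(-\frac{\varepsilon^2 n^{2/p}}{2c^2}\right),$$

which is summable in $n$ because $2/p > 0$, so the exponential decays faster than any power of $n$; Borel-Cantelli then gives $S_n' \to 0$ a.s. as before.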

1 Answer


Actually, $S'_n$ has the same distribution as $ \gamma_n n^{-1/p}N$, where $N$ has a standard normal distribution, hence $$ \mathbb E\left[\left\vert S'_n\right\rvert^q\right]=\left(\gamma_n n^{-1/p}\right)^q\mu_q\leqslant \left(\sup_{k\geqslant 1}\gamma_k\right)^q \mu_q\, n^{-q/p},$$ so you have to choose $q>p$.
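
Spelling this out (a sketch completing the Borel-Cantelli step of the question with this corrected bound): for $q > p$,

$$\sum_{n=1}^\infty \mathbb{P}(\vert S_n' \vert > \varepsilon) \leq \frac{\mu_q}{\varepsilon^q}\left(\sup_{k\geqslant 1}\gamma_k\right)^q \sum_{n=1}^\infty n^{-q/p} < \infty,$$

since $q/p > 1$, and Borel-Cantelli again yields $S_n' \to 0$ a.s.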

Actually, it is possible to show that $\left(S_n\right)_{n\geqslant 1}$ converges almost surely (for a series of independent random variables, convergence in probability already implies almost sure convergence, so it suffices to show the convergence in probability, which can be done by showing the convergence in $\mathbb L^2$, for instance).
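
A minimal sketch of the $\mathbb L^2$ route: for $m < n$, independence gives

$$\mathbb E\left[\left(S_n - S_m\right)^2\right] = \sum_{j=m+1}^{n}\sigma_j^2 = \gamma_n^2 - \gamma_m^2 \longrightarrow 0 \quad (m,n \to \infty),$$

because $\gamma_n^2$ converges. Hence $(S_n)_{n\geqslant 1}$ is Cauchy in $\mathbb L^2$, so it converges in $\mathbb L^2$ and therefore in probability; for series of independent random variables, convergence in probability already implies almost sure convergence.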

Therefore, $S_n/n^{\beta}\to 0$ almost surely for each positive $\beta$, since $S_n$ converges almost surely to a finite limit while $n^{-\beta}\to 0$.

Davide Giraudo
  • First of all: Thanks. To show almost sure convergence I have the following idea: I think I remember a result of the kind: if $S_n \sim \mathcal{N}(0,\gamma_n^2)$ and $\lim_{n \to \infty} \gamma_n^2 = c$, then already $S_n \to S \sim \mathcal{N}(0,c)$ in distribution. Then $S_n - S$ is again normally distributed, and hence I could use Borel-Cantelli to show that $S_n - S \to 0$ almost surely. Is that the right path? – Student1369321 May 09 '22 at 16:02
  • I think that you need stuff like this result https://math.stackexchange.com/questions/269202/convergence-of-sum-of-independent-random-variables . For what you propose, you need convergence of $\sum \mathbb E[(S_n-S)^2]$, which may not hold. – Davide Giraudo May 09 '22 at 21:50
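
To illustrate the last point with a made-up example (not from the thread): take $\sigma_j^2 = \frac{1}{j(\log j)^2}$ for $j \geqslant 2$, so that $\gamma_n^2$ still converges; then

$$\mathbb E\left[\left(S_n - S\right)^2\right] = \sum_{j > n}\sigma_j^2 \sim \frac{1}{\log n}, \qquad \text{hence} \qquad \sum_{n}\mathbb E\left[\left(S_n - S\right)^2\right] = \infty,$$

and a Chebyshev bound based on second moments alone is not summable for fixed $\varepsilon$.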