
Let $(X_{n})_{n}$ be independent, identically distributed random variables with $X_{n} \sim \mathrm{Exp}(1)$.

Show that $\limsup_{n \to \infty}\frac{X_{n}}{\log(n)}=1$ a.s.

I am given a hint to look at $P(\frac{X_{n}}{\log(n)}\geq 1 \pm\delta)$

My ideas:

Whenever I'm looking at a.s. convergence I see Borel–Cantelli as a helpful tool:

$\sum_{n \in \mathbb N}P(\frac{X_{n}}{\log(n)}\geq 1 \pm\delta)=\sum_{n \in \mathbb N}P(X_{n}\geq \log(n)( 1 \pm\delta))=\sum_{n \in \mathbb N}\exp(-\log(n)( 1 \pm\delta))$
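Here I'm using the tail of the standard exponential distribution,
$$P(X_{n}\geq x)=e^{-x},\qquad x\geq 0.$$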

and then in the case of "$+$"

$\sum_{n \in \mathbb N}\exp(-\log(n)( 1 +\delta))=\sum_{n \in \mathbb N}\exp\!\left((1+\delta)\log(\tfrac{1}{n})\right)=\sum_{n \in \mathbb N}\exp\!\left(\log(\tfrac{1}{n})\right)\exp\!\left(\delta\log(\tfrac{1}{n})\right)=\sum_{n \in \mathbb N}\left(\tfrac{1}{n}\right)^{1+\delta}$

But this is $< \infty$, and secondly I have not even been able to handle the case "$-$".

And then another question: why am I asked for $\limsup$ rather than $\lim$? Surely that implies that I need to show that the largest subsequential limit is $1$.

Any help is greatly appreciated.

SABOY

1 Answer


Let $A_n$ be the event $\{X_n\geqslant \left(1+\delta\right)\log n\}$. Your approach showed that the series $\sum_n \mathbb P(A_n)$ is convergent, hence the probability of $\limsup_n A_n$ is zero. This proves that for almost every $\omega$, there is an integer $N(\omega)$ such that $\omega\notin A_n$ for all $n$ larger than $N(\omega)$, and hence that $\limsup_{n\to +\infty}\frac{X_n}{\log n} \leqslant 1+\delta$ almost surely.
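To spell out the remaining step of this half (a sketch): for each fixed $\delta>0$ the above gives an almost sure event on which
$$\limsup_{n\to +\infty}\frac{X_n}{\log n}\leqslant 1+\delta,$$
and since a countable intersection of almost sure events is almost sure, taking $\delta=\frac1k$ and letting $k\to\infty$ yields $\limsup_{n\to +\infty}\frac{X_n}{\log n}\leqslant 1$ almost surely.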

Until this point, the independence was not used. Now, letting $B_n:= \{X_n\geqslant \left(1-\delta\right)\log n\}$, using analogous computations shows that $\sum_n \mathbb P(B_n)$ is divergent. From the so-called second Borel-Cantelli lemma, we know that $\limsup_n B_n$ has probability one hence for almost all $\omega$, the inequality $X_n\left(\omega\right)\geqslant \left(1-\delta\right)\log n$ holds for infinitely many $n$. I think you can conclude from this.
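A sketch of how one might finish: the analogous computation gives
$$\sum_{n\in\mathbb N}\mathbb P(B_n)=\sum_{n\geqslant 1}\exp\bigl(-(1-\delta)\log n\bigr)=\sum_{n\geqslant 1}\frac{1}{n^{1-\delta}}=+\infty,$$
so the second Borel–Cantelli lemma (this is where independence enters) gives, almost surely, $\frac{X_n}{\log n}\geqslant 1-\delta$ for infinitely many $n$, hence $\limsup_{n\to +\infty}\frac{X_n}{\log n}\geqslant 1-\delta$ almost surely. Taking $\delta=\frac1k$ and intersecting over $k\in\mathbb N$ gives the matching lower bound, and combined with the first part this yields $\limsup_{n\to +\infty}\frac{X_n}{\log n}=1$ almost surely.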

Davide Giraudo