
Assume the random variables $X_1, X_2, \dots$ are IID with finite mean and finite variance. Define the random variable:

\begin{align} Y_n = \frac{X_n}{n} \end{align} Show that $Y_n \to 0$ almost surely.

Convergence to $0$ almost surely means: \begin{align} \mathbb{P}\left( \lim_{n \to \infty}{Y_n} = 0\right) = 1 \end{align} By the definition of $Y_n$, this is equivalent to: \begin{align} \mathbb{P}\left( \lim_{n\to \infty}{\frac{X_n}{n}} = 0\right) =1 \end{align}

But the limit of $\frac{1}{n}$ as $n \to \infty$ is "obviously" $0$, and intuitively I would think that if the numerator has finite expectation, then the numerator should look like some finite number. As $n$ runs off to $\infty$, this would be a finite number (it doesn't really matter which, so long as it is finite) divided by a quantity growing toward $\infty$. So I would expect $\lim_{n\to \infty} \frac{X_n}{n}$ to behave like $\lim_{n \to \infty} \frac{1}{n}$.

EDIT: as pointed out in the comments, the sequence $(X_n)$ need not be bounded, so this intuition does not apply to every possible choice of the $X_n$.

Thank you.

  • I don't think that limit is zero so obviously. – Giuseppe Negro Nov 15 '15 at 20:55
  • @GiuseppeNegro why not? Maybe that will help me realize my mistake. – David South Nov 15 '15 at 20:56
  • Take for example the RV $X(\omega)=\omega^{-\frac14}$ on the probability space $[0, 1]$ with Lebesgue measure. The sequence $X/n$ converges to zero on $(0, 1]$ but does not for $\omega=0$, which of course is a set of null probability. Your argument, if true, would imply convergence for all $\omega$. – Giuseppe Negro Nov 15 '15 at 21:05
  • @GiuseppeNegro ahhh, okay this makes sense. A tad bit out of my experience with probability so far but it does make sense. – David South Nov 15 '15 at 21:10
  • 2
    If you want to use the strong law of large numbers, write ${X_n\over n}={S_n\over n}-{S_{n-1}\over n-1}({n-1\over n}).$ –  Nov 15 '15 at 21:10
  • 2
    "the limit of $\frac{X_n}{n}$ is obviously $0$ because of the $n$ in the denominator and because the $X_n's$ have finite mean and finite variance." 1. This is not obvious. 2. That you cite every hypothesis at your disposal as reasons for this result to hold, suggests that, actually, you are not quite sure of the reasons why it should hold. Do you? – Did Nov 15 '15 at 21:22
  • @Did, well that's exactly my point. I'm not sure. – David South Nov 15 '15 at 21:23
  • 2
    Then why are you writing sentences giving reasons at random? Sorry but I am not following the approach. – Did Nov 15 '15 at 21:24
  • @Did I edited my post to more accurately reflect what I'm thinking. Thanks for the advice. – David South Nov 15 '15 at 21:29
  • 2
    And now that you made your reasoning more explicit, one can see it is wrong: in many settings, actually as soon as the support of the common distribution of the $X_n$ is unbounded, the sequence $(X_n)$ is almost surely unbounded. So, a priori, $X_n/n$ not converging to $0$ is possible. That $X_n/n$ actually converges to $0$ requires a different argument. – Did Nov 15 '15 at 21:35
  • @Did, but doesn't the inclusion of $\mathbb{E}[X_n] < \infty$ imply that the support is bounded? http://math.stackexchange.com/questions/790106/does-finite-expectation-imply-bounded-random-variable – David South Nov 15 '15 at 21:39
  • No, this implies that $\{\omega\in\Omega\mid X_n(\omega)<\infty\}$ has full probability, not that $\{X_n(\omega)\mid\omega\in\Omega\}$ is bounded. Examples: $X_n$ exponential, or Gaussian, or... – Did Nov 15 '15 at 21:42
  • @Did, so my problem was in my understanding of what the "support" of a RV is. It's a term I have never heard before, thanks! – David South Nov 15 '15 at 21:43
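
For completeness, here is a sketch of the strong-law approach suggested in the comment above (this gloss is not part of the original thread): write $S_n = X_1 + \cdots + X_n$ and $\mu = \mathbb{E}[X_1]$. The strong law of large numbers (which uses the finite-mean hypothesis) gives $S_n/n \to \mu$ almost surely, so \begin{align} \frac{X_n}{n} = \frac{S_n}{n} - \frac{S_{n-1}}{n-1}\cdot\frac{n-1}{n} \longrightarrow \mu - \mu \cdot 1 = 0 \quad \text{almost surely}. \end{align}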

2 Answers


$$\mathbb{E}\left(\sum_{n=1}^\infty Y_n^2\right)<\infty\Rightarrow \mathbb{P}\left(\sum_{n=1}^\infty Y_n^2<\infty\right)=1\Rightarrow \mathbb{P}\left(Y_n\to 0\right)=1.$$
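
To see why the leftmost expectation is finite (my addition; the answer leaves this as an exercise): the terms are nonnegative, so the sum and the expectation can be exchanged by Tonelli's theorem, and since the $X_n$ are identically distributed with finite second moment, $$\mathbb{E}\left(\sum_{n=1}^\infty Y_n^2\right)=\sum_{n=1}^\infty \frac{\mathbb{E}(X_n^2)}{n^2}=\mathbb{E}(X_1^2)\sum_{n=1}^\infty \frac{1}{n^2}<\infty.$$ The first implication then holds because a nonnegative random variable that is infinite with positive probability has infinite expectation.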

  • So are you suggesting defining a new RV that is the sample mean of the $Y_n$'s? – David South Nov 15 '15 at 21:11
  • 1
    I am suggesting that you consider the random variable $\sum_{n=1}^\infty Y^2_n.$ –  Nov 15 '15 at 21:13
  • 1
    @DavidSouth The solution that I've started here has nothing to do with the strong law of large numbers. In fact, it does not need the random variables $(X_n)$ to be independent. –  Nov 15 '15 at 21:17
  • Ahhh, is this of a more "Borel–Cantelli" flavor? That lemma is something I have not learned yet, though I have seen it while reading a little further in my text. So I would prefer another method that does not use a lemma I am not supposed to know at this point. – David South Nov 15 '15 at 21:21
  • 1
    @DavidSouth No, it is much easier and more direct than that. –  Nov 15 '15 at 21:23
  • Sorry Byron, but I don't understand the last "implication". Would you mind explaining how the sum being finite with probability $1$ implies that the $Y_n$ converge to $0$ almost surely? – David South Nov 16 '15 at 03:07
  • @DavidSouth http://math.stackexchange.com/questions/107961/if-a-series-converges-then-the-sequence-of-terms-converges-to-0 –  Nov 16 '15 at 03:43
  • Thank you so much. Makes absolute sense now! – David South Nov 16 '15 at 03:45
  • Byron, my question is: why does the first "implication" hold? – Student of Statistics Dec 01 '18 at 14:48

The result is still true even without the assumption of finite variance. Since $E|X|$ is finite, $E|X/\epsilon|$ is finite for every $\epsilon>0$. Using the layer cake formula, we get $$ \infty>E|X/\epsilon|=\int_0^\infty P(|X/\epsilon|>t)\,dt\ge\sum_{n=1}^\infty P(|X/\epsilon|>n)=\sum_{n=1}^\infty P(|X/n|>\epsilon), $$ where the inequality holds because $t\mapsto P(|X/\epsilon|>t)$ is non-increasing, and the last equality because $|X/\epsilon|>n$ exactly when $|X/n|>\epsilon$. Since the series $\sum_n P(|X_n/n|>\epsilon)$ converges (the $X_n$ all have the distribution of $X$), the first Borel–Cantelli lemma implies $P(|X_n/n|>\epsilon\text{ infinitely often})=0$. This holds for all $\epsilon>0$, which implies that $X_n/n\to 0$ almost surely.
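
To spell out the final step (my addition, a standard argument): if $X_n/n \not\to 0$, then $|X_n/n| > 1/k$ infinitely often for some integer $k \ge 1$, so by countable subadditivity $$P\left(\frac{X_n}{n}\not\to 0\right)\le \sum_{k=1}^\infty P\left(\left|\frac{X_n}{n}\right| > \frac{1}{k}\text{ infinitely often}\right) = 0.$$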

Mike Earnest