
Let $X_n$ denote a sequence of i.i.d. random variables and $Y_n=X_n/n$.

$Y_n$ converges pointwise to $Y$ if

$$\lim_{n\to\infty} Y_n(\omega) = Y(\omega) \quad \forall\, \omega \in \Omega,$$

where $\Omega$ is the sample space of the underlying probability space over which the random variables are defined.

It seems trivial that $\lim_{n\to\infty}X_n(\omega)/n=0$ for all $\omega \in \Omega$ and any i.i.d. $X_n$. Then all other kinds of convergence would follow as well. What am I missing?

I'm confused by the comments from experienced users on this post: Proving almost sure convergence

Its_me
  • It is not trivial that $\lim_{n\to\infty} X_n(\omega)/n = 0$. Can you tell us more about why you think it is trivial? What would be the $\epsilon$-$\delta$ proof? If you make your reasoning clearer, we can better explain where it went wrong. – Mike Earnest Mar 15 '21 at 16:17
  • It seems true that for all $\omega$ and $\varepsilon>0$ there is a fixed $N$ so that $|X_n(\omega)/n - X_m(\omega)/m|<\varepsilon$ for all $n,m\ge N$. As $X_n(\omega)=X_m(\omega)$ and these are finite real numbers, I can choose, for any $\omega$, the constant $N$ large enough so that the condition for convergence holds. So one of my assumptions must be wrong, but which one? – Its_me Mar 15 '21 at 16:29
  • Why is $X_n(\omega) = X_m(\omega)$? This seems to violate your assumption that $X_n$ and $X_m$ are independent. – user6247850 Mar 15 '21 at 16:31
  • I see, so the definition of pointwise convergence I state above applies only to sequences that are functions of the same $\omega$, and no additional randomness between the $X_n$ is allowed. I think my misunderstanding was that although the $X_n$ are independent, the equation would hold for the same outcome $\omega$. – Its_me Mar 15 '21 at 16:36
  • Your definition of pointwise convergence is correct, we do need $\lim_{n \to \infty} X_n(\omega)/n = 0$ for all $\omega$; it's just that in general $X_n(\omega) \ne X_m(\omega)$. In fact, this is what allows there to be additional randomness between the $X_n$ (see the sketch after this comment thread). – user6247850 Mar 15 '21 at 16:45
  • Not sure but $X_n = n$ ? – BCLC Jan 26 '23 at 21:55
  • @BCLC Sorry, "... i.i.d. distributions of $X_n$" should be "... i.i.d. $X_n$", which rules out any dependence on $n$. – Its_me Jan 27 '23 at 19:23
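
To illustrate the point made in the comments above, here is a minimal simulation sketch (hypothetical Python, assuming NumPy; the seed stands in for a single outcome $\omega$): fixing $\omega$ fixes the whole sequence, yet the values $X_1(\omega), X_2(\omega), \dots$ are still different numbers in general.

```python
import numpy as np

# One outcome omega determines the entire i.i.d. sequence X_1, X_2, ...
# Model omega as a fixed seed: the resulting sequence is deterministic,
# but its terms are still (in general) different numbers.
rng = np.random.default_rng(seed=42)   # the seed plays the role of omega
x = rng.standard_normal(5)             # X_1(omega), ..., X_5(omega)

print(x)                    # five generally distinct values for one fixed omega
print(x / np.arange(1, 6))  # the sequence X_n(omega)/n for n = 1..5
```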

1 Answer


Here is an example of i.i.d. variables for which $X_n(\omega)/n$ does not converge to zero.

Suppose that $X_n$ are i.i.d. copies of the St. Petersburg random variable. That is, you repeatedly flip a fair coin until the result is heads, and let $X=2^{\text{# flips}}$.
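
For concreteness, here is a minimal sampling sketch (an illustration, not part of the original argument; it assumes NumPy and uses the fact that the number of flips is geometric on $\{1,2,\dots\}$ with parameter $1/2$):

```python
import numpy as np

def st_petersburg(size, rng):
    """Sample St. Petersburg variables: flip a fair coin until heads,
    return X = 2**(number of flips). The flip count is Geometric(1/2)."""
    flips = rng.geometric(0.5, size=size)   # flips until the first heads
    return 2.0 ** flips

rng = np.random.default_rng(0)
print(st_petersburg(10, rng))   # mostly 2s and 4s, with occasional huge values
```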

I claim that with probability one, $X_n(\omega)/n$ will not converge to zero. To show this, I will show that $X_n(\omega)/n\ge 1$ occurs infinitely often with probability one.

For any $n$ in the range $2^k\le n<2^{k+1}$, we have $$ P(X_n/n\ge 1)=P(X_n\ge n)\ge P(X_n\ge 2^{k+1})=P(\text{# flips}\ge k+1)=2^{-k}. $$ Therefore, the probability that all of the ratios $X_n/n$ in this range are less than one is at most $$ P\left(\bigcap_{n=2^k}^{2^{k+1}-1}\{X_n/n<1\}\right)\le (1-2^{-k})^{2^k}=\left(1-\frac{1}{2^k}\right)^{2^k}\le e^{-1}, $$ so the probability that at least one $X_n/n\ge 1$ occurs is at least $1-e^{-1}\approx 63\%$. Therefore, in each of the intervals $[2^k,2^{k+1})$, there is at least a $63\%$ chance that some $X_n/n\ge 1$. Since these blocks involve disjoint sets of the (independent) $X_n$, the corresponding events are independent, so by the second Borel–Cantelli lemma $X_n/n\ge 1$ occurs infinitely often with probability one.
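
As a rough numerical check of this block bound (a minimal simulation sketch, assuming NumPy; the block ranges and the $1-e^{-1}$ target come from the computation above), the following estimates, for several $k$, the probability that some $X_n/n\ge 1$ occurs for $2^k\le n<2^{k+1}$:

```python
import numpy as np

rng = np.random.default_rng(1)
trials = 2000

for k in range(3, 9):
    n = np.arange(2**k, 2**(k+1))            # the block [2^k, 2^{k+1})
    hits = 0
    for _ in range(trials):
        flips = rng.geometric(0.5, size=n.size)
        x = 2.0 ** flips                     # fresh St. Petersburg draws
        if np.any(x >= n):                   # some X_n/n >= 1 in this block?
            hits += 1
    print(f"k={k}: empirical exceedance probability {hits/trials:.2f}")
```

The printed frequencies should all sit at or above $0.63$, consistent with the bound.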


It turns out that the same thing happens whenever $X$ has infinite expectation. The intuition is that infinite expectation forces the tail probabilities $P(X\ge n)$ to decay slowly, and this implies $X_n$ will actually be larger than $n$ infinitely often.
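
Here is a sketch of the standard argument behind that last claim, assuming $X\ge 0$ (this step is not spelled out above):
$$
E[X]=\int_0^\infty P(X>t)\,dt\le\sum_{n=0}^{\infty} P(X\ge n),
$$
so $E[X]=\infty$ forces $\sum_{n\ge 1} P(X_n\ge n)=\sum_{n\ge 1}P(X\ge n)=\infty$. Since the $X_n$ are independent, the second Borel–Cantelli lemma gives $P(X_n\ge n\text{ infinitely often})=1$, and on that event $X_n/n\not\to 0$.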

Mike Earnest