
When the random variables $\{X_n,n\ge1\}$ are uniformly bounded, why is $$ \frac{1}{n^2}\operatorname{Var}\left(\sum_{k=1}^{n}X_k\right)\rightarrow0 $$ a necessary and sufficient condition for the weak law of large numbers to hold? Chebyshev's law of large numbers shows easily that this condition is sufficient. Why is it also necessary?
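For a concrete (purely illustrative) example, take $X_k$ i.i.d. Bernoulli$(1/2)$, which are bounded by $1$; a quick simulation sketch then shows the quantity above decaying like $\operatorname{Var}(X_1)/n$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative bounded example (my own choice, not part of the theorem):
# X_k i.i.d. Bernoulli(1/2), so |X_k| <= 1.
def scaled_variance(n, reps=5000):
    X = rng.integers(0, 2, size=(reps, n))   # reps independent samples of (X_1, ..., X_n)
    S = X.sum(axis=1)                        # partial sums S_n
    return S.var() / n**2                    # empirical (1/n^2) * Var(S_n)

for n in [10, 100, 1000]:
    print(n, scaled_variance(n))
# The values shrink roughly like Var(X_1)/n = 1/(4n), so the condition holds here.
```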

Davide Giraudo
  • 172,925
mathhahaha
  • 352
  • 1
  • 7
  • When $\{X_n\}$ are i.i.d. $$ n^{-1}\operatorname{Var}\left(\sum_{i=1}^n X_i\right)=\operatorname{Var}(X_1). $$ The average "converges" to $0$ if $X_1$ is a constant. –  Nov 22 '21 at 16:18
  • It is $\text{Var}\left(\frac{1}{n}\sum\limits_{k=1}^{n}X_k\right)\to0$ not $\frac{1}{n}\text{Var}\left(\sum\limits_{k=1}^{n}X_k\right)$ – Henry Nov 22 '21 at 16:33
  • If $|X|\le c$ then $\text{Var}(X) \le c^2$ – Henry Nov 22 '21 at 16:38
  • @Henry thanks! If $\{X_n\}$ are not i.i.d., why does $\operatorname{Var}\left(\frac{1}{n}\sum_{k=1}^{n}X_k\right)\rightarrow 0$ still hold true? – mathhahaha Nov 23 '21 at 01:44
  • It need not (you deleted "independent and identically distributed" after my comment). Suppose all the $X_k$ are equal to each other: then $\text{Var}\left(\frac{1}{n}\sum\limits_{k=1}^{n}X_k\right) = \text{Var}(X_i)$ while $\frac{1}{n}\text{Var}\left(\sum\limits_{k=1}^{n}X_k\right)=n\text{Var}(X_i)$ – Henry Nov 23 '21 at 01:53
  • @Henry : If your comments are intended to answer the question, you should post them as answers. In general, question-and-answer pages should be "self-contained" in that they should not require reading comments for essential information; questions should not be answered in comments. – Peter O. Nov 23 '21 at 06:04
  • @PeterO. My comments have not answered this question. In any case there are a group of users who react negatively to any answers to short questions. Rather than provoke them, I often prefer to use the comments. – Henry Nov 23 '21 at 09:03

1 Answer


Suppose that $(X_n)$ are uniformly bounded and let $Y_n:=\sum_{i=1}^n (X_i-\mathbb E[X_i])/n$. Then $(Y_n)_{n\geqslant 1}$ is also uniformly bounded and so is $(Y_n^2)_{n\geqslant 1}$. In particular, $(Y_n^2)$ is uniformly integrable.

The weak law of large numbers is, by definition, the convergence in probability of $Y_n$ to $0$, which is equivalent to the convergence in probability of $Y_n^2$ to $0$.

Now we use the following fact: $(Y_n^2)$ converges to $0$ in $\mathbb L^1$ if and only if $(Y_n^2)$ converges to $0$ in probability and $(Y_n^2)$ is uniformly integrable; see here. Since $Y_n$ is centered, $\mathbb E\left[Y_n^2\right]=\operatorname{Var}(Y_n)=\frac{1}{n^2}\operatorname{Var}\left(\sum_{k=1}^n X_k\right)$, so this quantity tends to $0$ whenever the weak law of large numbers holds.
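To see why the uniform integrability assumption cannot be dropped from this fact, here is a minimal sketch (the specific family $Z_n$ below is my own illustration, not part of the argument): $Z_n$ converges to $0$ in probability but not in $\mathbb L^1$, precisely because $(Z_n)$ is not uniformly integrable.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustration only: Z_n = n with probability 1/n, else 0.
# Then Z_n -> 0 in probability, but E[Z_n] = 1 for every n,
# so Z_n does not converge to 0 in L^1; (Z_n) is not uniformly integrable.
def estimate(n, reps=100000):
    Z = np.where(rng.random(reps) < 1.0 / n, float(n), 0.0)
    return (Z > 0.01).mean(), Z.mean()   # ~P(Z_n > 0.01) and ~E[Z_n]

for n in [10, 100, 1000]:
    p, m = estimate(n)
    print(n, p, m)
# P(Z_n > 0.01) -> 0 while E[Z_n] stays near 1: convergence in probability alone
# does not give L^1 convergence. For the bounded (hence uniformly integrable)
# Y_n^2 of the answer, this obstruction cannot occur.
```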

Another way to see that $\operatorname{Var}(Y_n)\to 0$: if $\lvert X_n\rvert\leqslant C$ almost surely, then, with $Y_n:=\sum_{i=1}^n (X_i-\mathbb E[X_i])/n$ as above, we have $\lvert Y_n\rvert\leqslant 2C$. If $Y_n\to 0$ in probability, then for every $\delta>0$, $$ \operatorname{Var}(Y_n)\leqslant \mathbb E\left[Y_n^2\right]=\mathbb E\left[Y_n^2\mathbf{1}_{\{\lvert Y_n\rvert>\delta\}}\right]+\mathbb E\left[Y_n^2\mathbf{1}_{\{\lvert Y_n\rvert\leqslant\delta\}}\right] \leqslant 4C^2\,\mathbb P\left(\lvert Y_n\rvert>\delta\right)+\delta^2. $$ Letting $n\to\infty$ and then $\delta\downarrow 0$ shows that $\operatorname{Var}(Y_n)\to 0$, which is exactly $\frac{1}{n^2}\operatorname{Var}\left(\sum_{k=1}^n X_k\right)\to 0$.
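As a quick numerical check of this bound (a sketch under the same illustrative Bernoulli$(1/2)$ assumption, for which $\lvert Y_n\rvert\leqslant 1/2$ and any almost-sure bound on $\lvert Y_n\rvert$ may be used, with $\delta=0.05$ chosen arbitrarily), the empirical $\mathbb E[Y_n^2]$ indeed sits below the right-hand side, and both sides become small:

```python
import numpy as np

rng = np.random.default_rng(3)

B, delta = 0.5, 0.05   # illustrative constants: |Y_n| <= 1/2 for Bernoulli(1/2); delta arbitrary

def check_bound(n, reps=10000):
    # Same illustrative setup as before: X_i i.i.d. Bernoulli(1/2).
    X = rng.random(size=(reps, n)) < 0.5
    Y = (X - 0.5).mean(axis=1)                         # Y_n = (1/n) * sum_i (X_i - E[X_i])
    lhs = np.mean(Y**2)                                # empirical E[Y_n^2] (= Var(Y_n), since E[Y_n] = 0)
    rhs = B**2 * np.mean(np.abs(Y) > delta) + delta**2 # bound from the displayed inequality
    return lhs, rhs

for n in [10, 100, 1000]:
    lhs, rhs = check_bound(n)
    print(n, round(lhs, 5), round(rhs, 5))
# The left-hand side stays below the right-hand side, and both shrink:
# P(|Y_n| > delta) -> 0 by the weak law, and delta can then be taken small.
```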

Davide Giraudo
  • 172,925