
Given i.i.d. random variables $X_1, X_2, \ldots$ with $E|X_1| < \infty$ and $\sqrt{n}(\overline X_n - \mu) \rightarrow Z$ in law for some random variable $Z$ and some real number $\mu$ (not necessarily equal to $EX_1$), I need to prove that $\overline X_n \rightarrow \mu$ a.s.

It is very similar to the central limit theorem, but the problem is that I don't know whether the variance is finite, so I cannot use it. Any hint on how I can approach this problem?

Suriya
  • This is the law of large numbers. You are overthinking it. – Paresseux Nguyen Dec 29 '20 at 00:42
  • @ParesseuxNguyen for the SLLN you need finite variance, no? – jlammy Dec 29 '20 at 00:50
  • @jlammy You just need $EX_1$ to exist (it could be infinite). – Suriya Dec 29 '20 at 00:52
  • What is the property of $Z$? If this is CLT, then it requires that $X_i$ have finite variance, unless there is another flavor of CLT that I'm not aware of. – Daeyoung Dec 29 '20 at 00:54
  • @DaeyoungLim The exercise doesn't give any property of $Z$... maybe the apparent CLT limit is just there to trick us? – Suriya Dec 29 '20 at 01:06
  • I'm not sure if this can be true without any conditions. Generally, convergence in distribution does not imply almost-sure convergence. – Daeyoung Dec 29 '20 at 01:18
  • Seems like deducing WLLN from a theorem similar to the CLT requires the CDF of $Z$ to be continuous everywhere. – Daeyoung Dec 29 '20 at 01:22
  • It's a strange question. That $\overline X_n \rightarrow \mu$ a.s. follows from the i.i.d. and $L^1$ assumptions (a special case of the ergodic SLLN). The convergence of $\sqrt{n}(\overline X_n - \mu)$ in law is extraneous/irrelevant (it does imply the weaker condition that $\overline X_n \rightarrow \mu$ in probability). – Michael Dec 29 '20 at 01:47
  • @Michael, no one said that $\mu = \mathbb{E}(X_{1})$. To put it a different way, the question seems to be: given that $\sqrt{n}(\overline{X}_{n} - \mu) \to Z$, does $\mu = \mathbb{E}(X_{1})$ follow? – Dec 29 '20 at 01:54
  • @PeterMorfe Deterministic a.s. limits are unique. Under the i.i.d. and $L^1$ assumptions, $\mu$ in the claim can only be the mean. The convergence in law condition is not relevant. – Michael Dec 29 '20 at 02:01
  • @Michael, we seem to be reading the question very differently. I posted an answer with a (hopefully) precise statement, so let me know if I'm going wrong somewhere. – Dec 29 '20 at 02:30
  • Elaborating on what @Michael said, we already know from SLLN that $\overline{X}_n \to \mathbb{E}X_1$ a.s. and that the limit is (almost surely) unique. That is, whenever $X_n \to X$ and $X_n \to Y$, then $X=Y$ a.s. so in this case, without the convergence in law, it can only be $\mu = \mathbb{E}X_1$. – Daeyoung Dec 29 '20 at 19:11
  • @DaeyoungLim But you don’t know if $\overline X_n$ converges to $\mu$ – Suriya Dec 29 '20 at 19:14
  • Oh I see what you mean. In that case, @Botnakov deduces convergence in probability. The almost sure uniqueness of limits also applies for when $X_n \to X$ in probability and $X_n \to Y$ in probability. Check this answer out. – Daeyoung Dec 29 '20 at 19:18
  • @DaeyoungLim I should have used another letter instead of $\mu$, it is a bit confusing I admit. Thanks for the link! – Suriya Dec 29 '20 at 19:20

2 Answers


Since $\sqrt{n}(\overline X_n - \mu) \rightarrow Z$ in distribution and $\frac{1}{\sqrt{n}} \to 0$, Slutsky's theorem gives $\frac{1}{\sqrt{n}} \cdot \sqrt{n}(\overline X_n - \mu) \rightarrow Z \cdot 0 = 0$, i.e. $\overline X_n - \mu \to 0$ in distribution. Since convergence in distribution to a constant implies convergence in probability, $\overline X_n \to \mu$ in probability.
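
For completeness, here is a sketch of the standard check that convergence in distribution to a constant upgrades to convergence in probability. Write $Y_n = \overline X_n - \mu$; the constant limit $0$ has CDF $F = \mathbf{1}_{[0,\infty)}$, which is continuous at every point except $0$, so for any $\varepsilon > 0$ both $\pm\varepsilon$ are continuity points and, with $F_n$ the CDF of $Y_n$, \begin{equation*} \mathbb{P}(|Y_n| > \varepsilon) = \mathbb{P}(Y_n > \varepsilon) + \mathbb{P}(Y_n < -\varepsilon) \leq \bigl(1 - F_n(\varepsilon)\bigr) + F_n(-\varepsilon) \to \bigl(1 - F(\varepsilon)\bigr) + F(-\varepsilon) = 0. \end{equation*}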

By the strong law of large numbers, $\overline X_n \to EX_1$ a.s., and therefore also $\overline X_n \to EX_1$ in probability.

Since $\overline X_n \to \mu$ in probability and $\overline X_n \to EX_1$ in probability, and limits in probability are unique up to null sets, we get that $\mu = EX_1$.
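
Spelled out, that uniqueness step is a triangle-inequality check: for every $\varepsilon > 0$, \begin{equation*} \mathbb{P}(|\mu - EX_1| > \varepsilon) \leq \mathbb{P}(|\overline X_n - \mu| > \varepsilon/2) + \mathbb{P}(|\overline X_n - EX_1| > \varepsilon/2) \to 0, \end{equation*} and since $|\mu - EX_1|$ is deterministic, the left-hand side is either $0$ or $1$; it must therefore be $0$ for every $\varepsilon > 0$, forcing $\mu = EX_1$.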

As $\overline X_n \to EX_1$ a.s. and $\mu = EX_1$, we get that $\overline X_n \to \mu$ a.s.
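
As a quick numerical sanity check (not part of the proof), a short simulation makes the conclusion visible even in the infinite-variance regime where the classical CLT is unavailable. The Pareto tail index $\alpha = 1.5$, the seed, and the sample sizes below are illustrative choices, not given by the problem.

```python
import numpy as np

# Sketch: a Pareto law with tail index alpha in (1, 2) has a finite mean but
# infinite variance, so the classical CLT does not apply, yet the SLLN still
# forces the sample mean toward E X_1. All parameters here are illustrative.
rng = np.random.default_rng(0)
alpha = 1.5
true_mean = alpha / (alpha - 1)  # mean of a Pareto with x_m = 1 is alpha/(alpha-1) = 3

for n in (10**3, 10**5, 10**7):
    # numpy's pareto draws from the Lomax distribution; adding 1 shifts it to
    # the classical Pareto with minimum value x_m = 1.
    sample = rng.pareto(alpha, size=n) + 1.0
    print(f"n = {n:>8}: sample mean = {sample.mean():.4f} (E X_1 = {true_mean})")
```

The printed sample means should settle near $3$ as $n$ grows, though slowly, because of the heavy tail.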

Botnakov N.
  • It's worth observing that if $Z_{n} \to Z$ in distribution, then the convergence in distribution $n^{-\frac{1}{2}} Z_{n} \to 0$ can be checked directly from the definition; a sketch of this check follows below. – Dec 29 '20 at 23:42
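
A sketch of that check, using tightness rather than the CDF definition verbatim: weak convergence of $(Z_{n})$ implies the sequence is tight, so given $\delta > 0$ there is an $M > 0$ with $\sup_{n} \mathbb{P}(|Z_{n}| > M) < \delta$. Then, for any $\varepsilon > 0$ and all $n \geq (M/\varepsilon)^{2}$, \begin{equation*} \mathbb{P}\bigl(|n^{-\frac{1}{2}} Z_{n}| > \varepsilon\bigr) = \mathbb{P}(|Z_{n}| > \varepsilon \sqrt{n}) \leq \mathbb{P}(|Z_{n}| > M) < \delta, \end{equation*} so $n^{-\frac{1}{2}} Z_{n} \to 0$ in probability, and hence in distribution.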

I'm working under the impression that $\overline{X}_{n} = n^{-1} \sum_{j = 1}^{n} X_{j}$.

Claim: Assume $\{X_{n}\}_{n \in \mathbb{N}}$ are i.i.d. random variables, $\mathbb{E}(|X_{1}|) < \infty$, and $Z$ is a random variable with $\mathbb{P}\{|Z| < \infty\} = 1$. Let $\mu \in \mathbb{R}$. If $\sqrt{n}(\overline{X}_{n} - \mu) \to Z$ in law, then $\mu = \mathbb{E}(X_{1})$.

As a consequence of the claim, the strong law of large numbers implies $\overline{X}_{n} \to \mu$ almost surely.

Proof of the claim: By the strong law of large numbers and Egoroff's Theorem, there are a function $\omega : [0,\infty) \to [0,\infty)$ and an event $\mathcal{C}$ such that $\mathbb{P}(\mathcal{C}) \geq 1/2$, $\lim_{\delta \to 0^{+}} \omega(\delta) = 0$, and, for every $n \in \mathbb{N}$, \begin{equation*} |\overline{X}_{n} - \mathbb{E}(X_{1})| \leq \omega(n^{-1}) \quad \text{on} \, \, \mathcal{C}. \end{equation*}

At the same time, since $\mathbb{P}\{|Z| < \infty\} = 1$, Prokhorov's Theorem implies that there is an $M > 0$ such that, for every $n \in \mathbb{N}$, \begin{equation*} \mathbb{P}\{|\sqrt{n}(\overline{X}_{n} - \mu)| \geq M \} \leq 1/4. \end{equation*}

It follows that, for each $n \in \mathbb{N}$, $\mathbb{P}(\{|\sqrt{n}(\overline{X}_{n} - \mu)| < M\} \cap \mathcal{C}) > 0$. On that event, we have \begin{equation*} \sqrt{n}(\mathbb{E}(X_{1}) - \mu) \leq \sqrt{n} \omega(n^{-1}) + \sqrt{n}(\overline{X}_{n} - \mu) \leq \sqrt{n} \omega(n^{-1}) + M. \end{equation*}

Since the extreme sides of the inequality are constants, we deduce that, for each $n \in \mathbb{N}$, \begin{equation*} \mathbb{E}(X_{1}) - \mu \leq \omega(n^{-1}) + M n^{-\frac{1}{2}}. \end{equation*} Sending $n \to \infty$, we find $\mathbb{E}(X_{1}) \leq \mu$.

A similar argument shows that $\mathbb{E}(X_{1}) \geq \mu$.
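
Concretely, on the event $\{|\sqrt{n}(\overline{X}_{n} - \mu)| < M\} \cap \mathcal{C}$ the reverse bound reads \begin{equation*} \sqrt{n}(\mu - \mathbb{E}(X_{1})) \leq \sqrt{n}\,\omega(n^{-1}) + \sqrt{n}(\mu - \overline{X}_{n}) \leq \sqrt{n}\,\omega(n^{-1}) + M, \end{equation*} and the same limiting argument yields $\mu \leq \mathbb{E}(X_{1})$.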

  • Thanks for the response! The theorems you used are more advanced than what we studied in the course, so I'm wondering if the teacher was mistaken in giving us this exercise. I'll accept the answer and try to see if I can adapt your argument. – Suriya Dec 29 '20 at 10:12
  • @SaudiBombsYemen, try checking the case when $\overline{X}_{n} = \mathbb{E}(X_{1})$ directly. (This would happen if $X_{n} = \mathbb{E}(X_{1})$ for each $n$.) Most of the argument I gave is "legalese" necessary to treat the random case. I know at least one probabilist who probably would not have written the details and said "That's obvious" as an answer... – Dec 29 '20 at 15:56
  • Thanks! I think I will accept the other answer I got because it uses simpler arguments. Anyway, I'll keep your answer in mind since I think it gives a deeper understanding of what is going on. – Suriya Dec 29 '20 at 19:10