Let $(X_n)$ be a sequence of random variables with $X_n\sim\mathsf{N}(\mu_n,\sigma^2_n)$. If $(X_n)$ converges in law to $X$, then $X\sim\mathsf{N}(\mu,\sigma^2)$, where $\mu=\lim\limits_{n\rightarrow\infty}\mu_n$ and $\sigma^2=\lim\limits_{n\rightarrow\infty}\sigma^2_n$.
There are quite a few posts relating to the above theorem, such as the ones here and here, and while they go into quite some detail, my analysis is rusty and I still have some big gaps in understanding, so I would really appreciate help with the issues below. Thanks in advance.
Let $\varphi$ denote the characteristic function of a random variable. Convergence in law implies that $\varphi_{X_n}(t)=e^{it\mu_n-\frac{1}{2}t^2\sigma_n^2}\xrightarrow[n\to\infty]{}\varphi_X(t)$, and therefore $|\varphi_{X_n}(t)|=e^{-\frac{1}{2}t^2\sigma_n^2}\xrightarrow[n\to\infty]{}|\varphi_X(t)|$. From this we need to infer that $(\sigma_n^2)$ and $(\mu_n)$ are convergent sequences.
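As a sanity check on the CF formula above (the easy direction: if the parameters converge, the CFs converge pointwise), I ran a throwaway numerical sketch; the sequences $\mu_n=1+1/n$ and $\sigma_n^2=2+1/n$ are hypothetical choices, purely for illustration:

```python
import numpy as np

def cf_normal(t, mu, var):
    # Characteristic function of N(mu, var): exp(i*t*mu - t^2*var/2).
    return np.exp(1j * t * mu - 0.5 * t**2 * var)

t = 0.7
for n in [1, 10, 100, 10000]:
    # Hypothetical sequences (illustration only): mu_n -> 1, sigma_n^2 -> 2.
    print(n, cf_normal(t, 1 + 1/n, 2 + 1/n))

print("limit:", cf_normal(t, 1, 2))  # CF of the limiting N(1, 2)
```

The printed values approach the last line, as expected.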
I fully understand why $(X_n)$ is a tight sequence (Lévy’s continuity theorem) and why tightness implies that $(\sigma_n^2)$ and $(\mu_n)$ are bounded sequences.
However, I have seen texts in which, for $(\sigma_n^2)$, tightness is not used at all: it is simply asserted that $(\sigma_n^2)$ must converge because the limit has to be continuous. My first issue is understanding this argument; it is presented as obvious, but I do not see it, so below I outline my understanding.
- Continuity argument: characteristic functions are (uniformly) continuous and $\varphi_X(0)=1$, so $\varphi_X$ is non-vanishing in a region around zero. Hence $(\sigma_n^2)$ has to be bounded: otherwise $\limsup_n \sigma^2_n=\infty$ would imply $|\varphi_{X_n}(t)|\rightarrow 0$ along a subsequence at points $t$ where $\varphi_{X}(t)\neq 0$. This permits us to take logarithms, and for any convergent subsequence $(\sigma_{n_k}^2)$, setting $t=1$ gives $\sigma_{n_k}^2\xrightarrow[k\to\infty]{}-2\ln|\varphi_{X}(1)|$, which then implies $\sigma_{n}^2\xrightarrow[n\to\infty]{}-2\ln|\varphi_{X}(1)|=\sigma^2$. Is this correct? I am not very confident in my reasoning about why it is safe to apply logs, i.e. why $\varphi_X(1)\neq 0$ (my attempted justification follows below).
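For what it is worth, here is my attempt at making the "safe to take logs" step explicit (a sketch, using only the boundedness just argued): if $M=\sup_n\sigma_n^2<\infty$, then for every $t$
$$|\varphi_X(t)|=\lim_{n\to\infty}e^{-\frac{1}{2}t^2\sigma_n^2}\geq e^{-\frac{1}{2}t^2 M}>0,$$
so in particular $|\varphi_X(1)|>0$ and $\ln|\varphi_X(1)|$ is well defined.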
Now, by tightness, $(\mu_n)$ must be bounded. Then for any convergent subsequence $(\mu_{n_k})$ with limit $\mu$, we have $$e^{it\mu_{n_k}}\xrightarrow[k\to\infty]{}e^{\frac{1}{2}t^2\sigma^2}\varphi_X(t)=e^{it\mu}.$$ So if $(\mu_{n_{k'}})$ is another convergent subsequence, with limit $\mu'$, then $e^{it(\mu-\mu')}=1$ for all $t$; taking $t=1$ gives $\mu-\mu'=2\pi p$ for some $p\in\mathbb{Z}$, so how do we infer that $p=0$? I know that in the first linked post a choice of $t$ is made so that we stay in the domain of the principal Log function, but I am still a bit hazy on how this uniquely determines $\mu$. Is it because the identity has to hold for all $t$, so that the carefully chosen $t$ which allows us to apply the principal log forces the value of $\mu$ to be uniquely determined? Any further explanation is greatly needed.
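The way I would now try to close this gap (sketching, in case my reasoning is off): the identity $e^{it(\mu-\mu')}=1$ holds for every $t\in\mathbb{R}$, so $t(\mu-\mu')\in 2\pi\mathbb{Z}$ for all $t$. The map $t\mapsto t(\mu-\mu')$ is continuous and vanishes at $t=0$, while $2\pi\mathbb{Z}$ is discrete, so the map must be identically zero, i.e. $\mu=\mu'$. Equivalently, any $t$ with $0<|t(\mu-\mu')|<2\pi$ would give a contradiction, which forces $p=0$.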
I also really don't know why $\sigma^2\neq 0$. Since $\sigma^2=0$ iff $|\varphi_X(1)|=1$, why is it that $|\varphi_X(1)|\neq 1$?
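As an aside, a small numerical experiment (with the hypothetical choice $X_n\sim\mathsf{N}(0,1/n)$, purely for illustration) suggests that $\sigma^2=0$ can genuinely occur, the limit then being a point mass:

```python
import numpy as np

# Hypothetical example: X_n ~ N(0, 1/n), so |phi_{X_n}(1)| = exp(-1/(2n)) -> 1,
# consistent with sigma^2 = -2*ln|phi_X(1)| = 0, i.e. a point mass at mu = 0.
for n in [1, 10, 100, 10000]:
    print(n, np.exp(-0.5 / n))
```

So perhaps the statement should be read as allowing the degenerate case, but I would appreciate confirmation.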
Any feedback is welcome, thanks.
To verify your first claim: let $f(t)=t-\sin t$. Then $f'(t)=1-\cos t\geq 0$, so since $f(0)=0$ we have $f(t)\geq 0$ for $t\geq 0$, and hence for any $t>0$, $1-\frac{\sin t}{t}\geq 0$; this also holds for $t<0$ because $1-\frac{\sin t}{t}$ is even. – user152874 Oct 05 '19 at 08:43
So for $\epsilon>0$ and large enough $n$, $1-\frac{\sin x_n}{x_n}=\left|1-\frac{\sin x_n}{x_n}\right|<\epsilon$, which for $x_n>0$ implies $1-\epsilon<\frac{\sin x_n}{x_n}\leq\frac{1}{x_n}$, i.e. $0<x_n<\frac{1}{1-\epsilon}$. I am stuck here, as this works only if $x_n>0$ and $\epsilon<1$. I tried playing around with $x_n<0$ (so $-\frac{\sin x_n}{x_n}\leq -\frac{1}{x_n}$) and $-\epsilon<1-\frac{\sin x_n}{x_n}$. – user152874 Oct 05 '19 at 08:44
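A quick numerical look at the two claims in these comments (a throwaway check on a sample grid):

```python
import numpy as np

# Grid avoiding x = 0, where 1 - sin(x)/x is a 0/0 limit (equal to 0).
x = np.linspace(-20, 20, 4001)
x = x[np.abs(x) > 1e-9]

g = 1 - np.sin(x) / x
print(g.min() >= 0)                # True: 1 - sin(x)/x >= 0 on the grid

eps = 0.1
print(np.abs(x[g < eps]).max())    # points with g < eps all have small |x|
```

This is consistent with $1-\frac{\sin x}{x}$ being small only when $|x|$ is small, i.e. the boundedness of $(x_n)$ that the comment is after.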