Consider a sequence $X_n$ of Gaussian random variables, $X_n \sim N(\mu_n, \sigma_n^2)$, that converges in distribution to some limiting distribution. Can I then conclude that $\mu_n$ converges to some $\mu$ and $\sigma_n^2$ converges to some $\sigma^2$, with the limiting distribution being $N(\mu, \sigma^2)$? Here I would allow the degenerate Gaussian with zero variance, i.e. a point mass.
I've tried looking this up elsewhere on this site, but all the versions I've found assume almost sure or $L^2$ convergence. I believe the statement should be true and have been trying to prove it via characteristic functions, but without assuming convergence of the two parameters I can't seem to conclude anything; my setup is sketched below.
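For reference, here is the characteristic-function setup I mean, assuming Lévy's continuity theorem and writing $\varphi_X$ for the characteristic function of the limit $X$. Each $X_n \sim N(\mu_n, \sigma_n^2)$ has
$$\varphi_{X_n}(t) = \exp\left(i\mu_n t - \tfrac{1}{2}\sigma_n^2 t^2\right),$$
and convergence in distribution gives $\varphi_{X_n}(t) \to \varphi_X(t)$ for every $t \in \mathbb{R}$. Taking moduli,
$$\lvert\varphi_{X_n}(t)\rvert = e^{-\sigma_n^2 t^2/2} \to \lvert\varphi_X(t)\rvert,$$
which suggests the variances should converge, but I don't see how to recover the convergence of $\mu_n$ from here.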
http://www.math.ku.dk/noter/filer/vidsand12.pdf
Essentially, the proof works by approximating an indicator function with a continuous (here, even uniformly continuous) and bounded mapping; a standard such approximation is sketched after this comment.
– Alexander Sokol Feb 15 '15 at 11:30
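For concreteness, a minimal sketch of the kind of approximation the comment describes (my own notation, not taken from the linked notes): fix $x \in \mathbb{R}$ and $\epsilon > 0$, and set
$$f_\epsilon(y) = \begin{cases} 1, & y \le x, \\ 1 - (y - x)/\epsilon, & x < y \le x + \epsilon, \\ 0, & y > x + \epsilon. \end{cases}$$
Then $f_\epsilon$ is bounded and uniformly continuous, and $\mathbf{1}_{(-\infty, x]} \le f_\epsilon \le \mathbf{1}_{(-\infty, x+\epsilon]}$, so
$$P(X_n \le x) \le E[f_\epsilon(X_n)] \le P(X_n \le x + \epsilon).$$
Convergence in distribution gives $E[f_\epsilon(X_n)] \to E[f_\epsilon(X)]$, and letting $\epsilon \downarrow 0$ at continuity points of the limiting CDF (a similarly shifted $f_\epsilon$ gives the matching lower bound) sandwiches $P(X_n \le x) \to P(X \le x)$.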