3

Let $(X_n)_n$ be a sequence of Gaussian random vectors taking values in $\mathbb{R}^d,$ and let $K_{X_n}$ denote the covariance matrix of $X_n.$ Show that if $(X_n)_n$ converges in distribution to $X,$ then $(K_{X_n})_n$ and $(E[X_n])_n$ converge. Find the distribution of $X.$

So in terms of characteristic functions we have $$\forall x \in \mathbb{R}^d, \quad\varphi_{X_n}(x)=e^{i \ ^tx\operatorname E[X_n]-\frac{1}{2}\,^txK_{X_n}x}\quad\text{and}\quad\lim_n\varphi_{X_n}(x)=\varphi_X(x).$$ How can we use this to prove the convergence of $(K_{X_n})_n$ and $(\operatorname E[X_n])_n$?
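For intuition (not part of the proof), here is a minimal numerical sketch, assuming numpy is available, comparing the empirical characteristic function of Gaussian samples with the closed form above; the mean, covariance, and test point are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.0, -0.5])                   # hypothetical mean E[X_n]
K = np.array([[2.0, 0.3], [0.3, 1.0]])       # hypothetical covariance K_{X_n}
x = np.array([0.4, -0.7])                    # arbitrary test point

# Empirical characteristic function E[exp(i x^T X)] estimated from samples
samples = rng.multivariate_normal(mu, K, size=200_000)
phi_emp = np.mean(np.exp(1j * (samples @ x)))

# Closed form exp(i x^T mu - x^T K x / 2)
phi_closed = np.exp(1j * (x @ mu) - 0.5 * (x @ K @ x))

print(phi_emp, phi_closed)                   # the two values should be close
```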

3 Answers

5

Step One: Since $(X_n)_n$ converges in distribution, it is tight. Hence, there exists $M>0$ such that $P(X_n\in[-M,M]^d)>\frac23$ for all $n$. I claim that $E[X_n] \in [-2M,2M]^d$ for all $n$. Indeed, suppose $E[X_n] \notin [-2M,2M]^d$. Then there exists $i$ such that $e_i^TE[X_n]>2M$ or $e_i^TE[X_n]<-2M$, where $\{e_i\}$ is the standard basis of $\mathbb R^d$. Assume without loss of generality that $\mu_{n,i}:=e_i^TE[X_n]>2M$. Since $e_i^TX_n$ is a $1$-dimensional Gaussian random variable, one has $P(e_i^TX_n>\mu_{n,i})=\frac12$ whenever its variance is positive (if the variance is zero, then $e_i^TX_n=\mu_{n,i}>2M$ almost surely, which contradicts $P(X_n\in[-M,M]^d)>\frac23$ immediately). But then we have $$\frac12=P(e_i^TX_n>\mu_{n,i}) \le P(e_i^TX_n > 2M) \le P(X_n \notin [-M,M]^d) < \frac13,$$ a contradiction. This shows the sequence $(E[X_n])_n$ is bounded.
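A quick numerical illustration of these inequalities, assuming scipy is available; the values of $M$, the mean, and the variance are arbitrary choices, with the mean pushed beyond $2M$:

```python
from scipy.stats import norm

M = 1.0
mu, sigma = 2.5 * M, 1.0        # hypothetical Gaussian whose mean exceeds 2M
X = norm(mu, sigma)

print(X.sf(mu))                 # P(X > mu) = 0.5: a Gaussian exceeds its mean half the time
print(X.sf(2 * M))              # P(X > 2M) >= 0.5 because mu > 2M
print(X.cdf(M) - X.cdf(-M))     # P(X in [-M, M]) < 1/3, so the tightness bound 2/3 fails
```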

Step Two: If $E[X_{n_k}] \to \mu$ for some subsequence $\{n_k\}$, then

$$\varphi_X(x)=\lim_{k\to\infty} \exp\Big(ix^T E[X_{n_k}] - \tfrac12 x^TK_{X_{n_k}}x \Big) = \exp\left(ix^T\mu - \tfrac12\lim_{k\to\infty}x^TK_{X_{n_k}}x\right)$$

by continuity of the exponential function. Taking moduli gives $|\varphi_X(x)|=\exp\big(-\frac12\lim_{k\to\infty}x^TK_{X_{n_k}}x\big)$; since $|\varphi_X|$ is continuous at $0$ with $|\varphi_X(0)|=1$, this limit is finite for $x$ near $0$, hence for every $x$ by homogeneity of quadratic forms, and so (see the polarization identity in the comments below) $K_{X_{n_k}} \to K$ for some symmetric, non-negative definite matrix $K$. Plugging this back into the above equation shows

$$\varphi_X(x) = \exp\Big(ix^T\mu - \frac12 x^TKx\Big),$$

which means $X$ is a Gaussian vector with mean $\mu$ and covariance matrix $K$.

Step Three: We have shown that if $E[X_{n_k}] \to \mu$, then $\mu = E[X]$ and $K_{X_{n_k}} \to K_X$, the covariance matrix of the Gaussian vector $X$. Suppose now for a contradiction that $E[X_n] \not\to E[X]$. Then there exists $\epsilon>0$ and a subsequence $\{n_k\}$ such that $\|E[X_{n_k}] - E[X]\| \ge \epsilon$. But $(E[X_n])_n$ is bounded, so there exists a further subsequence $\{n_{k_j}\}$ such that $E[X_{n_{k_j}}]$ converges, and by Step Two its limit must be $E[X]$, a contradiction. If we instead suppose $K_{X_n} \not\to K_X$, we can repeat these steps with $\|K_{X_{n_k}} - K_X\| \ge \epsilon$ instead; then, since $E[X_{n_{k_j}}]\to E[X]$, Step Two gives $K_{X_{n_{k_j}}} \to K_X$, a contradiction.

To summarize,

  • $X$ is Gaussian,
  • $E[X_n]\to E[X]$, and
  • $K_{X_n} \to K_X$ (the covariance matrix of $X$).
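To see the uniqueness mechanism concretely, here is a minimal numerical sketch (an illustration, not the proof), assuming numpy, that recovers both parameters of a Gaussian directly from its characteristic function: the modulus gives the quadratic form $x^TKx$, polarization gives the entries of $K$, and the phase at the basis vectors gives the mean (valid here because each $|e_i^T\mu|<\pi$). The parameters $\mu$ and $K$ are arbitrary choices:

```python
import numpy as np

mu = np.array([1.0, -2.0])                   # hypothetical mean
K = np.array([[2.0, 0.5], [0.5, 1.0]])       # hypothetical covariance

def phi(x):                                  # Gaussian characteristic function
    return np.exp(1j * (x @ mu) - 0.5 * (x @ K @ x))

def q(x):                                    # x^T K x recovered from |phi|
    return -2.0 * np.log(abs(phi(x)))

e = np.eye(2)
# Entries of K via polarization: e_i^T K e_j = (q(e_i+e_j) - q(e_i-e_j)) / 4
K_rec = np.array([[(q(e[i] + e[j]) - q(e[i] - e[j])) / 4.0 for j in range(2)]
                  for i in range(2)])
# Coordinates of mu from the phase of phi (only determined mod 2*pi in general)
mu_rec = np.array([np.angle(phi(e[i])) for i in range(2)])

print(K_rec)    # matches K
print(mu_rec)   # matches mu
```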
Jason
  • 15,438
  • In Step Two, you mentioned that by continuity of the exponential function the limit of $K_{X_{n_k}}$ exists; did you use the fact that $K_{X_n}$ is symmetric? –  Apr 14 '20 at 21:34
  • Since $K_{X_n}$ is symmetric for each $n$, so is $\lim_{n\to\infty}K_{X_n}$. – Jason Apr 14 '20 at 21:38
  • I mean, why does the limit exist? –  Apr 14 '20 at 21:43
  • 1
    If $x^TA_nx$ converges for every $x$, then $A_n$ converges. To see this, note that the $(i,j)$-th entry of $A$ is $e_i^TAe_j$, and $x^TAy = \frac14((x+y)^TA(x+y) - (x-y)^TA(x-y))$ (a numerical check of this identity appears after these comments). – Jason Apr 14 '20 at 21:46
  • Now I understand, it seems @mathex also used this argument. I appreciate your help. –  Apr 14 '20 at 21:53
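A quick numerical check of the polarization identity from the comment above, assuming numpy; the symmetric matrix and the vectors are random:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
A = (A + A.T) / 2                    # a random symmetric matrix
x = rng.standard_normal(3)
y = rng.standard_normal(3)

lhs = x @ A @ y
rhs = ((x + y) @ A @ (x + y) - (x - y) @ A @ (x - y)) / 4
print(np.isclose(lhs, rhs))          # True: the quadratic forms z^T A z determine x^T A y
```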
2

Interesting question. Of course there are several ways to solve the problem; one of them is the following:

Let $\|\cdot\|$ be a norm on $\mathbb{R}^d$.

First, we will prove that $(K_{X_n})_n$ converges. It suffices to show that each entry $k_n^{i,j}$ of $K_{X_n}$ converges, and since $K_{X_n}$ is symmetric we have $$\forall x,y\in \mathbb{R}^d, \ ^txK_{X_n}y= \frac{1}{4}\left(\,^t(x+y)K_{X_n}(x+y)-\,^t(x-y)K_{X_n}(x-y)\right),$$ so it is sufficient to prove that $(\,^txK_{X_n}x)_n$ converges for every $x \in \mathbb{R}^d$. Since $\varphi_X$ is continuous at $0$ with $\varphi_X(0)=1$, we can find $\eta>0$ such that for all $x \in \mathbb{R}^d$, $\|x\|\leq\eta \implies |\varphi_X(x)|\geq\frac{1}{2}$.

Take $x \in \mathbb{R}^d.$ Since $\left\|\frac{\eta x}{\|x\|+1}\right\|\leq\eta,$ we have $$\lim_n\left|\varphi_{X_n}\!\left(\frac{\eta x}{\|x\|+1}\right)\right|=\lim_ne^{-\frac{\eta^2}{2(\|x\|+1)^2} \,^txK_{X_n}x}=\left|\varphi_X\!\left(\frac{\eta x}{\|x\|+1}\right)\right|,$$ which means that $$\lim_n \,^txK_{X_n}x=-\frac{2(\|x\|+1)^2}{\eta^2}\ln\left|\varphi_X\!\left(\frac{\eta x}{\|x\|+1}\right)\right|$$ (the logarithm is well defined since $|\varphi_X|\geq\frac12$ there). By the polarization identity above, each entry $k^{i,j}_n$ of $K_{X_n}$ converges, so let $K:=\lim_nK_{X_n}.$
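A numerical check of this recovery formula, assuming numpy; $K$, $\eta$, and $x$ are arbitrary choices (with $\eta$ small enough that $|\varphi_X|\geq\frac12$ holds at the scaled point):

```python
import numpy as np

K = np.array([[2.0, 0.5], [0.5, 1.0]])       # hypothetical covariance of X
eta = 0.3                                    # small enough for the |phi_X| >= 1/2 bound

def abs_phi(x):                              # |phi_X(x)| = exp(-x^T K x / 2); the mean drops out
    return np.exp(-0.5 * (x @ K @ x))

x = np.array([3.0, -4.0])
s = np.linalg.norm(x) + 1.0
recovered = -2.0 * s**2 / eta**2 * np.log(abs_phi(eta * x / s))
print(recovered, x @ K @ x)                  # both equal 22.0
```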

Now we will prove that $(E[X_n])_n$ converges. As above we work coordinate by coordinate, but first notice that $$\forall x \in \mathbb{R}^d,\quad e^{i \ ^txE[X_n]}=\varphi_{X_n}(x)\,e^{\frac{1}{2} \ ^txK_{X_n}x},\qquad\text{so}\qquad \forall x \in \mathbb{R}^d,\quad \lim_n e^{i \ ^txE[X_n]}=\varphi_{X}(x)\,e^{\frac{1}{2} \ ^txKx}=:h(x).$$

Note that $|h(x)|=1$ for every $x$ (each $e^{i \ ^txE[X_n]}$ has modulus $1$), so a bound on the modulus of $h$ gives nothing; instead we use that $h$ is continuous at $0$ with $h(0)=1$, which means that $\exists \delta>0,\forall x \in \mathbb{R}^d,\|x\|\leq \delta \implies \Re\,h(x)\geq\frac{1}{2}.$

Let $x \in \mathbb{R}^d$. For all $0<y<\delta$ we have $\left\|\frac{yx}{\|x\|+1}\right\|\leq\delta$, hence $$\Re\int_{0}^{\delta}h\!\left(\frac{yx}{\|x\|+1}\right)dy\geq \frac{\delta}{2}>0,$$ so this integral is nonzero. Finally, we have the identity $$\frac{i \ ^txE[X_n]}{\|x\|+1}\int_0^\delta \exp\!\left(\frac{iy \ ^txE[X_n]}{\|x\|+1}\right)dy=\exp\!\left(\frac{i\delta \ ^txE[X_n]}{\|x\|+1}\right)-1,$$ and by the dominated convergence theorem $$\lim_n\int_0^\delta \exp\!\left(\frac{iy \ ^txE[X_n]}{\|x\|+1}\right)dy=\int_0^\delta h\!\left(\frac{yx}{\|x\|+1}\right)dy \neq0.$$ The identity then shows that $(\,^txE[X_n])_n$ is bounded (the right-hand side has modulus at most $2$, and the integral is bounded away from $0$ for large $n$), and any subsequential limit $\ell$ satisfies $e^{iy\ell/(\|x\|+1)}=h\!\left(\frac{yx}{\|x\|+1}\right)$ for all $y\in[0,\delta]$, which determines $\ell$ uniquely; hence $(\,^txE[X_n])_n$ converges. Of course $X$ is then a Gaussian vector with mean $\lim_n E[X_n]$ and covariance matrix $K$.
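The integral identity used above can be checked numerically as well; a minimal sketch assuming numpy, with arbitrary values for the real number $a:=\frac{^txE[X_n]}{\|x\|+1}$ and for $\delta$:

```python
import numpy as np

a, delta = 1.7, 0.5                       # arbitrary values
n = 100_000
ys = (np.arange(n) + 0.5) * (delta / n)   # midpoint rule on [0, delta]
integral = np.exp(1j * a * ys).sum() * (delta / n)

print(1j * a * integral)                  # i*a times the integral of e^{iya} over [0, delta]
print(np.exp(1j * a * delta) - 1.0)       # matches e^{i a delta} - 1
```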

mathex
  • 662
2

First observe that the vector $X$ is necessarily Gaussian: indeed, if $c_1,\dots,c_d$ are constants, the sequence of random variables $(Y_n)$ defined by $Y_n:=\sum_{i=1}^dc_iX_n^{(i)}$ converges in distribution to $\sum_{i=1}^dc_iX^{(i)}$, and the limit in distribution of a sequence of Gaussian random variables is Gaussian. Moreover, we know that if $Y_n\sim N(\mu_n,\sigma_n^2)$ and $(Y_n)$ converges in distribution to $Y$, then there exist $\mu$ and $\sigma$ such that $\mu_n\to\mu$ and $\sigma_n^2\to \sigma^2$.
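A numerical illustration of the first observation, assuming numpy: a linear combination of the coordinates of a Gaussian vector is a one-dimensional Gaussian with mean $c^T\mu_n$ and variance $c^TK_{X_n}c$; the parameters and coefficients are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(2)
mu_n = np.array([1.0, -0.5])                     # hypothetical mean of X_n
K_n = np.array([[2.0, 0.3], [0.3, 1.0]])         # hypothetical covariance of X_n
c = np.array([0.7, -1.2])                        # coefficients c_1, ..., c_d

X_n = rng.multivariate_normal(mu_n, K_n, size=200_000)
Y_n = X_n @ c                                    # Y_n = sum_i c_i X_n^(i)

print(Y_n.mean(), c @ mu_n)                      # empirical vs. exact mean c^T mu_n
print(Y_n.var(), c @ K_n @ c)                    # empirical vs. exact variance c^T K_n c
```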

Then we have to show that $E[X_n]$ converges to $E[X]$ and that $E[X_n^{(i)}X_n^{(j)}]\to E[X^{(i)}X^{(j)}]$. The first part follows from the convergence in distribution of $\left(X_n^{(i)}\right)$ to $X^{(i)}$, together with the convergence of the corresponding means. For the second, write $$\operatorname{Cov}\left(X_n^{(i)},X_n^{(j)}\right)=\frac14\left(\operatorname{Var}\left(X_n^{(i)}+X_n^{(j)}\right)-\operatorname{Var}\left(X_n^{(i)}-X_n^{(j)}\right)\right)$$ and use the convergence in distribution of $X_n^{(i)}-X_n^{(j)}$ to $X^{(i)}-X^{(j)}$, of $X_n^{(i)}+X_n^{(j)}$ to $X^{(i)}+X^{(j)}$, and of $X_n^{(i)}\to X^{(i)}$, $X_n^{(j)}\to X^{(j)}$, combined with the convergence of the corresponding variances; the moment convergence then follows from $E[X_n^{(i)}X_n^{(j)}]=\operatorname{Cov}(X_n^{(i)},X_n^{(j)})+E[X_n^{(i)}]E[X_n^{(j)}]$.
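As a sanity check on this second-moment route, here is a minimal numpy sketch (with arbitrary parameters) that assembles $E[X^{(i)}X^{(j)}]$ from exactly the quantities whose convergence the answer invokes, namely the two means and the variances of the sum and the difference:

```python
import numpy as np

rng = np.random.default_rng(3)
mu = np.array([1.0, -0.5])                       # hypothetical mean
K = np.array([[2.0, 0.5], [0.5, 1.0]])           # hypothetical covariance
X = rng.multivariate_normal(mu, K, size=200_000)
Xi, Xj = X[:, 0], X[:, 1]

cov = (np.var(Xi + Xj) - np.var(Xi - Xj)) / 4.0  # Cov(Xi, Xj) by polarization
second_moment = cov + Xi.mean() * Xj.mean()      # E[Xi Xj] = Cov + E[Xi] E[Xj]
print(second_moment, K[0, 1] + mu[0] * mu[1])    # empirical vs. exact value
```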

Davide Giraudo
  • 172,925