
Let $(H_n)$ and $(\Lambda_n)$ be sequences of $n\times k$ matrices (so the number of rows grows with $n$), where $k\in\mathbb{N}$ is fixed. We make the following two assumptions:

  1. $H'_nH_n \to I_k$ as $n\to \infty$.

  2. $\Lambda'_n\Lambda_n \to \Sigma$ as $n\to \infty$, where $\Sigma$ is a positive definite matrix with distinct eigenvalues.

Here convergence is with respect to the Frobenius norm, which I denote $\|\cdot\|$.

I want to show that the largest $k$ eigenvalues of the matrix $H_n \Lambda'_n \Lambda_n H'_n$ converge to those of $\Sigma$ (the rank of $H_n \Lambda'_n \Lambda_n H'_n$ is at most $k$, so all other eigenvalues must be zero).
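For intuition, here is a quick numerical sketch of the claim in NumPy. The particular sequences $H_n$, $\Lambda_n$ below are hypothetical choices constructed to satisfy the two assumptions, with $\Sigma = \operatorname{diag}(3, 1)$:

```python
import numpy as np

rng = np.random.default_rng(0)
k = 2
Sigma = np.diag([3.0, 1.0])  # positive definite with distinct eigenvalues

def top_k_eigs(n):
    # Orthonormal n x k frame plus an O(1/n) perturbation, so H'H -> I_k.
    Q, _ = np.linalg.qr(rng.standard_normal((n, k)))
    H = Q + rng.standard_normal((n, k)) / n
    # Lambda built so that Lambda' Lambda -> Sigma (exact up to the O(1/n) term).
    Q2, _ = np.linalg.qr(rng.standard_normal((n, k)))
    Lam = Q2 @ np.sqrt(Sigma) + rng.standard_normal((n, k)) / n
    # The n x n matrix of interest; its rank is at most k.
    M = H @ (Lam.T @ Lam) @ H.T
    return np.sort(np.linalg.eigvalsh(M))[::-1][:k]

for n in [10, 100, 1000]:
    print(n, top_k_eigs(n))  # top-k eigenvalues approach (3, 1) as n grows
```

The remaining $n - k$ eigenvalues of $M$ are zero (up to floating-point error), consistent with the rank bound.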

Any ideas on how to proceed are greatly appreciated.

Alphie

1 Answer


This one is surprisingly easy, as long as the $k$ leading eigenvalues are non-zero.

It is well known that the eigenvalues of an $n \times n$ matrix depend continuously on the entries of that matrix. With that said, we have $$ \vec \lambda^{\downarrow k}(H_n \Lambda'_n \Lambda_n H_n') = \vec \lambda^{\downarrow}(\Lambda_n' \Lambda_nH_n'H_n) $$ as a consequence of the fact that $H_n \Lambda'_n \Lambda_n H_n'$ is positive semidefinite and the fact that $AB$ and $BA$ have the same non-zero eigenvalues (for any matrices $A$, $B$ such that both $AB$ and $BA$ are square).
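As a sanity check on this step, one can compare the spectra of the $n \times n$ and $k \times k$ products numerically (the random $H$ and $\Lambda$ below are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 6, 2
H = rng.standard_normal((n, k))
Lam = rng.standard_normal((n, k))
G = Lam.T @ Lam                      # k x k, positive semidefinite

# n eigenvalues of the (symmetric PSD) n x n matrix H G H'
big = np.sort(np.linalg.eigvalsh(H @ G @ H.T))[::-1]
# k eigenvalues of the k x k matrix G H'H; real since it is a product of PSD matrices
small = np.sort(np.linalg.eigvals(G @ (H.T @ H)).real)[::-1]

# Top k eigenvalues of the n x n matrix match the k x k spectrum;
# the remaining n - k eigenvalues are zero.
print(big[:k], small, big[k:])
```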

It follows that $$ \lim_{n \to \infty}\vec \lambda^{\downarrow k}(H_n \Lambda'_n \Lambda_n H_n') = \\ \lim_{n \to \infty}\vec \lambda^{\downarrow}(\Lambda_n' \Lambda_nH_n'H_n) = \\ \vec \lambda^{\downarrow}\left(\lim_{n \to \infty} \Lambda_n'\Lambda_n H_n' H_n \right) = \\ \vec \lambda^{\downarrow}\left([\lim_{n \to \infty} \Lambda_n'\Lambda_n] [\lim_{n \to \infty}H_n' H_n] \right) = \\ \vec \lambda^{\downarrow}(\Sigma I_k) = \vec \lambda^{\downarrow}(\Sigma), $$ which is what we wanted.
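The interchange of the limit with $\vec \lambda^{\downarrow}$ is exactly the continuity of eigenvalues; for symmetric matrices it is even quantitative, since Weyl's inequality gives $|\lambda_i(A+E) - \lambda_i(A)| \le \|E\|_2$. A small illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
k = 3
A = rng.standard_normal((k, k)); A = (A + A.T) / 2   # symmetric
E = rng.standard_normal((k, k)); E = (E + E.T) / 2   # symmetric perturbation
E /= np.linalg.norm(E, 2)                            # normalize to spectral norm 1

for eps in [1e-1, 1e-3, 1e-6]:
    diff = np.abs(np.linalg.eigvalsh(A + eps * E) - np.linalg.eigvalsh(A)).max()
    print(eps, diff)  # diff <= eps, as Weyl's inequality guarantees
```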

Ben Grossmann
  • Very nice! The continuity of eigenvalues you invoke is for the space of symmetric $n\times n$ real matrices right? (for non symmetric real matrices eigenvalues may not be real). Am I right in saying that a sufficient condition for the $k$ eigenvalues of $\Lambda_n' \Lambda_nH_n'H_n$ to be nonzero is that both $\Lambda_n$ and $H_n$ have rank equal to $k$? – Alphie Dec 07 '20 at 23:48
  • It suffices to invoke the statement for symmetric real matrices, but continuity applies even when the eigenvalues aren't necessarily real (of course, we would no longer be able to present them in decreasing order). And yes, that is indeed a sufficient condition: if $A$ has full column-rank, then $A'A$ is invertible. – Ben Grossmann Dec 07 '20 at 23:51
  • Thanks a lot. In my original set-up the matrices $\Lambda_n$ and $H_n$ are random, but I guess the argument remains the same using the continuous mapping theorem. The requirement that $\Lambda_n, H_n$ have rank $k$ for $n$ large seems pretty mild no? – Alphie Dec 07 '20 at 23:57
  • @Alphie Yes, I'd say it's pretty mild indeed. For a typical distribution (e.g. multivariate normal), you should get rank $k$ with probability $1$ – Ben Grossmann Dec 08 '20 at 00:11
  • Actually, can we use continuity of the determinant to obtain $det(H_n'H_n)\to 1$ and $ det(\Lambda_n'\Lambda_n)\to c>0$, so that full column rank for $n$ large follows? – Alphie Dec 08 '20 at 00:20
  • Oh yes, if $\Sigma$ is invertible then we can guarantee that $\Lambda_n'\Lambda_n$ is invertible for all but finitely many $n$, and $H_n'H_n$ is necessarily invertible for all but finitely many $n$. – Ben Grossmann Dec 08 '20 at 00:55
  • Here we use continuity on the space of real symmetric $k\times k$ matrices. But can I invoke continuity when the size of the matrix is also growing like here: https://math.stackexchange.com/q/3939979/522332? Thanks a lot for your help. – Alphie Dec 08 '20 at 15:40
  • I reread your proof and don't see why the assumption that the $k$ leading eigenvalues are nonzero is needed. I feel the equality $\vec \lambda^{\downarrow k}(H_n \Lambda'_n \Lambda_n H_n') = \vec \lambda^{\downarrow}(\Lambda_n' \Lambda_nH_n'H_n)$ holds unrestrictedly (since all eigenvalues involved are nonnegative). I also feel like the assumption of distinct eigenvalues is not necessary, because according to https://math.stackexchange.com/q/332674/522332 $AB$ and $BA$ also share the same multiplicity of nonzero eigenvalues. Is this right? – Alphie Dec 11 '20 at 14:23
  • @Alphie I think I was being overly cautious with the non-zero eigenvalues since $AB$ and $BA$ share non-zero eigenvalues. I agree, the assumption is unnecessary. I never assume that the eigenvalues are distinct (only that the eigenvalues are listed multiple times according to their multiplicity). So yes, an assumption of distinct eigenvalues would be unnecessary – Ben Grossmann Dec 11 '20 at 16:07