
Suppose $\mathbf{A}_{m\times m}$ is a real symmetric, positive definite matrix. Its eigendecomposition (which coincides with its SVD in this case) yields a full set of orthonormal eigenvectors $\mathbf{u}_1,\cdots,\mathbf{u}_m$. Assume that $\mathbf{u}_1,\cdots,\mathbf{u}_k$ are the eigenvectors corresponding to the $k$ leading eigenvalues, and let $\mathbf{U}=[\mathbf{u}_1,\cdots,\mathbf{u}_k]$. Could anyone prove the following formula: $$ \mathbf{U}\mathbf{U}^T=\sum_{i=1}^{k}\mathbf{u}_i\mathbf{u}_i^T=\mathbf{I}-\sum_{i=k+1}^{m}\mathbf{u}_i\mathbf{u}_i^T $$ I intuitively think that the above formula is false, but I do not know how to prove it either way.

1 Answer


Define the matrix $$ \tilde U = [U,U'] = [\overbrace{u_1,\dots,u_k}^U,\overbrace{u_{k+1},\dots,u_m}^{U'}] $$ Note that we can write $$ \begin{align} UU^T &= \pmatrix{U & U'} \overbrace{\pmatrix{I_{k} & 0\\0&0_{m-k}}}^J \pmatrix{U^T\\(U')^T} = \tilde U J \tilde U^T\\ \sum_{i=k+1}^m u_iu_i^T &= U'(U')^T \\ & = \pmatrix{U & U'} \pmatrix{0_{k} & 0\\0&I_{m-k}} \pmatrix{U^T\\(U')^T} = \tilde U (I_m - J) \tilde U^T \end{align} $$ From there, use ordinary matrix multiplication (and the fact that $\tilde U \tilde U^T = I_m$, since $\tilde U$ is orthogonal) to verify that $$ \tilde U J \tilde U^T = I_m - \tilde U (I_m - J) \tilde U^T $$ which is exactly what we wanted to show.
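If a proof feels too abstract, the identity is also easy to check numerically. The sketch below (my own illustration, not part of the answer; the matrix sizes and the random seed are arbitrary choices) builds a random symmetric positive definite matrix, splits its eigenvectors into the leading block $U$ and the trailing block $U'$, and compares $UU^T$ against $I - U'(U')^T$:

```python
import numpy as np

rng = np.random.default_rng(0)
m, k = 6, 3

# Build a random symmetric positive definite matrix A
B = rng.standard_normal((m, m))
A = B @ B.T + m * np.eye(m)

# Eigendecomposition: the columns of Q are orthonormal eigenvectors of A
eigvals, Q = np.linalg.eigh(A)

# eigh returns eigenvalues in ascending order; sort descending so the
# first k columns correspond to the leading eigenvalues
order = np.argsort(eigvals)[::-1]
U = Q[:, order[:k]]   # leading eigenvectors  (u_1, ..., u_k)
Up = Q[:, order[k:]]  # trailing eigenvectors (u_{k+1}, ..., u_m)

lhs = U @ U.T
rhs = np.eye(m) - Up @ Up.T
print(np.allclose(lhs, rhs))  # True
```

The check succeeds because $[U, U']$ is an orthogonal matrix, so $UU^T + U'(U')^T = I$ regardless of which $k$ eigenvectors go into $U$.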


Another way to look at this is to note that we can rewrite the identity matrix as $I = \sum_{i=1}^m u_i u_i^T$, then go through by manipulating sums rather than block matrices.
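Spelled out, that sum manipulation is just one line:

```latex
$$
UU^T
= \sum_{i=1}^{k} u_i u_i^T
= \underbrace{\sum_{i=1}^{m} u_i u_i^T}_{=\,I} - \sum_{i=k+1}^{m} u_i u_i^T
= I - \sum_{i=k+1}^{m} u_i u_i^T.
$$
```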

Ben Grossmann