Suppose $\mathbf{A}_{m\times m}$ is a real symmetric positive definite matrix. Applying the SVD to $\mathbf{A}$ (which, for a symmetric positive definite matrix, coincides with its eigendecomposition), we obtain a full set of orthonormal eigenvectors $\mathbf{u}_1,\cdots,\mathbf{u}_m$. Assume that $\mathbf{u}_1,\cdots,\mathbf{u}_k$ are the eigenvectors corresponding to the $k$ leading eigenvalues, and set $\mathbf{U}=[\mathbf{u}_1,\cdots,\mathbf{u}_k]$. Could anyone prove the following formula: $$ \mathbf{U}\mathbf{U}^T=\sum_{i=1}^{k}\mathbf{u}_i\mathbf{u}_i^T=\mathbf{I}-\sum_{i=k+1}^{m}\mathbf{u}_i\mathbf{u}_i^T $$ Intuitively I suspect that the above formula is false, but I do not know how to prove or disprove it.
I know that $\mathbf{U}^T\mathbf{U}=\mathbf{I}_k$. – Dajiang Lei Feb 27 '15 at 12:56
1 Answer
Define the matrix $$ \tilde U = [U,U'] = [\overbrace{u_1,\dots,u_k}^U,\overbrace{u_{k+1},\dots,u_m}^{U'}] $$ Note that we can write $$ \begin{align} UU^T &= \pmatrix{U & U'} \overbrace{\pmatrix{I_{k} & 0\\0&0_{m-k}}}^J \pmatrix{U^T\\(U')^T} = \tilde U J \tilde U^T\\ \sum_{i=k+1}^m u_iu_i^T &= U'(U')^T \\ & = \pmatrix{U & U'} \pmatrix{0_{k} & 0\\0&I_{m-k}} \pmatrix{U^T\\(U')^T} = \tilde U (I_m - J) \tilde U^T \end{align} $$ From there, use ordinary matrix multiplication (and the fact that $\tilde U \tilde U^T = I_m$) to verify that $$ \tilde U J \tilde U^T = I_m - \tilde U (I_m - J) \tilde U^T $$ which is exactly what we wanted to show.
Another way to look at this is to note that, because $\tilde U$ is orthogonal, we can rewrite the identity matrix as $I = \sum_{i=1}^m u_i u_i^T$, and then work through the claim by manipulating sums rather than block matrices.
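Spelling out that sum-manipulation route: splitting the relation $I = \sum_{i=1}^m u_i u_i^T$ at index $k$ gives $$ I - \sum_{i=k+1}^{m} u_i u_i^T = \sum_{i=1}^{m} u_i u_i^T - \sum_{i=k+1}^{m} u_i u_i^T = \sum_{i=1}^{k} u_i u_i^T = UU^T, $$ which is the questioner's formula directly.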

Do you mean that $UU^T=I_m-\tilde{U}(I_m-J)\tilde{U}^T=I_m-\tilde{U}\tilde{U}^T+\tilde{U}J\tilde{U}^T$? So the equation $\mathbf{U}\mathbf{U}^T=\sum_{i=1}^{k}\mathbf{u}_i\mathbf{u}_i^T=\mathbf{I}-\sum_{i=k+1}^{m}\mathbf{u}_i\mathbf{u}_i^T$ is false. – Dajiang Lei Feb 27 '15 at 13:36
What do you mean $\tilde U \tilde U^T \neq I$? The columns are orthonormal, so the matrix is orthogonal. – Ben Grossmann Feb 27 '15 at 13:47
And I'm telling you that you're wrong. One implies the other since $\tilde U$ is square. – Ben Grossmann Feb 27 '15 at 13:52
I cannot say "thank you" enough. But I still have a question: is $\tilde{U}=[u_1,u_2,\cdots,u_m]$ an orthogonal matrix? In linear algebra, an orthogonal matrix is a square matrix with real entries whose columns and rows are orthonormal vectors. I wonder whether the eigenvector matrix $\tilde{U}$ satisfies these properties. – Dajiang Lei Feb 27 '15 at 14:14
@DajiangLei the following theorem is often given directly after the definition of orthogonality: a square matrix $A$ is orthogonal $\iff$ $A$ has orthonormal columns $\iff$ $A$ has orthonormal rows $\iff A^{-1} = A^T$. – Ben Grossmann Feb 27 '15 at 14:17
Thank you very much! I think I will conduct a numerical experiment to verify the formula. – Dajiang Lei Feb 27 '15 at 14:33
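For reference, such a numerical experiment might look like the following NumPy sketch (the matrix size $m$, the choice of $k$, and the random seed are arbitrary illustrative choices, not anything from the thread):

```python
import numpy as np

m, k = 6, 2
rng = np.random.default_rng(0)

# Build a random real symmetric positive definite matrix A = B B^T + I.
B = rng.standard_normal((m, m))
A = B @ B.T + np.eye(m)

# eigh returns eigenvalues in ascending order, so the leading
# eigenvectors (largest eigenvalues) occupy the last k columns.
eigvals, eigvecs = np.linalg.eigh(A)
U = eigvecs[:, -k:]       # u_1, ..., u_k (leading eigenvectors)
U_rest = eigvecs[:, :-k]  # u_{k+1}, ..., u_m (remaining eigenvectors)

lhs = U @ U.T
rhs = np.eye(m) - U_rest @ U_rest.T
print(np.allclose(lhs, rhs))  # expected: True
```

Up to floating-point error, the check confirms $UU^T = I - \sum_{i=k+1}^m u_iu_i^T$ for this example.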