
As has been stated in this thread:

It's not hard to show that if the covariance matrix of the original data points $x_i$ was $\Sigma$, the variance of the new data points is just $u^T \Sigma u$.

However, below it has been shown that the matrix $A$ isn't equal to the covariance matrix $\Sigma$, because we haven't divided $A$ by the number of points. Then $A$ is obviously a different matrix from the covariance matrix, and thus it should have different eigenvectors and eigenvalues than the covariance matrix.

Most sources say we should take eigenvectors of $\Sigma$ as the new basis, but here we're taking eigenvectors of $A$. I'm a bit confused...

[Image: the derivation referred to above.]
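For what it's worth, both claims are easy to check numerically. A minimal NumPy sketch (the data and the direction $u$ are made up for illustration, not taken from the derivation above):

```python
import numpy as np

# Made-up centered data; the per-column scaling just makes the spectrum
# well separated (all names here are illustrative)
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3)) * np.array([3.0, 1.0, 0.3])
X = X - X.mean(axis=0)                 # center the data
n = X.shape[0]

A = X.T @ X                            # unnormalized scatter matrix
Sigma = A / n                          # covariance matrix

# Claim 1: the variance of the projected points equals u^T Sigma u
u = np.array([1.0, 2.0, -1.0])
u = u / np.linalg.norm(u)              # unit direction
proj = X @ u
print(np.allclose(proj.var(), u @ Sigma @ u))          # True

# Claim 2: A and Sigma share eigenvectors; eigenvalues differ by the factor n
evals_A, evecs_A = np.linalg.eigh(A)
evals_S, evecs_S = np.linalg.eigh(Sigma)
print(np.allclose(evals_A, n * evals_S))               # True
print(np.allclose(np.abs(evecs_A), np.abs(evecs_S)))   # True (sign ambiguity only)
```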

user4205580
  • Dividing by the number of points is just a scaling of $\mathbf{A}$ by a scalar. The eigenvectors will not change (the eigenvalues will -- they will all be scaled by the same number, the number of points). – megas Mar 21 '15 at 18:59
  • So if I have matrix $A$ and $\alpha A$, then both have the same eigenvectors but the eigenvalues of the latter are multiplied by $\alpha$? It sounds very basic, is it some property of eigenvalues and eigenvectors? Or a theorem that states that? – user4205580 Mar 21 '15 at 19:02
  • Yes, they will have the same eigenvectors: $\mathbf{A}$ above is symmetric (in fact PSD) and can be written as $\mathbf{A}=\mathbf{U}\mathbf{\Lambda}\mathbf{U}^{T}$, where the columns of $\mathbf{U}$ are the (orthonormal) eigenvectors of $\mathbf{A}$ and $\mathbf{\Lambda}$ is a diagonal matrix containing the eigenvalues. Then, $\alpha \mathbf{A}= \alpha \mathbf{U}\mathbf{\Lambda}\mathbf{U}^{T}= \mathbf{U}\left( \alpha \mathbf{\Lambda}\right)\mathbf{U}^{T}$. – megas Mar 21 '15 at 19:04
  • Thanks. How do we know that $\Lambda$ contains the eigenvalues? – user4205580 Mar 21 '15 at 20:55
  • $\mathbf{\Lambda}$ was defined to contain the eigenvalues. If you have a symmetric matrix $\mathbf{A}$ you can always write/decompose it like that. – megas Mar 21 '15 at 23:37
  • One more thing - suppose $v$ is an eigenvector of $A$. Then $Av=\lambda v$. If I multiply both sides by $\alpha$, I get $\alpha Av=\alpha \lambda v$. So I can either look at it as 1) $\alpha A$ has the same eigenvector with a different eigenvalue ($(\alpha \lambda) v$), or 2) a different eigenvector with the same eigenvalue ($\lambda (\alpha v)$), right? – user4205580 Mar 28 '15 at 13:45
  • Or where am I making a mistake? – user4205580 Mar 29 '15 at 00:19
  • By convention/definition, an eigenvector has length ($l_{2}$-norm) equal to $1$. Hence, $\alpha A$ has the same eigenvector as $A$, with corresponding eigenvalue $\alpha \lambda$. – megas Mar 29 '15 at 06:28
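The decomposition argument from the comments can be checked numerically. A minimal NumPy sketch, assuming an arbitrary symmetric positive-definite $A$ (illustrative, not the matrix from the question):

```python
import numpy as np

# An arbitrary symmetric positive-definite matrix (illustrative only)
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])
alpha = 7.0

# Decompose A = U Lambda U^T (eigh returns eigenvalues and orthonormal eigenvectors)
Lam, U = np.linalg.eigh(A)

# alpha*A has the same unit-norm eigenvectors; its eigenvalues are alpha*Lambda
Lam2, U2 = np.linalg.eigh(alpha * A)
print(np.allclose(Lam2, alpha * Lam))        # True
print(np.allclose(np.abs(U2), np.abs(U)))    # True (abs: eigenvectors are defined up to sign)

# Directly: if A v = lam v, then (alpha*A) v = (alpha*lam) v for the *same* v
v, lam = U[:, 0], Lam[0]
print(np.allclose((alpha * A) @ v, (alpha * lam) * v))   # True
```

Note that `eigh` returns unit-norm eigenvectors, which is exactly the convention the last comment appeals to: the scalar $\alpha$ is absorbed into the eigenvalue, not the eigenvector.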

0 Answers