I've read that the singular values of a matrix satisfy $\sigma_i=\sqrt{\lambda_i}$, where the $\lambda_i$ are eigenvalues, but I'm assuming this only applies to square matrices. How could I determine the eigenvalues of a non-square matrix? Pardon my ignorance.
-
Well, if you want to define them as $Av=\lambda v$, then clearly it does not make any sense. However, you can consider $B=A^*A$, which is a square matrix, and obtain the singular values from it. – leshik Oct 05 '12 at 22:50
-
The $\lambda$ are the eigenvalues, not of the original matrix $A$, but of $A^* A$ or $A A^*$ (where $A^*$ is the conjugate transpose). This holds for both square and rectangular $A$. The nonzero eigenvalues of $A^* A$ and $A A^*$ are the same; generally the zero eigenvalues are not counted as singular values. – Robert Israel Oct 05 '12 at 23:01
2 Answers
The standard definition of an eigenvalue of $A$ is a number $\lambda$ such that $Av=\lambda v$ for some nonzero vector $v$. If $A$ is an $m$ by $n$ matrix, then $v$ must be a vector of length $n$, and $Av$ is a vector of length $m$. Thus if $m\not= n$, there is no way for $\lambda v$ to equal $Av$, since the two vectors have different lengths.
As leshik suggests in the comments, we need an alternative definition if we want to talk about the "eigenvalues" of a rectangular matrix. The analog is to find unit vectors $u$ and $v$ and a scalar $\sigma \ge 0$ so that $Av=\sigma u$ and $A^*u = \sigma v$. Then $u$ and $v$ are the left- and right-singular vectors for the singular value $\sigma$ (the analogs of an eigenvector of a square matrix).
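For concreteness, here is a small numerical check (my own NumPy sketch, not part of the original answer, using an arbitrary random matrix) that the triples $(\sigma, u, v)$ returned by an SVD routine satisfy exactly these two relations for a rectangular matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))   # a 5-by-3 rectangular matrix

# Thin SVD: columns of U are left singular vectors, rows of Vh are right ones
U, s, Vh = np.linalg.svd(A, full_matrices=False)

for i, sigma in enumerate(s):
    u, v = U[:, i], Vh[i, :]
    assert np.allclose(A @ v, sigma * u)    # A v = sigma u
    assert np.allclose(A.T @ u, sigma * v)  # A* u = sigma v (A is real, so A* = A^T)
print("checked", len(s), "singular triples")
```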

Eigenvalues aren't defined for rectangular matrices, but they are closely related to the singular values: for a rectangular matrix $M$, the singular values are the square roots of the eigenvalues of $M^*M$ (or $MM^*$), and the right and left singular vectors are eigenvectors of $M^*M$ and $MM^*$, respectively.
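As a quick illustration (a NumPy sketch of my own, with an arbitrary random matrix), one can verify numerically that the singular values of a rectangular $M$ are the square roots of the eigenvalues of $M^*M$:

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 6))   # a 4-by-6 rectangular matrix

svals = np.linalg.svd(M, compute_uv=False)   # singular values, descending
evals = np.linalg.eigvalsh(M.T @ M)[::-1]    # eigenvalues of M'M, descending

# sigma_i = sqrt(lambda_i); M'M is 6x6, so it has two extra (near-)zero eigenvalues
assert np.allclose(svals, np.sqrt(np.clip(evals[: len(svals)], 0, None)))
print("singular values:", np.round(svals, 4))
```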
