186

I am trying to prove some statements about singular value decomposition, but I am not sure what the difference between singular value and eigenvalue is.

Is "singular value" just another name for eigenvalue?

glS
  • 6,818
Ramon
  • 1,869
  • 4
    They agree in finite dimensions, but not necessarily for infinite-dimensional operators. I've heard the term "singular value" applied to any value for which $(A-\lambda I)^{-1}$ either does not exist or is not continuous, while eigenvalues refer only to those values for which $(A-\lambda I)^{-1}$ does not exist. – Alex Becker Apr 03 '12 at 03:30
  • 4
    The singular value is a nonnegative scalar of a square or rectangular matrix while an eigenvalue is a scalar (any scalar) of a square matrix. – Hassan Muhammad Apr 03 '12 at 03:37
  • ^Note that I was addressing square matrices specifically, or in the infinite-dimensional case, endomorphisms. – Alex Becker Apr 03 '12 at 03:56
  • 2
    My guess is that the question is about the singular value decomposition for matrices of finite-dimensional operators. – yep Apr 03 '12 at 04:02
  • 8
They are not the same thing at all, and this has nothing to do with dimension. They only agree in the special case where the matrix is symmetric. This agreement also extends (in a sense) to infinite-dimensional compact operators. – Nick Alger Sep 30 '12 at 02:46
  • 1
    @AlexBecker Perhaps you are thinking of the singular spectrum of an infinite dimensional operator instead? (an unrelated topic) – Nick Alger Sep 30 '12 at 03:06
  • 9
@AlexBecker : They DO NOT agree in finite dimensions! Clearly you're not familiar with the singular value decomposition. All real matrices have singular values, but non-square matrices don't have eigenvalues. – Michael Hardy Jan 23 '13 at 04:34
  • Just watch this video (https://www.youtube.com/watch?v=rYz83XPxiZo&t=48s) and you will understand. – Christina Oct 10 '21 at 14:58

7 Answers

162

The singular values of an $M\times N$ matrix $X$ are the square roots of the eigenvalues of the $N\times N$ matrix $X^*\,X$ (where $^*$ stands for the conjugate transpose if $X$ has complex coefficients, or simply the transpose if it has real coefficients).

Thus, if $X$ is an $N\times N$ real symmetric matrix with non-negative eigenvalues, then the eigenvalues and singular values coincide, but this is not generally the case!
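As a quick numerical check of this definition (a minimal NumPy sketch, not part of the original answer; the random matrix $X$ below is just an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3)) + 1j * rng.standard_normal((5, 3))  # arbitrary M x N complex matrix

singular_values = np.linalg.svd(X, compute_uv=False)      # returned in descending order
eigs_of_gram = np.linalg.eigvalsh(X.conj().T @ X)          # eigenvalues of the N x N matrix X* X (ascending)
sqrt_eigs = np.sqrt(np.maximum(eigs_of_gram, 0))[::-1]     # clip tiny negative round-off, sort descending

print(np.allclose(singular_values, sqrt_eigs))             # True, up to floating-point error
```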

Student
  • 3,496
  • 7
What about the case in which $X$ is square but not symmetric? The eigenvalues of $X$ could be negative, correct? And in that case how do we define the singular values? – Matteo Apr 09 '14 at 18:42
  • 4
@matteo: I don't understand your question. Whatever matrix $X$ you choose (square or not), the matrix $X^*X$ is Hermitian (or symmetric if the entries are real) positive definite, and the definition I provided makes sense. If $X$ is a square matrix with a negative eigenvalue, then its eigenvalues and singular values are just not the same. – Student Apr 21 '14 at 16:07
  • I guess you're right, I wasn't really thinking of the fact that they're simply different. Just to make sure about one last thing, is $X^*X$ always hermitian and positive definite? – Matteo Apr 21 '14 at 16:33
  • 1
@Matteo: Yes, it is clearly Hermitian, but only non-negative definite, sorry (i.e. you can have zero as a singular value): For every $v\in \mathbb C^n$, with $\langle .\rangle$ the usual Hermitian product on $\mathbb C^n$, you have $\langle X^*Xv,v\rangle = \langle X v,X v\rangle=|Xv|^2\geq 0$. – Student Apr 21 '14 at 17:13
  • 1
    Isn't it sufficient for the square matrix to be diagonalizable, rather than symmetric (ie orthogonally diagonalizable) for the singular values to be the same as the eigenvalues? – shj May 21 '14 at 19:12
49

Given a matrix $A$, if the eigenvalues of $A^HA$ are $\lambda_i \geq 0$, then $\sqrt{\lambda_i}$ are the singular values of $A$. If $t$ is an eigenvalue of $A$, then $|t|$ is a singular value of $A$. Here is an example worth noting: $$A = \begin{pmatrix}1&0&1\\0&1&1\\0&0&0\end{pmatrix},$$ the eigenvalues of $A$ are $1,1,0$ while the singular values of $A$ are $\sqrt{3},1,0$.
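One can verify this example numerically (a small NumPy sketch, not part of the original answer):

```python
import numpy as np

A = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [0., 0., 0.]])

print(np.linalg.eigvals(A))                 # eigenvalues: 1, 1, 0 (in some order)
print(np.linalg.svd(A, compute_uv=False))   # singular values: [1.732..., 1., 0.], i.e. sqrt(3), 1, 0
```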

Eden Harder
  • 1,131
  • 13
That's a very nice example. Another example is $$\mathbf A = \begin{pmatrix}1 & 1\\0 & 0\end{pmatrix}.$$ The eigenvalues are $1$ and $0$, the singular values are $\sqrt 2$ and $0$. – hbp Oct 22 '15 at 06:18
  • 12
    "If $t$ is an eivenvalue of A, then $|t|$ is a singular value of A" - this is not true, though it does convey part of the (admittedly vague, but still useful) intuition that the eigenvalues and singular values are "the same size" – stochastic Apr 12 '16 at 22:50
  • 1
@stochastic Why isn't it true? What's wrong with this argument: Let $t$ be an eigenvalue of $A$; then $\bar{t}$ is an eigenvalue of $A^*$ and hence $\bar{t}t=|t|^2$ is an eigenvalue of $A^*A$, so $|t|$ is a singular value of $A$. – Eric Kightley Jul 31 '16 at 16:52
@EricKightley that statement "If $t$ is an eigenvalue of A, then $|t|$ is a singular value of A" is only true if A has "n" eigenvectors. And $A=A^*$ is one example where that happens, and $A=-A^*$ is another example. Perhaps you were implying that, but didn't mention it. – makansij Oct 14 '17 at 18:53
  • 4
    @Hunle: no, that statement can fail even if $A$ is diagonalizable. As shown by Horn the only relationship is "Weyl's inequality" – Dap Nov 30 '17 at 07:04
  • 1
@EricKightley The problem with your argument is that if $t$ is a right eigenvalue for $A$ ($Av=tv$ for some $v$) then $\overline t$ is a left eigenvalue for $A^*$ ($\overline{(Av)}^\top=\overline{v}^\top A^*=\overline{t}\,\overline{v}^\top$) – Jose Brox Jan 31 '18 at 15:13
21

Is "singular value" just another name for eigenvalue?

No, singular values & eigenvalues are different.

What is the difference between Singular Value and Eigenvalue?

There are many possible answers to this question. Since I don't know what you're trying to prove, I'd recommend carefully comparing the definitions of the two: eigendecomposition, singular value decomposition

[EDIT: You might find the first several chapters of the book "Numerical Linear Algebra" by Trefethen and Bau more useful than the Wikipedia article. They're available here.]

Two important points:

  • Notice in particular that the SVD is defined for any matrix, while the eigendecomposition is defined only for square matrices (and even then it exists only when the matrix is diagonalizable; a unitary eigendecomposition further requires the matrix to be normal).

  • Notice that singular values are always real and non-negative, while eigenvalues need not be real. (Both points are illustrated in the sketch below.)
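A minimal NumPy sketch of the two points above (the matrices $B$ and $R$ are illustrative choices, not from the original answer):

```python
import numpy as np

B = np.array([[1., 2., 3.],
              [4., 5., 6.]])                 # 2 x 3: the SVD exists, an eigendecomposition does not
print(np.linalg.svd(B, compute_uv=False))    # singular values are real and non-negative

R = np.array([[0., -1.],
              [1.,  0.]])                    # rotation by 90 degrees
print(np.linalg.eigvals(R))                  # eigenvalues are complex: i and -i
print(np.linalg.svd(R, compute_uv=False))    # singular values: [1. 1.]
```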

yep
  • 704
14

A very clear explanation from Cleve Moler's textbook: https://www.mathworks.com/content/dam/mathworks/mathworks-dot-com/moler/eigs.pdf

An eigenvalue and eigenvector of a square matrix $A$ are a scalar $\lambda$ and a nonzero vector $x$ so that $$Ax = \lambda x.$$ A singular value and pair of singular vectors of a square or rectangular matrix $A$ are a nonnegative scalar $\sigma$ and two nonzero vectors $u$ and $v$ so that

$$Av = \sigma u, \qquad A^Hu = \sigma v.$$

Eigenvectors and singular vectors are the same if $A$ is a real symmetric matrix (so $A^H = A$).
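These defining relations can be checked directly against NumPy's SVD output (a sketch, not part of the quoted text; the random matrix is an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 3))                    # rectangular matrix

U, s, Vh = np.linalg.svd(A, full_matrices=False)   # columns of U / rows of Vh are singular vectors
u, v, sigma = U[:, 0], Vh[0, :], s[0]              # first singular triple

print(np.allclose(A @ v, sigma * u))               # A v   = sigma u
print(np.allclose(A.T @ u, sigma * v))             # A^H u = sigma v  (A is real, so A^H = A^T)
```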

quacker
  • 413
1

The singular values in the SVD of a matrix $A$ are the square roots of the eigenvalues of $AA^T$ (or, equivalently, of $A^TA$); the two products share the same positive eigenvalues.
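A small NumPy check of this statement (not part of the original answer; the random matrix is an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 2))

eig_AAt = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]   # 4 eigenvalues (two are ~0)
eig_AtA = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]   # 2 eigenvalues
s = np.linalg.svd(A, compute_uv=False)

print(np.allclose(eig_AAt[:2], eig_AtA))               # the non-zero eigenvalues coincide
print(np.allclose(np.sqrt(eig_AtA), s))                # their square roots are the singular values
```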

Hepdrey
  • 83
0

Consider the comparison between largest singular value and largest eigen value of a matrix with real entries, $A\in \mathbb{R}^{m \times n}$:

\begin{align*} \sigma_{\max}(A)&= \sup_{x \in S^{n-1}} ||Ax||_2 =\sup_{x \in S^{n-1}} \sqrt{||Ax||_2^2}= \sup_{x \in S^{n-1}} \sqrt{x^\top A^\top A x} = \sqrt{\lambda_{\max}(A^\top A)} \end{align*}

For $A^\top =A$, we have $\sigma_{\max}(A)=\sqrt{\lambda_{\max}(A^\top A)}=\sqrt{\lambda_{\max}(A^2)}=\max_i|\lambda_i(A)|=\sup\limits_{x \in S^{n-1}}| x^\top A x|$. (Note that $\lambda_{\max}(A^2)$ is the square of the eigenvalue of $A$ of largest modulus, which need not equal $\lambda_{\max}(A)^2$ when $A$ has a negative eigenvalue of larger magnitude.)
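A numerical illustration of this variational characterisation (a sketch with a random matrix, not from the original answer; the sphere is sampled crudely, so the third value only approximates the supremum from below):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 3))

sigma_max = np.linalg.svd(A, compute_uv=False)[0]
lam_max = np.linalg.eigvalsh(A.T @ A)[-1]              # largest eigenvalue of A^T A

# crude Monte Carlo approximation of sup_{||x||_2 = 1} ||A x||_2
xs = rng.standard_normal((3, 100_000))
xs /= np.linalg.norm(xs, axis=0)
approx_sup = np.linalg.norm(A @ xs, axis=0).max()

print(sigma_max, np.sqrt(lam_max), approx_sup)         # first two agree; the third is slightly below
```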

-4

The eigenvalues $\mathbf{\Lambda}$ of the covariance matrix of the sample matrix $\mathbf{X}$ can be obtained by applying the SVD to that covariance matrix:

$$\mathbf{S} = \operatorname{cov}(\mathbf{X}), \qquad \mathbf{S}=\mathbf{U}\mathbf{\Lambda}\mathbf{V}^{T},$$

where $\mathbf{U}=\mathbf{V}$ because $\mathbf{S}$ is symmetric positive semidefinite.
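A hedged NumPy sketch of this statement (the data matrix and its size are illustrative choices, not from the original answer): for a covariance matrix, which is symmetric positive semidefinite, the singular values returned by the SVD coincide with the eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.standard_normal((100, 3))          # 100 samples, 3 variables

S = np.cov(X, rowvar=False)                # 3 x 3 covariance matrix, symmetric positive semidefinite
U, Lam, Vt = np.linalg.svd(S)              # S = U diag(Lam) V^T, with U == V here (up to sign)

print(np.allclose(np.sort(Lam), np.sort(np.linalg.eigvalsh(S))))   # True: Lam are the eigenvalues of S
```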

Freeman
  • 171