
Consider the SVD of matrix $A$:

$$A = U \Sigma V^\top$$

If $A$ is a symmetric, positive semidefinite real matrix, is there a guarantee that $U = V$?

Second question (out of curiosity): what is the minimum necessary condition for $U = V$?

Sohail Si

4 Answers


Here is an attempt to provide a clear answer, building upon Arash's answer.

Primer:

  • Any matrix $A$ can be decomposed with the Singular Value Decomposition (SVD) as $A = U \Sigma V^\top$, where $U$ and $V$ are unitary matrices. This decomposition is not unique: the singular values part $\Sigma$ is unique; however, the signs of corresponding left and right singular vectors can be flipped simultaneously. Moreover, when a singular value is zero or repeated, there are many possible choices for the corresponding singular vectors. The following hold:

    • the singular values are equal to the square roots of the eigenvalues of $AA^\top$ (or the ones of $A^\top A$) (resp. $AA^*$ or $A^*A$ for complex matrices)
    • the right singular vectors (columns of $V$) are eigenvectors of $A^\top A$ (resp. $A^*A$)
    • the left singular vectors (columns of $U$) are eigenvectors of $AA^\top$ (resp. $AA^*$)
  • if $A$ is real symmetric then (spectral theorem) it is diagonalizable by an orthogonal matrix and therefore has at least one eigendecomposition $A = Q \Lambda Q^{-1} = Q \Lambda Q^\top$ (this post shows a non-diagonalizable counterexample of a complex symmetric matrix). In general this decomposition is not unique: the eigenvalues part $\Lambda$ is unique; however, the eigenvectors part $Q$ is only unique (up to column signs) if the eigenvalues are distinct: a repeated eigenvalue leaves the freedom to choose any orthonormal basis of its eigenspace.

  • so, if $A$ is real symmetric

    • its singular values are the absolute values (modulus if complex) of its eigenvalues.
    • both the right and left singular vectors (columns of $V$ and $U$) are eigenvectors of $A^\top A = AA^\top = A^2 = Q \Lambda^{2} Q^{-1}$. When the singular values are distinct, each eigenspace of $A^2$ is a one-dimensional eigenspace of $A$, so the columns of $U$ and $V$ are unit eigenvectors of $A$: each is equal to a column of $Q$ or to $-1$ times that column. (With a repeated singular value, e.g. an eigenvalue pair $\pm\lambda$, a singular vector need not be an eigenvector of $A$.)
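The primer above can be checked numerically. Here is a minimal numpy sketch; the 2x2 symmetric matrix is a hypothetical example, not from the post:

```python
import numpy as np

# A small real symmetric matrix with one negative eigenvalue
# (hypothetical example chosen for illustration).
A = np.array([[2.0, 1.0],
              [1.0, -1.0]])

# Eigendecomposition (eigh is specialized for symmetric matrices)
eigvals, Q = np.linalg.eigh(A)

# SVD
U, s, Vt = np.linalg.svd(A)

# Singular values are the absolute values of the eigenvalues,
# sorted in decreasing order.
assert np.allclose(np.sort(np.abs(eigvals))[::-1], s)

# Columns of U and V diagonalize A A^T = A^T A = A^2.
assert np.allclose(U @ np.diag(s**2) @ U.T, A @ A)
assert np.allclose(Vt.T @ np.diag(s**2) @ Vt, A @ A)
```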

Now to translate this in an answer to your question:

  • if $A$ is real symmetric and positive definite (i.e. all of its eigenvalues are strictly positive), $\Sigma$ is a diagonal matrix containing the eigenvalues, and $U=V$.

  • if $A$ is real symmetric and only positive semidefinite (i.e. its eigenvalues are nonnegative and at least one is zero), $\Sigma$ is a diagonal matrix containing the eigenvalues, but there is no guarantee that $U=V$. Indeed, the parts of $U$ and $V$ corresponding to the zero eigenvalues can be any orthonormal basis of the null space of $A$, with sign flips allowed independently on $U$ and $V$.

  • if $A$ is real symmetric but not positive semidefinite (i.e. some of its eigenvalues are negative), then $\Sigma$ is a diagonal matrix containing the absolute values of the eigenvalues. There are then two reasons why there is no guarantee that $U=V$. If there is a zero eigenvalue, see the previous bullet point. If there is a negative eigenvalue, the sign "taken off" that eigenvalue in $\Lambda$ to construct the (nonnegative by definition) $\Sigma$ has to end up on either $U$ or $V$. For a concrete example, consider a diagonal matrix with at least one negative element.
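The concrete example from the last bullet can be sketched in numpy (a minimal illustration with a hypothetical 2x2 diagonal matrix):

```python
import numpy as np

# Symmetric but indefinite: a diagonal matrix with one negative entry.
A = np.diag([3.0, -2.0])

U, s, Vt = np.linalg.svd(A)
V = Vt.T

# Sigma holds the absolute values of the eigenvalues...
assert np.allclose(s, [3.0, 2.0])

# ...and the sign stripped from the negative eigenvalue must land on
# either U or V, so U != V: the second columns differ by a sign flip.
assert not np.allclose(U, V)
assert np.allclose(U[:, 1], -V[:, 1])
assert np.allclose(A, U @ np.diag(s) @ Vt)
```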

As noted by Arash you can replace in all the above statements the words "real symmetric" with "normal".

So to conclude, a sufficient condition for $U=V$ is to be normal and positive definite. Now is this necessary? Is it proven that non-normal matrices cannot have strictly positive eigenvalues? This is the part I'm not sure about.
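As a quick numerical sanity check of the positive definite case above, here is a minimal numpy sketch; the random test matrix is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random symmetric positive definite matrix B^T B + I
# (hypothetical test matrix for illustration).
B = rng.standard_normal((4, 4))
A = B.T @ B + np.eye(4)

U, s, Vt = np.linalg.svd(A)

# All singular values are strictly positive eigenvalues of A,
# and U = V in the positive definite case.
assert np.all(s > 0)
assert np.allclose(U, Vt.T)
```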

smarie
  • Thank you for the intuitive, educational and elaborate explanation. – Sohail Si Jul 07 '20 at 09:39
  • You're welcome! Thanks for the feedback @SohailSi – smarie Jul 07 '20 at 14:08
  • @smarie I can't understand why the spectral theorem says $U = V$ as you mentioned. Could you explain this more, please? – Figurinha Sep 10 '20 at 14:35
  • @Figurinha can you tell which part of the answer is not clear enough? – smarie Sep 10 '20 at 18:53
  • @smarie I know that the spectral theorem says we have a decomposition $A = QDQ^T$, where $D$ is diagonal with its elements being the eigenvalues of $A$ and that $Q^TQ = QQ^T = I$, but I can't see why this implies $U = V$ in the SVD... What is not clear for me is: if $A = U \Sigma V^T$ and $A = QDQ^T$, with $\Sigma = D$, why should $U = V$? Why couldn't we have different decompositions for $A$ such that $A = UDV^T$ and also $A = QDQ^T$? – Figurinha Sep 11 '20 at 14:34
  • I edited the answer; it should now be clearer that the columns of $U$ and $V$ are eigenvectors of $A$ in that case, and they are unit vectors, so there are no degrees of freedom left other than changing the sign (phase if complex). – smarie Sep 13 '20 at 13:43
  • Normal + positive definite implies Hermitian. A normal matrix is unitarily diagonalizable by the spectral theorem (this is actually an equivalence), and it is positive definite if and only if its eigenvalues are real and positive, which implies that it is Hermitian. – reded Jan 13 '24 at 12:09

First of all, note that $U$ and $V$ are not unique in general. However, one can find a relation between distinct SVDs of a matrix $A$, and working with real matrices makes things easier.

For a general real $A$, let the singular values of $A$ be distinct and non-zero. If $A=U_1\Sigma V_1^T$ and $A=U_2\Sigma V_2^T$ then, from this link, there is a diagonal matrix $D=\mathrm{diag}(\pm 1,\dots,\pm 1)$ such that: $$ U_1=U_2D, \quad V_1=V_2D. $$

Now suppose that $A$ is a normal matrix with positive eigenvalues. It can be orthogonally diagonalized, so $$ A=U\Sigma U^{T}, $$ where $\Sigma$ is the diagonal matrix of (positive) eigenvalues. This is an SVD of $A$. So if $A=U_1\Sigma V_1^T$, then $U_1=UD$ and $V_1=UD$ for some sign matrix $D$ as above, which implies that $U_1=V_1$. In other words, being a normal matrix with positive eigenvalues is sufficient for having $U=V$. This class includes positive definite matrices. When zero singular values are permitted, the situation is trickier: take the zero matrix, for instance.
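The $D=\mathrm{diag}(\pm 1,\dots,\pm 1)$ relation can be illustrated numerically. Below is a minimal numpy sketch comparing an SVD built from the eigendecomposition against LAPACK's SVD; the random symmetric positive definite test matrix is hypothetical and almost surely has distinct eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical symmetric positive definite test matrix.
B = rng.standard_normal((3, 3))
A = B.T @ B + np.eye(3)

# SVD #1: from the eigendecomposition A = Q L Q^T, with eigenvalues
# reordered decreasingly to match the SVD convention.
eigvals, Q = np.linalg.eigh(A)
order = np.argsort(eigvals)[::-1]
U1, S1 = Q[:, order], eigvals[order]

# SVD #2: from LAPACK.
U2, S2, V2t = np.linalg.svd(A)

assert np.allclose(S1, S2)

# The two sets of singular vectors agree up to a diagonal sign matrix D
# applied identically on the left and the right.
D = np.diag(np.sign(np.diag(U2.T @ U1)))
assert np.allclose(U1, U2 @ D)
assert np.allclose(U1, V2t.T @ D)   # V = U here since A is positive definite
```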

Arash

If the matrix is symmetric then $U=V$, as by the spectral theorem we know that the eigenvalue decomposition and the singular value decomposition must be the same. From that we see that $A = U\Lambda U^{-1}=U\Lambda U^T=U\Sigma V^T$, and as by the theorem $\Sigma = \Lambda$, then $U=V$.

  • your counterexample is not symmetric – tibL Sep 15 '16 at 12:24
  • That's why it does not hold that $U=V$ – Josu Etxezarreta Martinez Sep 15 '16 at 12:25
  • in the question it says $A$ is symmetric... – tibL Sep 15 '16 at 12:26
  • Ok, I didn't read that, I'll edit now – Josu Etxezarreta Martinez Sep 15 '16 at 12:27
  • Why does $V \Sigma U^t = U \Sigma V^t$ tell you that $U = V$? – John Hughes Sep 15 '16 at 12:34
  • Because if $A$ is symmetric, then it is equal to its transpose, so if we do the SVD of the transpose we should get the same decomposition as for $A$. Then if we do the decomposition of $A^T$ and we get that its $U$ matrix is the $V$ matrix, then they should be equal. – Josu Etxezarreta Martinez Sep 15 '16 at 12:44
  • $U$ and $V$ of SVD are not unique in general. By transpose you might recover another SVD; see http://math.stackexchange.com/questions/644327/how-unique-on-non-unique-are-u-and-v-in-singular-value-decomposition-svd – Arash Sep 15 '16 at 13:23
  • However, by the spectral theorem, a symmetric matrix will have real-valued eigenvalues, making the eigenvalue decomposition and the singular value decomposition the same for a symmetric matrix, so even if the SVD is not unique, $U$ and $V$ should be the same. – Josu Etxezarreta Martinez Sep 15 '16 at 13:29
  • If $A=U\Sigma V^T$ then $A=(-U)\Sigma(-V)^T$ but $U\neq -U$. Again check the question I posted. – Arash Sep 15 '16 at 13:33
  • I see that, but what I am saying is that if the eigenvalue decomposition and the svd are the same (thing that happens for symmetric matrices, from the spectral theorem), then U=V. $A=U\Sigma V^T = U\Lambda U^{-1}$, and from spectral theorem: $\Sigma = \Lambda$, $U^{-1}=U^T$, so then U=V. – Josu Etxezarreta Martinez Sep 15 '16 at 13:36
  • I corrected the answer according to what I just stated. – Josu Etxezarreta Martinez Sep 15 '16 at 14:05
  • Why $A=U\Sigma V^T = U\Lambda U^{-1}$? It is more $A=U\Sigma V^T = U_1\Lambda U_1^{T}$ for a $U_1$ that is possibly distinct from $U$ and $V$. – Arash Sep 15 '16 at 14:05
  • yes, $U_1$ can be different to $U$, but as the singular values and the eigenvalues are the same, then the form of the decomposition has to be the same, implying that right and left singular vector have to be the same for the singular value decomposition. – Josu Etxezarreta Martinez Sep 15 '16 at 14:09
  • http://math.stackexchange.com/questions/22825/how-to-compute-svd-singular-value-decomposition-of-a-symmetric-matrix – Josu Etxezarreta Martinez Sep 15 '16 at 14:12
  • But it has to be rigorously proved that "their form has to be the same". And what do you mean by "the same" singular vectors? You can talk about equivalent singular vectors with proper notion of equivalence. – Arash Sep 15 '16 at 14:12
  • This answer is incorrect to me. Consider a diagonal matrix with at least one negative number on it: it is symmetric, but U is different from V in the SVD decomposition (sign flip). See my answer below https://math.stackexchange.com/a/3683742/769194 – smarie May 20 '20 at 14:07

Self-answer:

Firstly, note the emphasis on being positive semi-definite. As others correctly said, if $\mathbf A$ is singular, there is no such guarantee, and $\mathbf U$ and $\mathbf V$ can be different. As @Arash said, consider the zero matrix: its SVD is not unique.

However, if we consider the column space (span) of $\mathbf A$ and project $\mathbf U$ and $\mathbf V$ onto this space, the projected $\mathbf U$ and $\mathbf V$ are equal.

It seems non-singularity also provides the necessary condition for $\mathbf U=\mathbf V$. But I need to double-check this.

  • Positive semi-definite: No.
  • Positive-definite: Yes. (Sufficient)

The next question could be: what is (are) the minimum necessary condition(s) for $U=V$? (not just a sufficient condition, e.g. positive definiteness).
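The projection claim above can be sketched numerically; here is a minimal numpy illustration with a hypothetical rank-1 positive semidefinite matrix:

```python
import numpy as np

# Rank-1 positive semidefinite 3x3 matrix (hypothetical example):
# one positive eigenvalue (9), two zero eigenvalues.
x = np.array([[1.0], [2.0], [2.0]])
A = x @ x.T

U, s, Vt = np.linalg.svd(A)
V = Vt.T
r = int(np.sum(s > 1e-10))          # numerical rank
assert r == 1

# The columns of U and V spanning the null space of A are essentially
# arbitrary, so U and V as a whole need not match. But projected onto
# the column space of A, they agree.
P = U[:, :r] @ U[:, :r].T           # orthogonal projector onto col(A)
assert np.allclose(P @ U, P @ V)
```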

Sohail Si