
I'm studying the implementation of several SVD algorithms. Some of them require a symmetric matrix, so you can decompose $AA^T$ or $A^{T}A$ and obtain the singular values of $A$ as the square roots of the eigenvalues of those products.

Suppose $A$ is a real square $n \times n$ matrix with SVD $A=USV^T$. From the two decompositions $AA^T = US^{2}U^T$ and $A^{T}A = VS^{2}V^T$ you can recover $U$, $V$ and $S$.

The problem is that the reconstruction $A=USV^T$ does not hold when I recombine the factors obtained this way (see the sample Matlab code below).

My hypothesis: taking the square root of a squared value loses the sign information; when a "singular value" comes out negative, you conventionally make it positive and flip the sign of the corresponding singular vector, and that correction is not possible here. Is this a reasonable guess? Is there a way around it?


A = rand(8,8);
[U,S,~] = svd(A*A');   % S holds the squared singular values of A
[V,~,~] = svd(A'*A);   % V is computed independently of U
U*sqrt(S)*V' - A       % residual is far from zero
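
To make the failure more concrete, one can compare the $V$ obtained this way with the one returned by Matlab's svd(A); for a random $A$ the singular values are distinct, so in this example the columns agree only up to sign, and those signs are picked independently of the signs chosen for the columns of $U$. A quick sketch of this check, with illustrative variable names U0, V0, V1:

[U0,S0,V0] = svd(A);    % reference decomposition of A itself
[V1,~,~] = svd(A'*A);   % V obtained independently of U0
diag(V0'*V1)'           % entries are +/-1 (up to rounding): per-column sign flips
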
  • The problem is not simply that of sign errors. See this post. – Ben Grossmann May 14 '20 at 16:33
  • Long story short: you need a choice of $V$ that "matches" your choice of $U$. For an invertible matrix $A$, this means that we must have $$ USV^T = A \implies VSU^T = A^T \implies V = A^TUS^{-1}. $$ In Matlab, you can take V = A'*U/S – Ben Grossmann May 14 '20 at 16:39
  • Thank you, it sounds much more reasonable now that I see it explained! I could not find the other post; its title is too generic. – m.alessandrini May 14 '20 at 16:43
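
Following the fix in the comment, a minimal sketch of the corrected reconstruction (assuming $A$ is invertible, so $S$ has no zero entries; S2 is an illustrative name for the matrix of squared singular values):

A = rand(8,8);
[U,S2,~] = svd(A*A');   % eigendecomposition of A*A'; S2 = S^2
S = sqrt(S2);           % singular values of A
V = A'*U/S;             % V chosen consistently with U, i.e. V = A'*U*inv(S)
norm(U*S*V' - A)        % residual is now near machine precision

The key point is that $V$ is derived from the already-chosen $U$ rather than from a second, independent decomposition, so the per-column sign choices (and any rotations within eigenspaces of repeated singular values) automatically match.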
