I'm studying the implementation of several SVD algorithms. Some of them require a symmetric matrix, so you decompose $AA^T$ or $A^{T}A$ and recover the singular values of $A$ as the square roots of the eigenvalues.
Suppose $A$ is a real square $n \times n$ matrix with SVD $A=USV^T$. From the two eigendecompositions $AA^T = US^{2}U^T$ and $A^{T}A = VS^{2}V^T$ you can recover $U$, $V$ and $S$.
The problem is that the reconstruction $A=USV^T$ fails in practice (see the sample Matlab code below).
My hypothesis: taking the square root of a squared value loses the information about its sign. Normally, when a singular value comes out negative, the convention is to make it positive and flip the sign of the corresponding singular vector, but that is not possible here. Is this a reasonable guess? Is there a way to fix it in this case?
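One way to make the sign issue precise (my framing, assuming the eigenvalues of $AA^T$ are distinct, so eigenvectors are unique up to sign): if $D_1$ and $D_2$ are any diagonal matrices with $\pm 1$ entries, then

$$AA^T = (UD_1)S^{2}(UD_1)^T, \qquad A^{T}A = (VD_2)S^{2}(VD_2)^T,$$

so the two independent calls to `svd` are free to return $\tilde U = UD_1$ and $\tilde V = VD_2$ with $D_1 \neq D_2$, and then

$$\tilde U S \tilde V^T = U (D_1 S D_2) V^T \neq USV^T = A$$

in general, because the column signs no longer match up.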
A = rand(8,8);
[U,S,~] = svd(A*A');   % S holds the squared singular values of A
[V,~,~] = svd(A'*A);   % V computed independently of U
U*sqrt(S)*V' - A       % residual: generally far from zero
V = A'*U/S
– Ben Grossmann May 14 '20 at 16:39
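The comment's fix follows directly from the SVD: multiplying $A = USV^T$ on the left by $A^T$'s factors gives $A^T U = VSU^TU = VS$, so $V = A^T U S^{-1}$ whenever $A$ (and hence $S$) is invertible. A quick sketch of the check (here `S2` is the eigenvalue matrix returned by `svd(A*A')`, so its square root holds the singular values):

```matlab
A = rand(8,8);
[U,S2,~] = svd(A*A');   % S2 contains the squared singular values
S = sqrt(S2);           % singular values of A
V = (A'*U)/S;           % Ben Grossmann's fix: V = A'*U*inv(S)
norm(U*S*V' - A)        % residual now on the order of machine precision
```

This avoids the second eigendecomposition entirely, so there is no independent sign choice left to go wrong.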