
I have written an algorithm to compute the SVD of a 2x2 matrix. I was checking against a Mathematica query, and I noticed that the signs in the $U$ and $V$ matrices do not match those from my algorithm. Does this matter, or is it still a valid decomposition?

I plan to use the SVD to compute the polar decomposition as well. Do the SVD signs matter for use in the polar decomposition?
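For context on the polar-decomposition question: writing the SVD as $A = U\Sigma V^t$, the polar decomposition can be formed as $A = RP$ with $R = UV^t$ (orthogonal) and $P = V\Sigma V^t$ (symmetric positive semidefinite), and a joint sign flip of matching columns of $U$ and $V$ cancels in both products. A minimal numeric check (plain Python; the hand-computed SVD of $\begin{bmatrix}1 & 1 \\ 1 & 0\end{bmatrix}$ is used as test data, so the golden ratio `phi` appears):

```python
import math

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(X):
    return [[X[j][i] for j in range(2)] for i in range(2)]

# A valid SVD of A = [[1, 1], [1, 0]], worked out by hand
phi = (1 + math.sqrt(5)) / 2
n = math.hypot(1.0, phi - 1)
c, d = 1 / n, (phi - 1) / n
U = [[c, d], [d, -c]]
S = [[phi, 0.0], [0.0, phi - 1]]
V = [[c, -d], [d, c]]

def polar(U, S, V):
    # A = R P with R = U V^t orthogonal, P = V S V^t symmetric PSD
    R = matmul(U, transpose(V))
    P = matmul(matmul(V, S), transpose(V))
    return R, P

R1, P1 = polar(U, S, V)

# Flip the SECOND column of both U and V: still a valid SVD of the same A
U2 = [[row[0], -row[1]] for row in U]
V2 = [[row[0], -row[1]] for row in V]
R2, P2 = polar(U2, S, V2)
# R1 == R2 and P1 == P2: the SVD sign choice does not affect the polar factors
```

So as long as the column signs of $U$ and $V$ are flipped in matching pairs, the polar factors come out identical.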

Here is an outline of the algorithm:

//Compute A^T*A
a = A(0,0)*A(0,0) + A(1,0)*A(1,0)
b = A(0,1)*A(0,1) + A(1,1)*A(1,1)
c = A(0,0)*A(0,1) + A(1,0)*A(1,1)

//Compute eigenvalues
root = sqrt((a-b)*(a-b) + 4*c*c)
eig1 = (a+b+root)/2
eig2 = (a+b-root)/2

//Compute singular values for "Sigma" matrix
s1 = sqrt(eig1)
s2 = sqrt(eig2)

//Compute eigenvectors for "V" matrix
eigvec1 = (c, eig1-a)
eigvec1.normalize()
eigvec2 = (-eigvec1(1), eigvec1(0))

//Compute columns of "U" matrix
col1 = (
    (A(0,0)*eigvec1(0) + A(0,1)*eigvec1(1)) / s1,
    (A(1,0)*eigvec1(0) + A(1,1)*eigvec1(1)) / s1,
)
col2 = (-col1(1), col1(0))
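For reference, here is a runnable Python sketch of the outline above (assuming a nonzero input matrix). It deviates in one place: the second column of $U$ is computed directly as $Av_2/s_2$ rather than by rotating the first column, since the rotation shortcut `col2 = (-col1(1), col1(0))` forces $\det U = +1$ and therefore picks the wrong sign whenever $\det A < 0$:

```python
import math

def svd2x2(A):
    """SVD of a nonzero 2x2 matrix A (nested lists); returns U, Sigma, V
    as column tuples. Follows the outline above via the entries of A^T A."""
    # Entries of the symmetric matrix A^T A: [[a, c], [c, b]]
    a = A[0][0]**2 + A[1][0]**2
    b = A[0][1]**2 + A[1][1]**2
    c = A[0][0]*A[0][1] + A[1][0]*A[1][1]

    # Eigenvalues of A^T A; the singular values are their square roots
    root = math.sqrt((a - b)**2 + 4*c*c)
    eig1 = (a + b + root) / 2
    eig2 = (a + b - root) / 2
    s1 = math.sqrt(eig1)
    s2 = math.sqrt(max(eig2, 0.0))    # clamp tiny negative round-off

    # Right singular vectors (columns of V); the zero-norm fallback fires
    # only when A^T A is already diagonal with a >= b, where (1, 0) is correct
    x, y = c, eig1 - a
    n = math.hypot(x, y)
    v1 = (x / n, y / n) if n > 0 else (1.0, 0.0)
    v2 = (-v1[1], v1[0])

    # Left singular vectors: u_i = A v_i / s_i
    u1 = ((A[0][0]*v1[0] + A[0][1]*v1[1]) / s1,
          (A[1][0]*v1[0] + A[1][1]*v1[1]) / s1)
    if s2 > 1e-12 * s1:
        u2 = ((A[0][0]*v2[0] + A[0][1]*v2[1]) / s2,
              (A[1][0]*v2[0] + A[1][1]*v2[1]) / s2)
    else:
        u2 = (-u1[1], u1[0])          # rank deficient: sign of u2 is free
    return (u1, u2), (s1, s2), (v1, v2)
```

With this change, the matrix $\begin{bmatrix}1 & 1 \\ 1 & 0\end{bmatrix}$ (determinant $-1$) reconstructs correctly, because $U$ is allowed to be a reflection.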

Edit:

I just found out that this algorithm won't work with the matrix: $\begin{bmatrix}1 & 1 \\ 1 & 0\end{bmatrix}$. The $U$ and $V$ matrices are correct, except the signs. How do I correct them to give proper signs?

Azmisov
  • Okay, so I've done some more research, and I came across this question: http://stackoverflow.com/questions/16935181/svd-value-different-between-matlab-2011b-and-2012b. The answer there says the signs do not matter. – Azmisov May 16 '14 at 04:22
  • The updated algorithm is given at scicomp.SE: http://scicomp.stackexchange.com/questions/8899/robust-algorithm-for-2x2-svd – Azmisov May 17 '14 at 20:18
  • Did you follow any standard textbook or a paper in the design of your algorithms? – Vini Mar 13 '17 at 16:20
  • @Vini My answer to this question, http://scicomp.stackexchange.com/a/11646/8196, lists two papers that I used (not technical papers though). The algorithm I have here is the standard way to solve a matrix by hand (which ends up being fastest for 2x2 matrices). I mostly used the papers for optimizations when the matrix is diagonal, symmetric, zero roots, etc. – Azmisov Mar 13 '17 at 19:08
  • The sign ambiguity is resolved here: https://math.stackexchange.com/questions/1805191/calculating-svd-by-hand-resolving-sign-ambiguities-in-the-range-vectors/2239200#2239200 – dantopa Mar 16 '18 at 00:18

1 Answer


It is a bad idea to compute the SVD of a matrix by forming $A^tA$ and $AA^t$. Numerical analysts normally apply orthogonal transformations to reduce $A$ to bidiagonal form; the SVD of the bidiagonal matrix is then computed using standard algorithms such as the QR algorithm, divide and conquer, or bisection. Please see a standard textbook such as "Matrix Computations" by Gene Golub and Charles Van Loan.

For your particular problem, the matrix is already symmetric. In that case we can start from the spectral decomposition $$ A = U \Lambda U^t, $$ where $\Lambda$ is the diagonal eigenvalue matrix and the orthogonal $U$ is the eigenvector matrix. Since $U$ is orthogonal, we have $U^tU = I$, where $I$ is the identity matrix. Here $$ \Lambda = \left[ \begin{array}{cc} -0.6180 & 0 \\ 0 & 1.6180 \end{array} \right], \qquad U = \left[ \begin{array}{cc} 0.5257 & -0.8507 \\ -0.8507 & -0.5257 \end{array} \right]. $$

We first want the eigenvalues in decreasing order. Permuting the columns of $U$ and applying the same permutation to $\Lambda$, we have $$ \Lambda = \left[ \begin{array}{cc} 1.6180 & 0 \\ 0 & -0.6180 \end{array} \right], \qquad U = \left[ \begin{array}{cc} -0.8507 & 0.5257 \\ -0.5257 & -0.8507 \end{array} \right]. $$

Note that $A=U\Lambda U^t$ is still a spectral decomposition. However, in the SVD all the singular values have to be non-negative. To fix that, we multiply the second column of $\Lambda$ by $-1$, and to compensate that action we multiply the second row of $U^t$ by $-1$; call the resulting matrix $V^t$. Now we have the SVD of $A$, $$ A = U\Sigma V^t, $$ where $$ \Sigma = \left[ \begin{array}{cc} 1.6180 & 0 \\ 0 & 0.6180 \end{array} \right] $$
$$ V = \left[ \begin{array}{cc} -0.8507 & -0.5257 \\ -0.5257 & 0.8507 \end{array} \right]. $$
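The construction above can be checked numerically. A small sketch in plain Python that rebuilds the same decomposition (up to the arbitrary column signs; for this particular matrix the eigenvectors have the convenient closed form $x \propto (\lambda, 1)$):

```python
import math

# The symmetric matrix from the question's edit
A = [[1.0, 1.0], [1.0, 0.0]]

# Eigenvalues of A (roots of lambda^2 - lambda - 1 = 0), largest first
lam1 = (1 + math.sqrt(5)) / 2     #  1.6180...
lam2 = (1 - math.sqrt(5)) / 2     # -0.6180...

def unit(x, y):
    n = math.hypot(x, y)
    return (x / n, y / n)

# Unit eigenvectors: for this A, (A - lam*I) x = 0 gives x proportional
# to (lam, 1)
u1 = unit(lam1, 1.0)
u2 = unit(lam2, 1.0)

# Singular values are |lam_i|; wherever lam_i < 0, negate the matching
# RIGHT singular vector to compensate for the sign change
s1, s2 = abs(lam1), abs(lam2)
v1 = u1
v2 = (-u2[0], -u2[1])
# Now A = s1 * u1 v1^t + s2 * u2 v2^t is a valid SVD with s1 >= s2 >= 0
```

This is exactly the permute-then-fix-signs procedure described above, just carried out mechanically.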

Now, I hope you understand the difference between the spectral decomposition and the SVD of a symmetric matrix. The spectral decomposition and the SVD of a real symmetric matrix have common properties, but we have to be careful when we use them.

(a) The singular values are always ordered non-increasingly. The eigenvalues of a real symmetric matrix are (usually) ordered from smallest to largest.

(b) If $x$ is an eigenvector of the matrix $A$, then $-x$ is also a valid eigenvector. The same is true of the singular vectors. However, if the sign of a column of $U$ is changed, then the sign of the corresponding column of $V$ has to be changed as well.
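Point (b) is easy to verify numerically. Using the hand-computed singular vectors of the worked example above (one consistent sign choice; `phi` is the golden ratio), flipping a column of $U$ alone breaks the reconstruction, while flipping the matching column of $V$ as well restores it:

```python
import math

# An SVD of A = [[1, 1], [1, 0]], up to a choice of column signs
phi = (1 + math.sqrt(5)) / 2
n = math.hypot(1.0, phi - 1)
c, d = 1 / n, (phi - 1) / n
s = [phi, phi - 1]                  # singular values 1.6180..., 0.6180...
U = [[c, d], [d, -c]]               # left singular vectors as columns
V = [[c, -d], [d, c]]               # right singular vectors as columns

def recon(U, s, V):
    # Rebuild sum_k s_k * u_k v_k^t
    return [[sum(s[k] * U[i][k] * V[j][k] for k in range(2))
             for j in range(2)] for i in range(2)]

A = [[1.0, 1.0], [1.0, 0.0]]
flip = lambda M: [[row[0], -row[1]] for row in M]   # negate 2nd column

both = recon(flip(U), s, flip(V))   # flip U and V together: still equals A
only_u = recon(flip(U), s, V)       # flip U alone: no longer equals A
```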

Vini
  • Interesting thoughts. In regards to it being a bad idea to solve via $A^TA$, I agree in the general case. However, for 2x2/3x3 matrices, I have read that there are few enough terms that the explicit formula will be just as numerically robust as any other. I have also read that most divide-and-conquer methods reduce to explicitly computing 2x2 matrices anyways. – Azmisov Mar 17 '17 at 16:42
  • I would like to see the citations you have mentioned. – Vini Mar 18 '17 at 09:52
  • I would like to see the citations you have mentioned. Usually, in the $2 \times 2$ case, the matrix is transformed to a $2 \times 2$ upper triangular matrix using a Givens rotation. Then it is carefully diagonalised using an SVD algorithm. I would not form $A'A$ or $AA'$ in the $3 \times 3$ case either. There is a good implementation of the $2 \times 2$ case in LAPACK. – Vini Mar 18 '17 at 10:00
  • You certainly could be right. This was three years ago, so I'm not sure where I read it and I don't really care enough to go find citations. – Azmisov Mar 19 '17 at 22:43