
Is there a correct/stable way of dealing with repeated eigenvalues in S?

I was working on a Moore-Penrose inverse in a library for a programming language at work. The dummy matrix I picked turned out to be an edge case for the library's SVD decomposition:

$$\begin{pmatrix}0&1&0&0\\ 1&0&0&0\\ 0&0&1&0\end{pmatrix}$$

The (eigenvalue, eigenvector) pairs of MMᵀ are: (1, <1,0,0>), (1, <0,1,0>), (1, <0,0,1>)

And for MᵀM: (1, <1,0,0,0>), (1, <0,1,0,0>), (1, <0,0,1,0>), (0, <0,0,0,1>)

(Order matches what is returned by the library's eigenvector method)

The library takes the eigenvectors in the order given, constructs U and V from them as column vectors, and then solves for S. The way it solves for S doesn't seem right to me (it doesn't take the inverse of U or V, though for orthogonal matrices the transpose is the inverse), but it somehow works in general: $$S = U^TMV$$ The problem is that the product it comes up with here is just the original matrix sandwiched between two identity matrices:

$$\begin{pmatrix}1&0&0\\0&1&0\\0&0&1\end{pmatrix} \begin{pmatrix}0&1&0&0\\1&0&0&0\\0&0&1&0\end{pmatrix} \begin{pmatrix}1&0&0&0\\0&1&0&0\\0&0&1&0\\0&0&0&1\end{pmatrix}$$

And that S obviously isn't diagonal. Wolfram, of course, returns an S that is.
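To make this concrete, here's a minimal NumPy sketch of the same construction (the library isn't named, so this only reconstructs the described behavior; U and V are built directly from the eigenvectors listed above, in that order):

```python
import numpy as np

M = np.array([[0., 1., 0., 0.],
              [1., 0., 0., 0.],
              [0., 0., 1., 0.]])

# Eigenvectors of M M^T and M^T M as columns, in the order listed
# above: both sets happen to be the standard basis.
U = np.eye(3)
V = np.eye(4)

S = U.T @ M @ V
print(S)   # identical to M: a valid factorization, but not diagonal
```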

One thing I'm not clear on is whether the values on the diagonal of S are the eigenvalues of MᵀM/MMᵀ or of U/V, but either way, in this case they're all 1 (except for the 0 from MᵀM). I understand that I can permute S to be diagonal so long as the eigenvalues stay in descending order and the corresponding eigenvectors are permuted along with them. What I'm not sure about is whether U, V, or both need to be permuted to stay consistent. All I could find was an excerpt from this question:

> If Σ has repeating diagonal elements, much more can be done to change U and V (for example, one or both can permute corresponding columns).

In this case, if I swap the first and second rows of S and the corresponding columns of U, I get a valid U matrix and a diagonal S matrix. But I feel like that procedure isn't going to hold in the general case, and even then, it's very different from what Wolfram spits out.
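For what it's worth, that swap does check out on this example. A self-contained sketch (same reconstruction as above):

```python
import numpy as np

M = np.array([[0., 1., 0., 0.],
              [1., 0., 0., 0.],
              [0., 0., 1., 0.]])
U, V = np.eye(3), np.eye(4)
S = U.T @ M @ V                # equals M here, not diagonal

swap = [1, 0, 2]               # exchange the first two indices
S2 = S[swap, :]                # row swap: diag(1, 1, 1) plus a zero column
U2 = U[:, swap]                # compensating column swap keeps U orthogonal
assert np.allclose(M, U2 @ S2 @ V.T)
```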

So, to repeat the original question, how does one consistently deal with the SVD of a matrix when it has repeated eigenvalues?


1 Answer


I was able to get an answer from a PhD on the mathematics Discord, so I'll just leave it here:

"You can choose here whether you want to permute rows or columns, which is equivalent to multiplying by a permutation matrix on the left or right respectively. Suppose you want to multiply on the left, so PS is diagonal. Then, since $M = USV^T$, then also $M = UPP^{-1}SV^T$, and your SVD decomposition uses the three matrices $UP^{-1}$, $SV$, and $V$. Similarly, if you want to multiply on the right by a permutation, $Q$, then SQ is diagonal. Then also $M = USQQ^{-1}V^T$."

With that, you can work out a bit of an algorithm (a code sketch follows the list):

  1. Take the larger of $S$'s two dimensions, $d = \max(m, n)$.
  2. Extract the vectors along that dimension into an array: if $S$ has more columns than rows, extract its column vectors, otherwise its row vectors.
  3. Move the all-zero vectors to the end (if there is a 0 eigenvalue, this won't affect anything).
  4. Rearrange the remaining vectors so that each one's non-zero entry lands on the diagonal.
  5. Applying those same reordering operations to a $d \times d$ identity matrix yields the corresponding permutation matrix, $P$.
  6. Multiply $S$ by $P$ to get the new $S$ matrix; whether to pre- or post-multiply is determined by the dimensions of $P$.
  7. Lastly, post-multiply $U$ by $P^{-1}$, or pre-multiply $V^T$ by $P^{-1}$ (equivalently, post-multiply $V$ by $P$), choosing based on which one matches the dimensions of $P$; the result is still a unitary matrix.
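Here's a sketch of those steps in NumPy. Everything about it (the function name, the tolerance parameter, the assumption that S is a permuted diagonal with at most one non-negligible entry per row and per column) is my own framing of the recipe, not the original library's API:

```python
import numpy as np

def diagonalize_S(U, S, V, tol=1e-12):
    """Permute a 'shuffled-diagonal' S into a proper diagonal while
    keeping U @ S @ V.T unchanged. Returns (U2, S2, V2)."""
    m, n = S.shape
    if n >= m:
        # Steps 1-2: more columns than rows, so work with S's columns.
        # Steps 3-4: the column holding row i's non-zero goes to slot i;
        # an all-zero row gets a zero column; leftover columns go last.
        zero_cols = (j for j in range(n)
                     if not np.any(np.abs(S[:, j]) > tol))
        perm = []
        for i in range(m):
            nz = np.flatnonzero(np.abs(S[i]) > tol)
            perm.append(int(nz[0]) if nz.size else next(zero_cols))
        perm += [j for j in range(n) if j not in perm]
        Q = np.eye(n)[:, perm]           # step 5: permuted identity
        # Steps 6-7: the permutation is n x n, so S is post-multiplied
        # and V absorbs the inverse: M = U (S Q) (V Q)^T since Q^{-1} = Q^T.
        return U, S @ Q, V @ Q
    # Mirror image: permute S's rows and let U absorb the inverse,
    # M = (U P^T) (P S) V^T.
    zero_rows = (i for i in range(m)
                 if not np.any(np.abs(S[i]) > tol))
    perm = []
    for j in range(n):
        nz = np.flatnonzero(np.abs(S[:, j]) > tol)
        perm.append(int(nz[0]) if nz.size else next(zero_rows))
    perm += [i for i in range(m) if i not in perm]
    P = np.eye(m)[perm, :]               # step 5
    return U @ P.T, P @ S, V             # steps 6-7
```

On the question's matrix this produces a diagonal S without touching U:

```python
M = np.array([[0., 1., 0., 0.],
              [1., 0., 0., 0.],
              [0., 0., 1., 0.]])
U2, S2, V2 = diagonalize_S(np.eye(3), M, np.eye(4))
assert np.allclose(M, U2 @ S2 @ V2.T)   # S2 is diag(1, 1, 1) plus a zero column
```

Sorting the non-zero singular values into descending order is the same mechanic: fold the desired order into `perm` before building the permutation matrix.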