
From Singular Value Decomposition, we know that:

Any $m \times n$ matrix $A$ can be factored into $A=U\Sigma V^{T}$, where $U$ and $V$ are orthogonal, and $\Sigma$ is the same size as $A$, with all entries zero except down the main diagonal, where the successive entries are $\sigma_{1}\geq \dots \geq \sigma_{k} > 0$ for some $k$ with $k\leq \min(m,n)$.

To find $U$, $\Sigma$, and $V$ , we can consider $AA^{T}$ and $A^{T}A$ which are symmetric matrices.

$$AA^{T}=(U\Sigma V^{T})(V\Sigma^{T} U^{T})=U(\Sigma \Sigma^{T})U^{T} \qquad (\because V \text{ is orthogonal, so } V^{T}V=I_{n})$$
$$A^{T}A=(V\Sigma^{T} U^{T})(U\Sigma V^{T})=V(\Sigma^{T} \Sigma)V^{T} \qquad (\because U \text{ is orthogonal, so } U^{T}U=I_{m})$$

From the Spectral Theorem, I understand that, since $AA^{T}$ and $A^{T}A$ are symmetric matrices, $U$ must be the eigenvector matrix for $AA^{T}$ and $\Sigma\Sigma^{T}$ the eigenvalue matrix for $AA^{T}$, whereas $V$ must be the eigenvector matrix for $A^{T}A$ and $\Sigma^{T}\Sigma$ the eigenvalue matrix for $A^{T}A$.
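As a quick numerical sanity check of this relationship, one can compare the singular values of an arbitrary matrix against the eigenvalues of $A^{T}A$ and $AA^{T}$ with NumPy (the $3\times 2$ matrix below is just an example I made up):

```python
import numpy as np

# Arbitrary 3x2 example matrix (m = 3, n = 2)
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# Singular values, returned in descending order
sigma = np.linalg.svd(A, compute_uv=False)

# Eigenvalues of the symmetric matrices A^T A (2x2) and A A^T (3x3),
# sorted in descending order to match sigma
eig_AtA = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
eig_AAt = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]

print(np.allclose(sigma**2, eig_AtA))      # squares of singular values
print(np.allclose(sigma**2, eig_AAt[:2]))  # same nonzero eigenvalues
print(abs(eig_AAt[2]) < 1e-8)              # the extra eigenvalue is zero
```

The two products share the same *nonzero* eigenvalues; the larger product simply carries extra zero eigenvalues, exactly as H. H. Rugh's comment below points out.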

But I don't understand why the $k$ singular values on the diagonal of $\Sigma$ are the square roots of the nonzero eigenvalues of both $AA^{T}$ and $A^{T}A$. It seems like this is only true if $\Sigma\Sigma^{T}=\Sigma^{2}$ and $\Sigma^{T}\Sigma=\Sigma^{2}$. But $\Sigma\Sigma^{T}$ is an $m \times m$ matrix, whereas $\Sigma^{T}\Sigma$ is an $n \times n$ matrix. How can both of them be equal to $\Sigma^{2}$?

I'm so confused. :(

Elina
    It need only be true for the non-zero eigenvalues (you may try to figure out why; it is not that complicated). In fact, if $n>m$, then in one of the products you will necessarily have $n-m$ zero eigenvalues. – H. H. Rugh Nov 01 '16 at 14:01

2 Answers


Let's just multiply some matrices. \begin{align*} \begin{pmatrix} 2 & 0 & 0 \\ 0 & 3 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix} \begin{pmatrix} 2 & 0 & 0 & 0 \\ 0 & 3 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix} &= \begin{pmatrix} 4 & 0 & 0 & 0 \\ 0 & 9 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}, \\ \begin{pmatrix} 2 & 0 & 0 & 0 \\ 0 & 3 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix} \begin{pmatrix} 2 & 0 & 0 \\ 0 & 3 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix} &= \begin{pmatrix} 4 & 0 & 0 \\ 0 & 9 & 0 \\ 0 & 0 & 0 \end{pmatrix}. \end{align*} See how the non-zero diagonal entries of $\Sigma\Sigma^T$ and $\Sigma^T\Sigma$ agree?
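The same multiplication can be reproduced in NumPy for the identical $\Sigma$ (singular values 2 and 3, with $m=4$, $n=3$):

```python
import numpy as np

# Sigma: 4x3, zero everywhere except sigma_1 = 2 and sigma_2 = 3
Sigma = np.zeros((4, 3))
Sigma[0, 0], Sigma[1, 1] = 2.0, 3.0

SSt = Sigma @ Sigma.T  # 4x4 product Sigma Sigma^T
StS = Sigma.T @ Sigma  # 3x3 product Sigma^T Sigma

print(SSt.shape, StS.shape)  # different sizes...
print(np.diag(SSt))          # ...but the diagonals both start 4, 9
print(np.diag(StS))
```

The shapes differ, yet the nonzero diagonal entries ($4$ and $9$) coincide; only the number of trailing zeros changes.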

Christoph

The $\Sigma$ matrix is a sabot matrix which ensures conformability between $\mathbf{U}$ and $\mathbf{V}^{*}$. In block form, write it in terms of the diagonal matrix of singular values $\mathbf{S}_{\rho \times \rho}$, where $\rho$ is the matrix rank: $$ \Sigma = \left( \begin{array}{cc} \mathbf{S} & \mathbf{0} \\ \mathbf{0} & \mathbf{0} \\ \end{array} \right)_{m\times n}, \quad \Sigma^{\mathrm{T}} = \left( \begin{array}{cc} \mathbf{S} & \mathbf{0} \\ \mathbf{0} & \mathbf{0} \\ \end{array} \right)_{n\times m}, \quad \Sigma^{\dagger} = \left( \begin{array}{cc} \mathbf{S}^{-1} & \mathbf{0} \\ \mathbf{0} & \mathbf{0} \\ \end{array} \right)_{n\times m} $$

For example, if the target matrix $\mathbf{A}$ has full column rank ($\rho = n$), $$ \Sigma = \left( \begin{array}{c} \mathbf{S} \\ \mathbf{0} \end{array} \right)_{m\times n}, \quad \Sigma^{\mathrm{T}} = \left( \begin{array}{cc} \mathbf{S} & \mathbf{0} \\ \end{array} \right)_{n\times m}, \quad \Sigma^{\dagger} = \left( \begin{array}{cc} \mathbf{S}^{-1} & \mathbf{0} \\ \end{array} \right)_{n\times m} $$ and $$\Sigma^{\mathrm{T}}\Sigma = \mathbf{S}^{2}.$$
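A small NumPy check of the full-column-rank case (the singular values 5 and 2, with $m=4$, $n=2$, are arbitrary illustrative choices):

```python
import numpy as np

# Diagonal matrix of singular values, rho = n = 2
S = np.diag([5.0, 2.0])

# Stack S on top of a zero block to build the m x n sabot matrix
Sigma = np.vstack([S, np.zeros((2, 2))])  # shape (4, 2)

# Sigma^T Sigma collapses to the square of S
print(np.allclose(Sigma.T @ Sigma, S @ S))
```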

For an example showing $\Sigma$ gymnastics, see SVD and linear least squares problem.

dantopa
  • What does "sabot matrix" mean? I looked up the term on Google and all I got is your answers that use this word. Are you the only one in the universe who knows this word? – John Deterious Apr 04 '20 at 23:45
    @John Deterious: The sabot is the padding of zeros around the $\mathbf{S}$ matrix which ensures conformability with the domain matrices $\mathbf{U}$ and $\mathbf{V}$. – dantopa Apr 05 '20 at 18:41
  • I see. Still a particularly arcane piece of terminology. I like it. Thanks. – John Deterious Apr 05 '20 at 19:58