
Suppose we have the following matrix
$ A= \left[ {\begin{array}{cc} 2 & 2 \\ -1 & 1 \\ \end{array} } \right] $

and I want to compute the SVD of this matrix. First I calculated
$A A'$, which is equal to

$ \left[ {\begin{array}{cc} 8 & 0 \\ 0 & 2 \\ \end{array} } \right] $

Because it is a symmetric matrix, I can decompose it using the eigenvalue decomposition; its eigenvalues are

$8$ and $2$, while the eigenvectors are

$ U= \left[ {\begin{array}{cc} 1 & 0 \\ 0 & 1 \\ \end{array} } \right] $

For the right singular vectors I used $A' A$, which is equal to

$ \left[ {\begin{array}{cc} 5 & 3 \\ 3 & 5 \\ \end{array} } \right] $

The eigenvectors of this matrix are

$ V=\left[ {\begin{array}{cc} 1 & -1 \\ 1 & 1 \\ \end{array} } \right] $

and the diagonal matrix of singular values is

$ E=\left[ {\begin{array}{cc} \sqrt{8} & 0 \\ 0 & \sqrt{2} \\ \end{array} } \right] $ so the original matrix should equal

$A=U*E*V'$

But this gives me a different result. Why? Where am I making a mistake?
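
The computation in the question can be reproduced numerically (a minimal NumPy sketch using the matrices above):

```python
import numpy as np

A = np.array([[2.0, 2.0],
              [-1.0, 1.0]])

U = np.eye(2)                        # eigenvectors of A A'
E = np.diag([np.sqrt(8), np.sqrt(2)])
V = np.array([[1.0, -1.0],           # eigenvectors of A' A, not normalized
              [1.0,  1.0]])

print(U @ E @ V.T)                   # differs from A
```

Running this shows the product does not reproduce $A$.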

2 Answers


Compute the singular value decomposition of a matrix $\mathbf{A}\in\mathbb{C}^{m\times n}_{\rho}$ $$ \mathbf{A} = \mathbf{U} \, \Sigma \, \mathbf{V}^{*} = % U \left[ \begin{array}{cc} \color{blue}{\mathbf{U}_{\mathcal{R}}} & \color{red}{\mathbf{U}_{\mathcal{N}}} \end{array} \right] % Sigma \left[ \begin{array}{cc} \mathbf{S}_{\rho\times \rho} & \mathbf{0} \\ \mathbf{0} & \mathbf{0} \end{array} \right] % V \left[ \begin{array}{c} \color{blue}{\mathbf{V}_{\mathcal{R}}}^{*} \\ \color{red}{\mathbf{V}_{\mathcal{N}}}^{*} \end{array} \right] \\ \\ \tag{1} $$ The beauty of the SVD is that it provides an orthonormal basis for the four fundamental subspaces of a matrix $\mathbf{A}\in\mathbb{C}^{m\times n}$: $$ \begin{align} % \mathbb{C}^{n} = \color{blue}{\mathcal{R} \left( \mathbf{A}^{*} \right)} \oplus \color{red}{\mathcal{N} \left( \mathbf{A} \right)} \\ % \mathbb{C}^{m} = \color{blue}{\mathcal{R} \left( \mathbf{A} \right)} \oplus \color{red} {\mathcal{N} \left( \mathbf{A}^{*} \right)} % \end{align} $$ To compute the SVD:

  1. Resolve the domain by finding eigenvectors of $\mathbf{A}^{*}\mathbf{A}$. Outputs: matrix of singular values $\mathbf{S}$, $\color{blue}{\mathbf{V}_{\mathcal{R}}}$.
  2. Compute $\color{blue}{\mathbf{U}_{\mathcal{R}}}$ using $\mathbf{S}$ and $\color{blue}{\mathbf{V}_{\mathcal{R}}}$.
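
These two steps can be sketched end to end (a NumPy sketch, assuming the example matrix from the question):

```python
import numpy as np

A = np.array([[2.0, 2.0],
              [-1.0, 1.0]])

# Step 1: eigendecomposition of A*A gives S and V_R
evals, V = np.linalg.eigh(A.T @ A)   # eigh returns ascending eigenvalues
order = np.argsort(evals)[::-1]      # reorder to descending
evals, V = evals[order], V[:, order]
S = np.diag(np.sqrt(evals))

# Step 2: U_R = A V_R S^{-1}
U = A @ V @ np.linalg.inv(S)

print(np.allclose(U @ S @ V.T, A))   # True
```

Note that `eigh` normalizes the eigenvector columns automatically, which is exactly the step missed in the question.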


1. Resolve $\ \color{blue}{\mathcal{R} \left( \mathbf{A}^{*} \right)}$

Step 1:

Compute product matrix $$ % \begin{align} % \mathbf{W} = \mathbf{A}^{T}\mathbf{A} = % \left[ \begin{array}{cr} 2 & -1 \\ 2 & 1 \\ \end{array} \right] \left[ \begin{array}{rr} 2 & 2 \\ -1 & 1 \\ \end{array} \right] % = % \left[ \begin{array}{cc} 5 & 3 \\ 3 & 5 \\ \end{array} \right] % \end{align} % $$
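
As a quick numerical check of this product (NumPy, with the matrix $\mathbf{A}$ from the question):

```python
import numpy as np

A = np.array([[2.0, 2.0],
              [-1.0, 1.0]])

W = A.T @ A
print(W)   # [[5, 3], [3, 5]]
```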

Step 2:

Compute eigenvalue spectrum $\lambda \left(\mathbf{W}\right)$

$$ \det \mathbf{W} = 16, \qquad \text{trace } \mathbf{W} = 10 $$ The characteristic polynomial is $$ p(\lambda) = \lambda^{2} - \lambda \text{ trace } \mathbf{W} + \det \mathbf{W} = \lambda ^2-10 \lambda +16 = \left( \lambda - 8 \right) \left( \lambda - 2 \right) $$ The roots of the $p(\lambda)$ are the eigenvalues of $\mathbf{W}$: $$ \lambda \left(\mathbf{W}\right) = \left\{ 8, 2 \right\} $$
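
The roots of the characteristic polynomial can be confirmed numerically (a NumPy sketch):

```python
import numpy as np

W = np.array([[5.0, 3.0],
              [3.0, 5.0]])

# roots of p(lambda) = lambda^2 - 10*lambda + 16
roots = np.roots([1, -10, 16])
print(np.sort(roots))            # 2 and 8
print(np.linalg.eigvalsh(W))     # same spectrum, ascending order
```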

Step 3:

Compute singular value spectrum $\sigma$

To obtain the singular values, form $\tilde{\lambda}$, a list arranged in decreasing order with $0$ values culled: $$ \sigma = \sqrt{\tilde{\lambda}} = \left\{ 2\sqrt{2}, \sqrt{2} \right\} $$ The singular values are the diagonal entries of $\mathbf{S}$: $$ \boxed{ \mathbf{S} = \sqrt{2}\left[ \begin{array}{cc} 2 & 0 \\ 0 & 1 \\ \end{array} \right] } $$
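
This step can be checked against a library SVD (a NumPy sketch with the question's matrix):

```python
import numpy as np

A = np.array([[2.0, 2.0],
              [-1.0, 1.0]])

# singular values = square roots of eigenvalues of A'A, in decreasing order
sigma = np.sqrt(np.sort(np.linalg.eigvalsh(A.T @ A))[::-1])
print(sigma)                                # [2*sqrt(2), sqrt(2)]
print(np.linalg.svd(A, compute_uv=False))   # same values
```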

Step 4:

Compute eigenvectors of $\mathbf{W}$

Fundamental tool: eigenvalue equation $$ \mathbf{W} v_{k} = \lambda_{k} v_{k}, \qquad k = 1, 2 $$

$k=1$: $$ % \begin{align} % \mathbf{W} v_{1} &= \lambda_{1} v_{1} \\ % \left[ \begin{array}{cc} 5 & 3 \\ 3 & 5 \\ \end{array} \right] % \left[ \begin{array}{c} x \\ y \\ \end{array} \right] % &= % 8 % \left[ \begin{array}{c} x \\ y \\ \end{array} \right] \\[3pt] % % % \left[ \begin{array}{c} 5 x + 3 y \\ 3 x + 5 y \\ \end{array} \right] &= \left[ \begin{array}{c} 8x \\ 8y \\ \end{array} \right]\\[3pt] % % % \left[ \begin{array}{c} x \\ y \\ \end{array} \right] &= \left[ \begin{array}{c} 1 \\ 1 \\ \end{array} \right] % \end{align} % $$ The normalized vector is the first column vector in $\color{blue}{\mathbf{V}_{\mathcal{R}}}$. $$ \hat{v}_{1} = \frac{1}{\sqrt{2}} \left[ \begin{array}{r} 1 \\ 1 \\ \end{array} \right] $$

$k=2$: $$ % \begin{align} % \mathbf{W} v_{2} &= \lambda_{2} v_{2} \\ % \left[ \begin{array}{cc} 5 & 3 \\ 3 & 5 \\ \end{array} \right] % \left[ \begin{array}{c} x \\ y \\ \end{array} \right] % &= % 2 % \left[ \begin{array}{c} x \\ y \\ \end{array} \right] \\[3pt] % % % \left[ \begin{array}{c} 5 x + 3 y \\ 3 x + 5 y \\ \end{array} \right] &= \left[ \begin{array}{c} 2x \\ 2y \\ \end{array} \right]\\[3pt] % % % \left[ \begin{array}{c} x \\ y \\ \end{array} \right] &= \left[ \begin{array}{c} -1 \\ 1 \\ \end{array} \right] % \end{align} % $$ The normalized vector is the second column vector in $\color{blue}{\mathbf{V}_{\mathcal{R}}}$. $$ \hat{v}_{2} = \frac{1}{\sqrt{2}} \left[ \begin{array}{r} -1 \\ 1 \\ \end{array} \right] $$ Assemble: $$ \boxed{ \color{blue}{\mathbf{V}_{\mathcal{R}}} = \frac{1}{\sqrt{2}} % \left[ \begin{array}{cr} 1 & -1 \\ 1 & 1 \\ \end{array} \right] } $$
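
Both normalized eigenvectors satisfy the eigenvalue equation, which is easy to verify (NumPy sketch):

```python
import numpy as np

W = np.array([[5.0, 3.0],
              [3.0, 5.0]])

v1 = np.array([1.0, 1.0]) / np.sqrt(2)
v2 = np.array([-1.0, 1.0]) / np.sqrt(2)

print(np.allclose(W @ v1, 8 * v1))   # True
print(np.allclose(W @ v2, 2 * v2))   # True
```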

2. Resolve $\ \color{blue}{\mathcal{R} \left( \mathbf{A} \right)}$

Rearrange (1) to recover $$ \color{blue}{\mathbf{U}_{\mathcal{R}}} = \mathbf{A} \color{blue}{\mathbf{V}_{\mathcal{R}}} \mathbf{S}^{-1} $$

The power of the SVD is that it aligns the $\color{blue}{range}$ spaces and accounts for scale differences. This allows direct computation using equation (1): $$ \begin{align} \color{blue}{\mathbf{U}_{\mathcal{R}}} = \mathbf{A} \color{blue}{\mathbf{V}_{\mathcal{R}}} \mathbf{S}^{-1} % &= \left[ \begin{array}{rc} 2 & 2 \\ -1 & 1 \\ \end{array} \right] % \frac{1}{\sqrt{2}} \left[ \begin{array}{cr} 1 & -1 \\ 1 & 1 \\ \end{array} \right] % Sinv \left[ \begin{array}{cc} \frac{1}{2 \sqrt{2}} & 0 \\ 0 & \frac{1}{\sqrt{2}} \\ \end{array} \right] % \end{align} % $$ At last, $$ \boxed{ \color{blue}{\mathbf{U}_{\mathcal{R}}} = \left[ \begin{array}{cc} 1 & 0 \\ 0 & 1 \\ \end{array} \right] } $$
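
The recovery of $\color{blue}{\mathbf{U}_{\mathcal{R}}}$ can be reproduced numerically (a NumPy sketch using the matrices computed above):

```python
import numpy as np

A = np.array([[2.0, 2.0],
              [-1.0, 1.0]])
V = np.array([[1.0, -1.0],
              [1.0,  1.0]]) / np.sqrt(2)
S = np.diag([2 * np.sqrt(2), np.sqrt(2)])

U = A @ V @ np.linalg.inv(S)
print(U)                                 # identity (up to rounding)
print(np.allclose(U.T @ U, np.eye(2)))   # True: columns are orthonormal
```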


Final answer

$$ \mathbf{A} = \color{blue}{\mathbf{U}_{\mathcal{R}}} \mathbf{S} \color{blue}{\mathbf{V}_{\mathcal{R}}}^{*} = % \left[ \begin{array}{cc} 1 & 0 \\ 0 & 1 \\ \end{array} \right] % \sqrt{2} \left[ \begin{array}{cc} 2 & 0 \\ 0 & 1 \\ \end{array} \right] \frac{1}{\sqrt{2}} % \left[ \begin{array}{cr} 1 & -1 \\ 1 & 1 \\ \end{array} \right] % % $$
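
A final numerical check that the factorization reproduces the original matrix (NumPy sketch):

```python
import numpy as np

U = np.eye(2)
S = np.sqrt(2) * np.diag([2.0, 1.0])
V = np.array([[1.0, -1.0],
              [1.0,  1.0]]) / np.sqrt(2)

A = U @ S @ V.T
print(A)   # [[2, 2], [-1, 1]]
```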

dantopa

First problem: selecting your $U$ determines the matrix $V$. After you find $U$, all that remains is $$ A = U \Sigma V' \implies V = \Sigma^{-1}U'A $$ The second problem is that when you found $V$, you didn't normalize the columns. This is not strictly relevant, however, since your method of finding $V$ in the first place was incorrect.
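
This relation can be sketched numerically (NumPy, assuming the $U$ and $\Sigma$ obtained from $A A'$ in the question):

```python
import numpy as np

A = np.array([[2.0, 2.0],
              [-1.0, 1.0]])

# U and Sigma as found from the eigendecomposition of A A'
U = np.eye(2)
Sigma = np.diag([2 * np.sqrt(2), np.sqrt(2)])

# V is then determined by U, not chosen independently:
V = (np.linalg.inv(Sigma) @ U.T @ A).T

print(np.allclose(U @ Sigma @ V.T, A))   # True
print(np.allclose(V.T @ V, np.eye(2)))   # True: V comes out orthonormal
```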

See my answer here for a more thorough explanation. My answer is about how one can compute the SVD by finding $V$ first, but the idea is the same.

Ben Grossmann