I'm trying to find the SVD of $$ \begin{pmatrix} 2&1&-2\\ \end{pmatrix} $$

I found $$\Sigma$$ and $$U$$.

But for the $V$ matrix I got $$ \begin{pmatrix} -\frac{2}{3} & \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{5}} \\ -\frac{1}{3} & 0 & \frac{2}{\sqrt{5}} \\ \frac{2}{3} & \frac{1}{\sqrt{2}} & 0 \end{pmatrix} $$

While Wolfram Alpha gives a different $V$.

I checked the eigenvectors on Wolfram Alpha as well, and they are correct: $$ \begin{pmatrix} -2 & 1 & -1 \\ -1 & 0 & 2 \\ 2 & 1 & 0 \end{pmatrix} $$

I realized that the 3rd column of $V$ (the one from Wolfram Alpha) is obtained by taking the cross product of the first two eigenvectors. Why is that?

John

2 Answers


A singular value decomposition of a $1\times 3$ matrix can be written almost "by inspection":

$$ \begin{pmatrix} 2 & 1 & -2 \end{pmatrix} = U \Sigma V^* $$

where $U,V$ are unitary (in this case, orthogonal) matrices and $\Sigma$ is a rectangular diagonal matrix with the required singular values on the diagonal:

$$ \Sigma = \begin{pmatrix} \lambda & 0 & 0 \end{pmatrix} $$

Since $U$ is a $1\times 1$ unitary matrix, actually $U = (1)$ is just the $1\times 1$ identity. $\Sigma$ is a $1\times 3$ matrix of equal Euclidean norm to $\begin{pmatrix} 2 & 1 & -2 \end{pmatrix}$, so $\Sigma = \begin{pmatrix} 3 & 0 & 0 \end{pmatrix}$.
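As a quick numerical sanity check, NumPy reproduces this by-inspection factorization (a sketch; the array names are mine):

```python
import numpy as np

A = np.array([[2.0, 1.0, -2.0]])   # the 1x3 matrix from the question

# full SVD: U is 1x1, s holds the singular values, Vt is V^* (3x3)
U, s, Vt = np.linalg.svd(A, full_matrices=True)

print(s)   # the single singular value equals the Euclidean norm of A, i.e. 3
# reconstruct A from the factors: only the first row of V^* contributes,
# since Sigma = [3 0 0] zeros out the other two rows
print(U @ (s[:, None] * Vt[:1, :]))
```

Note that NumPy may flip the signs of $U$ and the first row of $V^*$ together; the product is unchanged.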

The requirement that $V$ be unitary (or orthogonal) does not uniquely specify its entries. So long as:

$$ \begin{pmatrix} 3 & 0 & 0 \end{pmatrix} V^* = \begin{pmatrix} 2 & 1 & -2 \end{pmatrix} $$

it will meet the definition of a singular value decomposition, and from this we see that only the first column of $V$ (top row of $V^*$) is determined:

$$ V^* = \begin{pmatrix} \frac{2}{3} & \frac{1}{3} & -\frac{2}{3} \\ {-} & {-} & {-} \\ {-} & {-} & {-} \end{pmatrix} $$

The bottom two rows of $V^*$ can be filled in with any pair of unit vectors that make up an orthonormal basis for $\mathbb{R}^3$ together with the top row shown. Therefore it is not surprising that the third row might be the cross-product of the first two rows (since the third row will be determined, up to sign, by the first two rows).
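To see this concretely, one can complete the determined top row to an orthonormal basis and check the result; a small NumPy sketch (the vector names are mine):

```python
import numpy as np

v1 = np.array([2.0, 1.0, -2.0]) / 3.0      # the determined top row of V^*

# any unit vector orthogonal to v1 works as the second row
v2 = np.array([1.0, 0.0, 1.0]) / np.sqrt(2.0)

# the third row is then forced up to sign: the cross product of two
# orthonormal vectors is the unit vector completing the basis
v3 = np.cross(v1, v2)

V_star = np.vstack([v1, v2, v3])
print(V_star @ V_star.T)   # identity matrix: the rows are orthonormal
```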

hardmath

Problem

Detailed instructions for computing the SVD abound on Math Stack Exchange. For example: SVD and the columns, SVD -obligation of normalization, how SVD is calculated in reality, Pseudo-inverse of a matrix that is neither fat nor tall?.

This problem is straightforward. Let the target matrix be the covector $$ \mathbf{A} = \left[ \begin{array}{ccr} 2 & 1 & -2 \end{array} \right] $$

Find the singular value decomposition $$ \mathbf{A} = \mathbf{U} \, \Sigma \, \mathbf{V}^{*} $$

Methods

The first link presents a table showing two paths for finding the SVD for a matrix $\mathbf{A} \in \mathbb{C}^{m \times n}_{\rho}$:

$$ \begin{array}{lll} % \text{Operation} & \text{Row space first} & \text{Column space first} \\\hline % \text{1. Construct product matrix} & \mathbf{W} = \mathbf{A}^{*} \mathbf{A} & \mathbf{W} = \mathbf{A} \, \mathbf{A}^{*} \\ % \text{2. Solve for eigenvalues} & \sigma = \tilde{\lambda} \left( \mathbf{W} \right) & \sigma = \tilde{\lambda} \left( \mathbf{W} \right) \\ % \color{blue}{\text{3. Solve for eigenvectors }} w_{k},\ k=1,\dots,\rho & \left( \mathbf{W} - \lambda_{k} \mathbf{I}_{n} \right) w_{k} = \mathbf{0} & \left( \mathbf{W} - \lambda_{k} \mathbf{I}_{m} \right) w_{k} = \mathbf{0} \\ % \text{4. Assemble domain matrix} & \mathbf{V}_{k} = \frac{w_{k}}{\lVert w_{k} \rVert_{2}} & \mathbf{U}_{k} = \frac{w_{k}}{\lVert w_{k} \rVert_{2}} \\ % \text{5. Compute complementary domain matrix} & \mathbf{U}_{k} = \sigma_{k}^{-1} \mathbf{A} \mathbf{V}_{k} & \mathbf{V}_{k} = \sigma_{k}^{-1} \mathbf{A}^{*} \mathbf{U}_{k} \\ % \end{array} $$

The two product matrices to work with are $$ \mathbf{A} \mathbf{A}^{*} = \left[ \begin{array}{c} 9 \\ \end{array} \right] \qquad \mathbf{A}^{*} \mathbf{A} = \left[ \begin{array}{rrr} 4 & 2 & -4 \\ 2 & 1 & -2 \\ -4 & -2 & 4 \\ \end{array} \right]. $$

Solution

The product matrix $\mathbf{A} \mathbf{A}^{*}$ is much easier to work with!

Singular values

The eigenvalue is $$ \lambda \left( \mathbf{A}\mathbf{A}^{*} \right) = 9 $$ which implies the singular value is $$ \sigma = \sqrt{\lambda \left( \mathbf{A}\mathbf{A}^{*} \right)} = 3 $$ The matrix of singular values is $$ \mathbf{S} = \left[ \begin{array}{c} 3 \end{array} \right] $$ Padding with zeros to match the $1\times 3$ shape of $\mathbf{A}$ gives $$ \Sigma = \left[ \begin{array}{ccc} 3 & 0 & 0 \end{array} \right] $$
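The column-space-first path is short enough to verify numerically (a sketch; the variable names are mine):

```python
import numpy as np

A = np.array([[2.0, 1.0, -2.0]])

W = A @ A.T                        # 1x1 product matrix A A^*
lam = np.linalg.eigvalsh(W)[-1]    # its eigenvalue, 9
sigma = np.sqrt(lam)               # singular value, 3
print(sigma)
```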

Matrix $\mathbf{U}$

The eigenvector is already normalized and represents the first column $$ \mathbf{U} = \color{blue}{\left[ \begin{array}{c} 1 \end{array} \right]} $$

Coloring distinguishes $\color{blue}{range}$ space vectors from $\color{red}{null}$ space vectors.

Column space $$ \begin{align} \color{blue}{\mathcal{R}\left(\mathbf{A}\right)} &= \text{span} \left\{ \, \color{blue}{\left[ \begin{array}{c} 1 \end{array} \right]} \, \right\} \\ % \color{red}{\mathcal{N}\left(\mathbf{A}^{*}\right)} &= \left\{ 0 \right\} \end{align} $$

Matrix $\mathbf{V}$

$$ \mathbf{V}_{1} = \frac{1}{\sigma} \mathbf{A}^{*} \mathbf{U}_{1} = \frac{1}{3} \color{blue}{\left[ \begin{array}{r} 2 \\ 1 \\ -2 \end{array} \right]} $$

For the $\color{red}{null}$ space vectors, we have options. We could apply the Gram-Schmidt process. Or we could eyeball the problem to find any $\color{red}{null}$ space vector, then use the cross product to find the missing vector.

Choosing the latter, we select $$ \color{red}{\mathbf{V}_{2}} = \frac{1}{\sqrt{2}} \color{red}{\left[ \begin{array}{r} 1 \\ 0 \\ 1 \end{array} \right]} $$

The missing vector is then $$ \color{blue}{\left[ \begin{array}{r} 2 \\ 1 \\ -2 \end{array} \right]} \times \color{red}{\left[ \begin{array}{r} 1 \\ 0 \\ 1 \end{array} \right]} = \color{red}{\left[ \begin{array}{r} 1 \\ -4 \\ -1 \end{array} \right]} $$ which has length $3\sqrt{2}$ and is normalized accordingly in the final assembly.

Final assembly, with the rows of $\mathbf{V}^{*}$ being the normalized vectors transposed: $$ \begin{align} \mathbf{A} &= \mathbf{U} \, \Sigma \, \mathbf{V}^{*} \\ &= \color{blue}{\left[ \begin{array}{c} 1 \end{array} \right]} % Sigma \left[ \begin{array}{ccc} 3 & 0 & 0 \end{array} \right] % V* \left[ \begin{array}{c} % r1 \frac{1}{3} \color{blue}{\left[ \begin{array}{rrr} 2 & 1 & -2 \end{array} \right]} \\ % r2 \frac{1}{\sqrt{2}} \color{red}{\left[ \begin{array}{rrr} 1 & 0 & 1 \end{array} \right]} \\ % r3 \frac{1}{3\sqrt{2}} \color{red}{\left[ \begin{array}{rrr} 1 & -4 & -1 \end{array} \right]} \end{array} \right] \\ &= \left[ \begin{array}{ccr} 2 & 1 & -2 \end{array} \right] \end{align} $$
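The assembled factors can be checked numerically; a minimal sketch using the vectors above (array names are mine):

```python
import numpy as np

U = np.array([[1.0]])
Sigma = np.array([[3.0, 0.0, 0.0]])          # padded to the 1x3 shape of A
V = np.column_stack([
    np.array([2.0, 1.0, -2.0]) / 3.0,                    # range-space column
    np.array([1.0, 0.0, 1.0]) / np.sqrt(2.0),            # null-space column
    np.array([1.0, -4.0, -1.0]) / (3.0 * np.sqrt(2.0)),  # null-space column
])

A = U @ Sigma @ V.T     # V^* = V.T since V is real
print(A)                # recovers [2, 1, -2]
print(V.T @ V)          # identity: V is orthogonal
```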

dantopa