
I want to find the singular value decomposition of $A^TA$ and $(A^TA)^{-1}$.

The singular value decomposition of $A$ is $$A=U \Sigma V^T$$

Basically, I want to find the singular values of $A^TA$ and $(A^TA)^{-1}$.

Substituting the SVD of $A$ into $A^TA$ gives $$A^TA=V\Sigma^2 V^T.$$

Does this mean that the singular values of $A^TA$ are equal to the squares of the singular values of $A$?

How do I find $(A^TA)^{-1}$?
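To convince myself, here is a quick numerical check. This is only a minimal sketch with numpy on an arbitrary random matrix of my own choosing, not anything taken from a specific problem:

```python
import numpy as np

# Hypothetical example: a random tall matrix (an assumption for illustration).
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))

# Singular values of A and of A^T A.
s_A = np.linalg.svd(A, compute_uv=False)
s_AtA = np.linalg.svd(A.T @ A, compute_uv=False)

# Check: the singular values of A^T A match the squares of those of A.
print(np.allclose(s_AtA, s_A**2))  # expected: True
```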

Oily
  • You can just substitute the SVD of $A$ into $AA^T$ like so: $AA^T = U \Sigma V^T (U \Sigma V^T)^T$, then simplify this expression, and at the end you will end up with an SVD for $AA^T$. – Nick Alger Aug 05 '16 at 06:30
  • Generally speaking it is recommended to use existing tags rather than making up new ones. This is why I edited your post to use the tags (svd) instead of (sv-decomposition), and (matrices) instead of (matrix-calculus). By using the existing tags, your question can be more easily found by others who are interested in a given topic, and through searches. – Nick Alger Aug 05 '16 at 06:34
  • Besides the question, what are your motivations and ideas about the subject? – Jean Marie Aug 05 '16 at 06:36
  • @JeanMarie I have edited the question again. Please check it – Oily Aug 05 '16 at 07:49
  • Singular values of $A^TA$ are indeed the squares of the singular values of $A$ as you have shown, if $A$ is square. If $A$ is rectangular, the first many singular values are the same, but there may be some extra singular values of zero to make the shapes of the matrices consistent. For the inverse, you can just substitute in the SVD you already have: $(V \Sigma^2 V^T)^{-1}$, then simplify. – Nick Alger Aug 05 '16 at 16:29

1 Answer


Finding the singular value decomposition

Start with a matrix with $m$ rows, $n$ columns, and rank $\rho$: $$ \mathbf{A} \in \mathbb{C}^{m \times n}_{\rho}. $$ Since the question concerns the normal equations, let's fix $\rho = n$ with $m\ge n$: the matrix $\mathbf{A}$ is tall and has full column rank.

The singular value decomposition is $$ \begin{align} \mathbf{A} &= \mathbf{U}\, \mathbf{\Sigma} \, \mathbf{V}^{*} \\ &= \left[ \begin{array}{cc} \color{blue}{\mathbf{U}_{\mathcal{R}\left( \mathbf{A} \right)}} & \color{red}{\mathbf{U}_{\mathcal{N}\left( \mathbf{A}^{*} \right)}} \end{array} \right] \left[ \begin{array}{c} \mathbf{S} \\ \mathbf{0} \end{array} \right] \color{blue}{\mathbf{V}_{\mathcal{R}\left( \mathbf{A}^{*} \right)}}^{*} \end{align} $$ The coloring distinguishes $\color{blue}{range}$ spaces from $\color{red}{null}$ spaces. The diagonal matrix of singular values, $\mathbf{S}\in\mathbb{R}^{\rho\times\rho}$, is $$ \mathbf{S}_{k,k} = \sigma_{k}, \quad k=1,\dots,\rho. $$
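As a sanity check, here is a minimal numpy sketch of this block structure. The matrix below is an arbitrary random example (my own assumption, not taken from the question):

```python
import numpy as np

# Hypothetical tall complex matrix with full column rank (rho = n).
rng = np.random.default_rng(0)
m, n = 5, 3
A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))

# Full SVD: U is m x m, s holds the rho = n singular values, Vh = V*.
U, s, Vh = np.linalg.svd(A, full_matrices=True)

U_range = U[:, :n]   # basis for the range of A       (blue block)
U_null  = U[:, n:]   # basis for the null space of A* (red block)
S = np.diag(s)       # the rho x rho diagonal block S

# Sigma = [S; 0] stacked vertically reproduces A.
Sigma = np.vstack([S, np.zeros((m - n, n))])
print(np.allclose(U @ Sigma @ Vh, A))  # expected: True
```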

Manipulating the singular value decomposition

The Moore-Penrose pseudoinverse is $$ \begin{align} \mathbf{A}^{\dagger} &= \mathbf{V}\, \mathbf{\Sigma}^{\dagger} \mathbf{U}^{*} \\ &= \color{blue}{\mathbf{V}_{\mathcal{R}\left( \mathbf{A}^{*} \right)}} \left[ \begin{array}{cc} \mathbf{S}^{-1} & \mathbf{0} \end{array} \right] \left[ \begin{array}{l} \color{blue}{\mathbf{U}_{\mathcal{R} \left( \mathbf{A} \right)}}^{*} \\ \color{red}{\mathbf{U}_{\mathcal{N} \left( \mathbf{A}^{*} \right)}}^{*} \end{array} \right]. \end{align} $$
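The same formula can be exercised numerically. The sketch below assumes an arbitrary random full-column-rank matrix and uses numpy.linalg.pinv only as a reference for comparison:

```python
import numpy as np

# Hypothetical tall full-column-rank matrix (an assumption for illustration).
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))

# Thin SVD: only the range blocks of U and V are returned.
U_r, s, Vh = np.linalg.svd(A, full_matrices=False)

# A^dagger = V S^{-1} U_r*  -- the null-space block of U drops out.
A_pinv = Vh.conj().T @ np.diag(1.0 / s) @ U_r.conj().T

print(np.allclose(A_pinv, np.linalg.pinv(A)))  # expected: True
```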

The Hermitian conjugate is $$ \begin{align} \mathbf{A}^{*} &= \mathbf{V}\, \mathbf{\Sigma}^{\mathrm{T}} \mathbf{U}^{*} \\ &= \color{blue}{\mathbf{V}_{\mathcal{R}\left( \mathbf{A}^{*} \right)}} \left[ \begin{array}{cc} \mathbf{S} & \mathbf{0} \end{array} \right] \left[ \begin{array}{l} \color{blue}{\mathbf{U}_{\mathcal{R} \left( \mathbf{A} \right)}}^{*} \\ \color{red}{\mathbf{U}_{\mathcal{N} \left( \mathbf{A}^{*} \right)}}^{*} \end{array} \right]. \end{align} $$

Resolving the product matrix

The product matrix has a simple expression: $$ \begin{align} \mathbf{A}^{*} \mathbf{A} &= \left( \mathbf{V} \, \mathbf{\Sigma}^{\mathrm{T}} \mathbf{U}^{*} \right) \left( \mathbf{U} \, \mathbf{\Sigma} \mathbf{V}^{*} \right) \\ &= \color{blue}{\mathbf{V}_{\mathcal{R}\left( \mathbf{A}^{*} \right)}} \, \mathbf{S}^{2} \, \color{blue}{\mathbf{V}_{\mathcal{R}\left( \mathbf{A}^{*} \right)}}^{*}. \end{align} $$
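A quick numerical confirmation of this identity, again on an assumed random matrix:

```python
import numpy as np

# Hypothetical full-column-rank matrix, illustrating A* A = V S^2 V*.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))

U_r, s, Vh = np.linalg.svd(A, full_matrices=False)
V = Vh.conj().T

# The U blocks cancel; only V and S^2 survive in the product.
print(np.allclose(A.T @ A, V @ np.diag(s**2) @ Vh))  # expected: True
```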

Because $\mathbf{A}$ has full column rank, $\mathbf{A}^{*} \mathbf{A}$ is invertible (its inverse coincides with its pseudoinverse), and the inverse follows by inverting each factor: $$ \begin{align} \left( \mathbf{A}^{*} \mathbf{A} \right)^{-1} &= \left( \color{blue}{\mathbf{V}_{\mathcal{R}\left( \mathbf{A}^{*} \right)}} \, \mathbf{S}^{2} \, \color{blue}{\mathbf{V}_{\mathcal{R}\left( \mathbf{A}^{*} \right)}}^{*} \right)^{-1} \\[3pt] &= \left( \color{blue}{\mathbf{V}_{\mathcal{R}\left( \mathbf{A}^{*} \right)}}^{*} \right)^{-1} \left( \mathbf{S}^{2} \right)^{-1} \left( \color{blue}{\mathbf{V}_{\mathcal{R}\left( \mathbf{A}^{*} \right)}} \right)^{-1} \\[3pt] &= \color{blue}{\mathbf{V}_{\mathcal{R}\left( \mathbf{A}^{*} \right)}} \, \mathbf{S}^{-2} \, \color{blue}{\mathbf{V}_{\mathcal{R}\left( \mathbf{A}^{*} \right)}}^{*}. \end{align} $$
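The final identity can be verified the same way; numpy.linalg.inv serves only as the reference here, and the random matrix is again just an assumed example:

```python
import numpy as np

# Hypothetical full-column-rank matrix, so A* A is invertible.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))

U_r, s, Vh = np.linalg.svd(A, full_matrices=False)
V = Vh.conj().T

# (A* A)^{-1} = V S^{-2} V*  when A has full column rank.
inv_via_svd = V @ np.diag(s**-2.0) @ Vh
print(np.allclose(inv_via_svd, np.linalg.inv(A.T @ A)))  # expected: True
```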

dantopa