
Let $A\in \mathbb{R}^{n\times n}$ have full SVD $A = U\Sigma V^T$, where $U$ and $V$ are orthogonal $n\times n$ matrices and $\Sigma$ is an $n\times n$ diagonal matrix with entries $\sigma_1 \geq \cdots \geq \sigma_n \geq 0$.

1. What is the SVD of $A^{-1}$?
2. Given that $\|A\| = \sigma_1$, how would we express $\|A^{-1}\|$ in terms of the singular values of $A$?
3. What is the condition number of $A$?

Joe

2 Answers


Hint:

If $A = U\Sigma V^T$, then $$ A^{-1} = (U\Sigma V^T)^{-1} = (V^T)^{-1} \Sigma^{-1} U^{-1} = V \Sigma^{-1} U^T, $$ since $U$ and $V$ are orthogonal. Keep in mind that if $A$ is invertible, then $\sigma_i > 0$ for each $i$; note also that the diagonal entries of $\Sigma^{-1}$ appear in increasing order, so they must be reversed to put the SVD of $A^{-1}$ in standard form. For 3, you should find that the condition number of $A^{-1}$ is identical to that of $A$ under the norm $\|A\| = \sigma_1(A)$.
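The hint can be checked numerically. A minimal NumPy sketch (the example matrix is my own construction, not from the question): it verifies that $V\Sigma^{-1}U^T$ reproduces $A^{-1}$ and that the singular values of $A^{-1}$ are the reciprocals $1/\sigma_i$, re-sorted into descending order.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical invertible matrix, made well-conditioned by construction.
n = 4
A = rng.standard_normal((n, n)) + n * np.eye(n)

U, s, Vt = np.linalg.svd(A)  # A = U @ diag(s) @ Vt, s in descending order

# A^{-1} = (V^T)^{-1} Sigma^{-1} U^{-1} = V Sigma^{-1} U^T  (U, V orthogonal)
A_inv = Vt.T @ np.diag(1.0 / s) @ U.T
assert np.allclose(A_inv, np.linalg.inv(A))

# Singular values of A^{-1} are the reciprocals 1/s, re-sorted descending.
s_inv = np.linalg.svd(A_inv, compute_uv=False)
assert np.allclose(s_inv, np.sort(1.0 / s)[::-1])
```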

Ben Grossmann
    Did my post help you answer your question? If so, please accept my answer by clicking the checkmark ($\checkmark$) next to it. If not, feel free to comment if you have further questions. Do you understand the hint? Do you understand why I think it's useful? Have you tried anything to get the answer so far? What is it about this question that has you stumped? – Ben Grossmann Aug 01 '14 at 22:31

This answer treats the question as a special case of a general problem.

$(1)$ Singular Value Decomposition

Every matrix $$ \mathbf{A} \in \mathbb{C}^{m\times n}_{\rho} $$ (an $m\times n$ complex matrix of rank $\rho$) has a singular value decomposition

$$ \begin{align} \mathbf{A} &= \mathbf{U} \, \Sigma \, \mathbf{V}^{*} \\ &= \left[ \begin{array}{cc} \color{blue}{\mathbf{U}_{\mathcal{R}}} & \color{red}{\mathbf{U}_{\mathcal{N}}} \end{array} \right] \left[ \begin{array}{cc} \mathbf{S}_{\rho\times \rho} & \mathbf{0} \\ \mathbf{0} & \mathbf{0} \end{array} \right] \left[ \begin{array}{c} \color{blue}{\mathbf{V}_{\mathcal{R}}}^{*} \\ \color{red}{\mathbf{V}_{\mathcal{N}}}^{*} \end{array} \right] \\ &= \left[ \begin{array}{cccccc} \color{blue}{u_{1}} & \dots & \color{blue}{u_{\rho}} & \color{red}{u_{\rho+1}} & \dots & \color{red}{u_{m}} \end{array} \right] \left[ \begin{array}{cc} \mathbf{S} & \mathbf{0} \\ \mathbf{0} & \mathbf{0} \end{array} \right] \left[ \begin{array}{c} \color{blue}{v_{1}^{*}} \\ \vdots \\ \color{blue}{v_{\rho}^{*}} \\ \color{red}{v_{\rho+1}^{*}} \\ \vdots \\ \color{red}{v_{n}^{*}} \end{array} \right] \end{align} $$ where $\mathbf{S} = \operatorname{diag}(\sigma_{1}, \dots, \sigma_{\rho})$.

The $\rho$ singular values are ordered and satisfy $$ \sigma_{1} \ge \sigma_{2} \ge \dots \ge \sigma_{\rho} > 0 $$

The column vectors are orthonormal basis vectors: $$ \begin{align} % R A \color{blue}{\mathcal{R} \left( \mathbf{A} \right)} &= \text{span} \left\{ \color{blue}{u_{1}}, \dots , \color{blue}{u_{\rho}} \right\} \\ % R A* \color{blue}{\mathcal{R} \left( \mathbf{A}^{*} \right)} &= \text{span} \left\{ \color{blue}{v_{1}}, \dots , \color{blue}{v_{\rho}} \right\} \\ % N A* \color{red}{\mathcal{N} \left( \mathbf{A}^{*} \right)} &= \text{span} \left\{ \color{red}{u_{\rho+1}}, \dots , \color{red}{u_{m}} \right\} \\ % N A \color{red}{\mathcal{N} \left( \mathbf{A} \right)} &= \text{span} \left\{ \color{red}{v_{\rho+1}}, \dots , \color{red}{v_{n}} \right\} \\ % \end{align} $$
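These four subspace bases can be read off a computed SVD. A small NumPy sketch under an assumed rank-1 example (my own, not from the answer): the first $\rho$ columns of $\mathbf{U}$ span the range, and the last $n-\rho$ columns of $\mathbf{V}$ span the null space.

```python
import numpy as np

# Hypothetical rank-1 example: A = outer(x, y) is 3x2 with rho = 1.
x = np.array([1.0, 2.0, 3.0])
y = np.array([1.0, -1.0])
A = np.outer(x, y)

U, s, Vt = np.linalg.svd(A)
rho = int(np.sum(s > 1e-12))   # numerical rank
assert rho == 1

range_A = U[:, :rho]           # orthonormal basis for R(A)
null_A = Vt[rho:, :].T         # orthonormal basis for N(A)

# A annihilates its null space, and range_A spans the same line as x.
assert np.allclose(A @ null_A, 0)
assert np.isclose(abs(range_A.ravel() @ x) / np.linalg.norm(x), 1.0)
```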

Moore-Penrose Pseudoinverse Matrix

$$ \begin{align} \mathbf{A}^{+} &= \mathbf{V} \, \Sigma^{+} \mathbf{U}^{*} \\ % &= % V \left[ \begin{array}{cc} \color{blue}{\mathbf{V}_{\mathcal{R}}} & \color{red}{\mathbf{V}_{\mathcal{N}}} \end{array} \right] % Sigma \left[ \begin{array}{cc} \mathbf{S}^{-1} & \mathbf{0} \\ \mathbf{0} & \mathbf{0} \end{array} \right] % % U \left[ \begin{array}{c} \color{blue}{\mathbf{U}_{\mathcal{R}}}^{*} \\ \color{red}{\mathbf{U}_{\mathcal{N}}}^{*} \end{array} \right] \\ % \end{align} % $$

Special Case: $m=n=\rho$

Your question is about the special case of a square matrix with full rank. In this instance the pseudoinverse coincides with the ordinary inverse: $$ \mathbf{A}^{+} = \mathbf{A}^{-1} $$

Both null spaces are trivial: $$ \color{red}{\mathcal{N} \left( \mathbf{A} \right)} = \color{red}{\mathcal{N} \left( \mathbf{A}^{*} \right)} = \left\{ \mathbf{0} \right\} $$

The SVD is $$ \mathbf{A} = % \color{blue}{\mathbf{U}_{\mathcal{R}}}\, % Sigma \mathbf{S} \, % V \color{blue}{\mathbf{V}_{\mathcal{R}}}^{*} % $$ and the inverse is $$ \mathbf{A}^{+} = % \color{blue}{\mathbf{V}_{\mathcal{R}}}\, % Sigma \mathbf{S}^{-1} \, % V \color{blue}{\mathbf{U}_{\mathcal{R}}}^{*} % = \left( \color{blue}{\mathbf{U}_{\mathcal{R}}}\, % Sigma \mathbf{S} \, % V \color{blue}{\mathbf{V}_{\mathcal{R}}}^{*} \right)^{-1} % = \mathbf{A}^{-1} % $$
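As a quick numerical sanity check of $\mathbf{A}^{+} = \mathbf{A}^{-1}$ in the full-rank square case (example matrix is my own construction):

```python
import numpy as np

rng = np.random.default_rng(1)

# Full-rank square matrix: the pseudoinverse equals the ordinary inverse.
A = rng.standard_normal((4, 4)) + 4 * np.eye(4)  # invertible by construction
assert np.allclose(np.linalg.pinv(A), np.linalg.inv(A))
```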


More about classification of matrix inverses: What forms does the Moore-Penrose inverse take under systems with full rank, full column rank, and full row rank?


$(2)$ Matrix Norm

$$ \lVert \mathbf{A} \rVert_{2} = \sigma_{1}, \qquad \Rightarrow \qquad \lVert \mathbf{A}^{-1} \rVert_{2} = \frac{1}{\sigma_{\rho}} $$

$(3)$ Condition Number

$$ \kappa_{p} = \lVert \mathbf{A}^{-1} \rVert_{p} \lVert \mathbf{A} \rVert_{p} \qquad \Rightarrow \qquad \kappa_{2} = \lVert \mathbf{A}^{-1} \rVert_{2} \lVert \mathbf{A} \rVert_{2} = \frac{\sigma_{1}}{\sigma_{\rho}} $$
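The norm and condition-number identities above can be verified with NumPy (example matrix is my own construction): $\lVert\mathbf{A}\rVert_2 \lVert\mathbf{A}^{-1}\rVert_2$ should match $\sigma_1/\sigma_\rho$ and the library's built-in $\kappa_2$.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical invertible 5x5 matrix.
A = rng.standard_normal((5, 5)) + 5 * np.eye(5)
s = np.linalg.svd(A, compute_uv=False)  # descending singular values

# ||A||_2 = sigma_1 and ||A^{-1}||_2 = 1/sigma_rho, so kappa_2 = sigma_1/sigma_rho.
kappa2 = np.linalg.norm(A, 2) * np.linalg.norm(np.linalg.inv(A), 2)
assert np.isclose(kappa2, s[0] / s[-1])
assert np.isclose(np.linalg.cond(A, 2), s[0] / s[-1])
```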

Sycorax
dantopa