
Let $A$ be a $2\times 2$, symmetric positive definite matrix such that its smallest eigenvalue $\lambda$ is bounded below by $\lambda_0>0$.

Why does the bound on $\lambda$ imply that, for any $v\in \mathbb R^2$, $$\lVert A^{-1}v\rVert\leq \frac{1}{\lambda_0}\lVert v\rVert ?$$

This is clear to me when $v$ is an eigenvector: in that case $A^{-1}v=v/\lambda$, and the result follows by taking the Euclidean norm of both sides.

3 Answers


If $\lambda_i$, $i\in\{1,2\}$, are the eigenvalues of $A$, then the matrix $(A^{-1})^2$ has $1/\lambda_i^2$, $i\in\{1,2\}$, as eigenvalues. Also consider the diagonalization $(A^{-1})^2=QLQ'$, where $Q$ is orthogonal with columns $q_i$ and $L$ is diagonal.

Then

\begin{align} \lVert A^{-1}v\rVert^2&=v'(A^{-1})^2v=(Q'v)'L(Q'v)\\ &=\sum_i \frac{1}{\lambda_i^2}(q_i' v)^2\\ &\leq \frac{1}{\lambda_0^2}\sum_i (q_i' v)^2=\frac{1}{\lambda_0^2}\lVert v\rVert^2 \end{align}

Taking the square root of both sides finishes the proof. (Thanks for all the answers.)
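As a quick numerical sanity check (not part of the proof), the bound can be tested with numpy; the particular construction of $A$ below, which forces $\lambda_0 = 1$, is just an illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random 2x2 symmetric positive definite matrix A = B'B + I,
# so every eigenvalue of A is at least 1; take lambda_0 = 1.
B = rng.standard_normal((2, 2))
A = B.T @ B + np.eye(2)
lambda_0 = 1.0

# Check ||A^{-1} v|| <= ||v|| / lambda_0 for many random vectors v.
for _ in range(1000):
    v = rng.standard_normal(2)
    lhs = np.linalg.norm(np.linalg.solve(A, v))  # = ||A^{-1} v||
    assert lhs <= np.linalg.norm(v) / lambda_0 + 1e-12
```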


We will prove this for any $n\times n$ symmetric positive definite matrix $A$; such a matrix is automatically invertible. Since $A^{-1}$ is also symmetric positive definite, with eigenvalues that are the reciprocals of those of $A$, your statement is equivalent to proving: $$||A\textbf{x}||\leq \lambda_{\text{max}}||\textbf{x}||,$$

where $\lambda_{\text{max}}$ is the largest eigenvalue of $A$.

We will use the fact that, for any symmetric matrix, there exists a basis of $\mathbb{R}^{n}$ consisting of mutually perpendicular unit eigenvectors (an orthonormal eigenbasis); this is the spectral theorem for real symmetric matrices. For $A$, let such a basis be $\{\textbf{v}_{1}, \textbf{v}_{2},\dots,\textbf{v}_{n}\}$, corresponding to (possibly repeated) eigenvalues $\lambda_{1},\lambda_{2},\dots,\lambda_{n}$.

Consider some vector $\textbf{x} = c_{1}\textbf{v}_{1} + c_{2}\textbf{v}_{2} + \dots + c_{n}\textbf{v}_{n}$. Then:

$$\begin{align} A\textbf{x}& = A(c_{1}\textbf{v}_{1} + c_{2}\textbf{v}_{2} + \dots + c_{n}\textbf{v}_{n}) \\ &= c_{1}A\textbf{v}_{1} + c_{2}A\textbf{v}_{2} +\dots+c_{n}A\textbf{v}_{n} \\ &= c_{1}\lambda_{1}\textbf{v}_{1} + c_{2}\lambda_{2}\textbf{v}_{2} + \dots + c_{n}\lambda_{n}\textbf{v}_{n}\end{align}$$

Since the basis is orthonormal:

$$\begin{align} ||A\textbf{x}||&= \sqrt{(c_{1}\lambda_{1})^{2} + (c_{2}\lambda_{2})^{2} + \dots + (c_{n}\lambda_{n})^{2}} \\ &\leq \sqrt{\lambda_{\text{max}}^{2}(c_{1}^{2} + c_{2}^{2} + \dots + c_{n}^{2})}\\ &= \lambda_{\text{max}} \sqrt{c_{1}^{2}+c_{2}^{2}+\dots+c_{n}^{2}}\end{align}$$

Then, since $\textbf{x} = c_{1}\textbf{v}_{1} + c_{2}\textbf{v}_{2} + \dots + c_{n}\textbf{v}_{n}$, we have $||\textbf{x}|| = \sqrt{c_{1}^{2} + c_{2}^{2} + \dots + c_{n}^{2}}$, so we can conclude that $||A\textbf{x}||\leq \lambda_{\text{max}}||\textbf{x}||.\ \blacksquare$

Note that this inequality does not always hold for non-symmetric matrices, even if they do have an eigenbasis. For instance, consider:

$$C = \begin{bmatrix} 0 & \frac{1}{r} \\ r & 0\end{bmatrix}$$

$C$ has eigenvalues of $\pm 1$, so an eigenbasis exists. However, $\left|\left|C\begin{bmatrix} 1 \\ 0\end{bmatrix}\right|\right| = r$, so for $r>1$ the inequality is violated.
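The counterexample is easy to confirm numerically; the value $r = 3$ below is an arbitrary sample choice:

```python
import numpy as np

# The non-symmetric matrix C from above, for a sample r > 1.
r = 3.0
C = np.array([[0.0, 1.0 / r],
              [r,   0.0]])

# Its eigenvalues are +1 and -1, so max |eigenvalue| = 1 ...
eigs = np.linalg.eigvals(C)

# ... yet ||C e1|| = r, which exceeds 1 whenever r > 1.
norm_Ce1 = np.linalg.norm(C @ np.array([1.0, 0.0]))
```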

Joshua Wang

Because $A$ is symmetric positive definite (SPD), its singular values equal its eigenvalues. The singular values of $A^{-1}$ are the reciprocals of the singular values of $A$ (show this). The operator norm of a matrix equals its largest singular value. Let $\lambda_0$ be the smallest eigenvalue of $A$; then $1/\lambda_0$ is the largest eigenvalue, hence the largest singular value, of $A^{-1}$. Therefore, $$||A^{-1}v|| \leq ||A^{-1}||\,||v|| = \frac{1}{\lambda_0}||v||.$$
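The two facts used here (singular values of $A^{-1}$ are reciprocals of those of $A$, and the operator norm is the largest singular value) can be checked numerically; the random SPD construction below is just an illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random SPD matrix; for SPD matrices, singular values = eigenvalues.
B = rng.standard_normal((3, 3))
A = B.T @ B + np.eye(3)

sv_A = np.linalg.svd(A, compute_uv=False)              # descending order
sv_Ainv = np.linalg.svd(np.linalg.inv(A), compute_uv=False)

lambda_min = np.linalg.eigvalsh(A)[0]                  # smallest eigenvalue

# Largest singular value of A^{-1} (its operator norm) is 1/lambda_min.
assert np.isclose(sv_Ainv[0], 1.0 / lambda_min)

# Singular values of A^{-1} are reciprocals of those of A (order reversed).
assert np.allclose(sv_Ainv, 1.0 / sv_A[::-1])
```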

Doug