
Suppose that we are dealing with positive semidefinite $n\times n$ symmetric real matrices.

The induced matrix norm of $A$ is defined as $$ |A|=\max\lambda(A)=\max_{x\in\mathbb{R}^n,\,|x|=1}x'Ax. $$ Here, $\lambda(A)$ denotes the set of eigenvalues of $A$. Is it true that this matrix norm is continuous: if $A_k\to A$ (component-wise), then $|A_k|\to |A|$?
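For concreteness, here is a quick numerical check of the identity above on a random positive semidefinite matrix (a minimal NumPy sketch, assuming NumPy is available; the random sampling of unit vectors only lower-bounds the Rayleigh maximum):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random symmetric positive semidefinite matrix A = B'B.
n = 5
B = rng.standard_normal((n, n))
A = B.T @ B

# Induced 2-norm, largest eigenvalue, and the Rayleigh quotient x'Ax
# maximized over many random unit vectors: all three should agree
# (the sampled maximum is only a lower bound, but a tight one here).
spectral_norm = np.linalg.norm(A, 2)
max_eigenvalue = np.linalg.eigh(A)[0][-1]   # eigh sorts eigenvalues ascending
X = rng.standard_normal((n, 100_000))
X /= np.linalg.norm(X, axis=0)              # normalize columns to unit vectors
rayleigh_max = np.max(np.einsum('ij,ij->j', X, A @ X))

print(spectral_norm, max_eigenvalue, rayleigh_max)
```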

I vaguely imagine that since eigenvalues are roots of the characteristic polynomial, whose coefficients are polynomials in the entries of the matrix, continuity is likely to hold. But I would love to see a rigorous proof (or a counterexample).

Thank you very much. I apologize in advance if this is a duplicate. (I did search.)

Kim Jong Un

3 Answers


Using (or proving) the continuity of eigenvalues is perhaps too big a cannon here. It is well known that matrix norms are continuous; indeed, every norm is 1-Lipschitz with respect to itself. This follows from the triangle inequality: $$ \|A\|\leq\|B\|+\|A-B\|, \quad \|B\|\leq\|A\|+\|A-B\|\quad\Rightarrow \quad \bigl|\|A\|-\|B\|\bigr|\leq\|A-B\|. $$ Therefore, if $A$ and $B$ are close in norm, then $\|A\|$ and $\|B\|$ are close as well.
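To see this Lipschitz property in action, here is a minimal NumPy sketch (not part of the proof) that checks the reverse triangle inequality for the spectral norm on random symmetric matrices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Check | ||A|| - ||B|| | <= ||A - B|| for the spectral norm
# on pairs of random symmetric matrices.
n = 6
for _ in range(1000):
    A = rng.standard_normal((n, n)); A = (A + A.T) / 2
    B = rng.standard_normal((n, n)); B = (B + B.T) / 2
    lhs = abs(np.linalg.norm(A, 2) - np.linalg.norm(B, 2))
    rhs = np.linalg.norm(A - B, 2)
    assert lhs <= rhs + 1e-12   # small tolerance for floating point
print("reverse triangle inequality held in all trials")
```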

Then use the fact that $\|A\|=\rho(A)$ when $\|\cdot\|:=\|\cdot\|_2$ is the spectral norm and $A$ is symmetric; in particular, for the symmetric positive semidefinite matrices in the question, $\rho(A)=\max\lambda(A)$ is exactly the quantity being asked about.

You can relate the componentwise closeness to the normwise closeness by realizing that $$ \|A\|_2=\max_{\|x\|_2=1}\|Ax\|_2\leq n\max_{1\leq i,j\leq n}|a_{ij}|=:n\|A\|_{\max} $$ (or by simply showing that $\|A\|_{\max}$ is a norm and using the fact that all norms on a finite-dimensional space are equivalent). Therefore, if $|a_{ij}-b_{ij}|<\epsilon$ for all $i,j=1,\ldots,n$ and some $\epsilon>0$, we have $\bigl|\|A\|_2-\|B\|_2\bigr|\leq\|A-B\|_2\leq n\|A-B\|_{\max}<\delta$ with $\delta:=n\epsilon$.
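A minimal NumPy sketch checking both bounds (the norm inequality and the resulting $\delta=n\epsilon$ closeness) on random symmetric matrices with a small symmetric perturbation:

```python
import numpy as np

rng = np.random.default_rng(2)

# Check ||A||_2 <= n * max_ij |a_ij|, and that a componentwise
# perturbation below eps moves the spectral norm by less than n * eps.
n = 5
eps = 1e-3
for _ in range(1000):
    A = rng.standard_normal((n, n)); A = (A + A.T) / 2
    E = eps * rng.uniform(-1, 1, (n, n)); E = (E + E.T) / 2   # |e_ij| < eps
    B = A + E
    assert np.linalg.norm(A, 2) <= n * np.abs(A).max() + 1e-12
    assert abs(np.linalg.norm(A, 2) - np.linalg.norm(B, 2)) < n * eps
print("both bounds held in all trials")
```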


Yes, it is continuous.

Any symmetric matrix can be diagonalized by an orthogonal matrix, which means there is an orthonormal basis of eigenvectors. The matrix then acts by scaling each eigenvector by its corresponding eigenvalue (any other vector can be decomposed into components along the eigenvectors; each component then scales the same way as the corresponding eigenvector).

Explicitly, the matrix can be written as $A=\sum_{i=1}^n\lambda_i v_i v_i'$, a sum of rank-1 matrices: each term is the outer product of an eigenvector $v_i$ with itself, multiplied by the corresponding eigenvalue $\lambda_i$. Changing the matrix by a small amount, whilst keeping it symmetric, corresponds to changing the directions of some of the eigenvectors (and hence the outer products), and/or the eigenvalues, by small amounts.
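Here is a minimal NumPy sketch of this rank-1 reconstruction, rebuilding a random symmetric positive semidefinite matrix from its eigenpairs:

```python
import numpy as np

rng = np.random.default_rng(3)

# A = sum_i lambda_i * v_i v_i': a sum of rank-1 outer products of
# the orthonormal eigenvectors, weighted by the eigenvalues.
n = 4
B = rng.standard_normal((n, n))
A = B.T @ B                                  # symmetric PSD

eigenvalues, V = np.linalg.eigh(A)           # columns of V: orthonormal eigenvectors
A_rebuilt = sum(lam * np.outer(v, v) for lam, v in zip(eigenvalues, V.T))

print(np.allclose(A, A_rebuilt))             # True
```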

This is not very rigorous, but hopefully it makes it a bit more intuitive why the induced norm is continuous on symmetric matrices.


Hint: You might use the fact that $A$ belongs to an $\frac{n(n+1)}{2}$-dimensional Riemannian manifold. Then you might find some appropriate metric there ;)

nullgeppetto