
I am trying to understand notation that I cannot find an explanation for in the book... It says: Since $$ (f_1(\mathbf{x}), \ldots, f_n(\mathbf{x})) = A (\mathbf{x} - \mathbf{x}_0) + O(|\mathbf{x} - \mathbf{x}_0 |^2) $$ we have $$ |\mathbf{x} - \mathbf{x}_0| \leq \| A^{-1} \| (|(f_1(\mathbf{x}), \ldots, f_n(\mathbf{x}))| + C|\mathbf{x} - \mathbf{x}_0 |^2 ). $$ Here $A$ is an invertible real matrix, $|\cdot|$ denotes the $L^2$ norm on $\mathbb{R}^n$, and $\mathbf{x}_0$ is a fixed vector. What exactly does $\| A^{-1}\|$ mean in this context? Thank you for the clarification.

Johnny T.

2 Answers


Most likely it is the induced norm (also called the operator norm or spectral norm) arising from the $L^2$ norm on vectors,

$$\lVert B \rVert = \sup_{\lvert \mathbf x \rvert = 1} \lvert B \mathbf x \rvert .$$

It can be found explicitly as the square root of the largest eigenvalue of $B^T B$ when $B$ is a real matrix.
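As a quick sanity check (a toy example of my own, not from the question): take $$B = \begin{pmatrix} 3 & 0 \\ 0 & 1 \end{pmatrix}, \qquad B^T B = \begin{pmatrix} 9 & 0 \\ 0 & 1 \end{pmatrix},$$ so the largest eigenvalue of $B^T B$ is $9$ and $\lVert B \rVert = \sqrt 9 = 3$; indeed, among unit vectors $\lvert B \mathbf x \rvert$ is maximized at $\mathbf x = (1,0)^T$, where it equals $3$.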

Paul

I am sure that Hörmander has defined which matrix norm $\lVert B \rVert$ he uses in his book.

But certainly any matrix norm satisfying $$\lvert B \cdot \mathbf x \rvert \le \lVert B \rVert \cdot \lvert \mathbf x \rvert \tag{1}$$ gives the desired result. Condition $(1)$ says that the matrix norm $\lVert - \rVert$ is consistent with the vector norm $\lvert - \rvert$.
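To spell out how $(1)$ gives the inequality in the question (writing $\mathbf f(\mathbf x)$ as shorthand for the vector $(f_1(\mathbf x), \ldots, f_n(\mathbf x))$): the first equation gives $\lvert A(\mathbf x - \mathbf x_0) \rvert \le \lvert \mathbf f(\mathbf x) \rvert + C \lvert \mathbf x - \mathbf x_0 \rvert^2$ by the triangle inequality, where $C$ bounds the $O$-term, and applying $(1)$ with $B = A^{-1}$ then yields $$\lvert \mathbf x - \mathbf x_0 \rvert = \lvert A^{-1} \cdot A(\mathbf x - \mathbf x_0) \rvert \le \lVert A^{-1} \rVert \cdot \lvert A(\mathbf x - \mathbf x_0) \rvert \le \lVert A^{-1} \rVert \left( \lvert \mathbf f(\mathbf x) \rvert + C \lvert \mathbf x - \mathbf x_0 \rvert^2 \right).$$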

Given any vector norm, the operator norm $$\lVert B \rVert = \sup \left\{ \frac{\lvert B \cdot \mathbf x \rvert}{\lvert \mathbf x \rvert} \;\middle|\; \mathbf x \ne 0 \right\}$$ always has this property.
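To verify this: for $\mathbf x \ne 0$ the quotient is bounded by its supremum, so $$\lvert B \cdot \mathbf x \rvert \le \lVert B \rVert \cdot \lvert \mathbf x \rvert ,$$ and for $\mathbf x = 0$ both sides of $(1)$ vanish.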

However, there are many other examples of consistent matrix and vector norms. I guess that Hörmander uses the Euclidean norm (aka Frobenius norm) $$\lVert B \rVert = \sqrt{\sum_{i,j} b_{ij}^2} .$$

This norm is submultiplicative, which means that $$\lVert B \cdot C \rVert \le \lVert B \rVert \cdot \lVert C \rVert. $$
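This follows from the Cauchy–Schwarz inequality, applied to each entry of the product: $$\lVert B \cdot C \rVert^2 = \sum_{i,k} \Big( \sum_j b_{ij} c_{jk} \Big)^2 \le \sum_{i,k} \Big( \sum_j b_{ij}^2 \Big) \Big( \sum_j c_{jk}^2 \Big) = \lVert B \rVert^2 \cdot \lVert C \rVert^2 .$$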

Taking any matrix norm, we can define an associated vector norm by $$\lvert \mathbf x \rvert_{ass} = \left \lVert M(\mathbf x,n) \right \rVert .$$ Here we regard $\mathbf x$ as a column vector and form the matrix $M(\mathbf x,n)$ having $\mathbf x$ in all $n$ columns.

If the matrix norm is submultiplicative, then it is consistent with its associated vector norm, as the following computation shows.
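Indeed, $M(B \cdot \mathbf x, n) = B \cdot M(\mathbf x, n)$, since $B$ acts on each of the $n$ identical columns, and therefore $$\lvert B \cdot \mathbf x \rvert_{ass} = \lVert B \cdot M(\mathbf x, n) \rVert \le \lVert B \rVert \cdot \lVert M(\mathbf x, n) \rVert = \lVert B \rVert \cdot \lvert \mathbf x \rvert_{ass} .$$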

Working with the Euclidean matrix norm gives $$\lvert \mathbf x \rvert_{ass} = \sqrt n \, \lvert \mathbf x \rvert .$$ Therefore $$\lvert B \cdot \mathbf x \rvert = \frac{1}{\sqrt n} \lvert B \cdot \mathbf x \rvert_{ass} \le \frac{1}{\sqrt n} \lVert B \rVert \cdot \lvert \mathbf x \rvert_{ass} = \lVert B \rVert \cdot \lvert \mathbf x \rvert .$$
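Here the identity $\lvert \mathbf x \rvert_{ass} = \sqrt n \, \lvert \mathbf x \rvert$ is immediate from the definition: $M(\mathbf x, n)$ consists of $n$ copies of the column $\mathbf x$, so $$\lVert M(\mathbf x, n) \rVert^2 = \sum_{j=1}^{n} \sum_{i=1}^{n} x_i^2 = n \, \lvert \mathbf x \rvert^2 .$$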

Paul Frost