
Consider a matrix-valued function $A(t):\mathbb{R}_+ \to \mathbb{R}^{n \times n}$. Suppose that each element $a_{ij}(t)$ is a smooth function with bounded derivative, e.g. $a_{ij}(t)=A_{ij}\sin(\omega_{ij} t)$. Define $f(t)$ as the minimum absolute value of the eigenvalues of $A(t)$: $$f(t) = \min_i |\lambda_i\{A(t)\}|.$$

Is it true that $f(t)$ is continuously differentiable with bounded derivative? If so, how can this be proven, or in which book/paper can it be found?

Update

OK, $f(t)$ is continuous, but probably not differentiable. I have revised my main problem and see that I can relax the question: now I only need $f(t)$ to be Lipschitz.

Actually, what I really need is to show that if for some $t_0 \in [t_a,t_b]$ all eigenvalues of $A(t_0)$ are nonzero, then $$\int_{t_a}^{t_b}\left(\det\{A(s)\}\right)^2ds>0.$$ My intention was to use $\left(\det\{A(t)\}\right)^2=\prod_{i=1}^n|\lambda_i\{A(t)\}|^2\ge \left(\min_i |\lambda_i\{A(t)\}|\right)^{2n} = f(t)^{2n}$.

Note also that all $a_{ij}(t)$ are bounded.
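The chain of inequalities above can be sanity-checked numerically (a sketch, not a proof; the sinusoidal matrix below is just one arbitrary instance of the hypothesis, with made-up amplitudes and frequencies):

```python
import numpy as np

def f(M):
    """Minimum absolute value over the eigenvalues of a matrix M."""
    return np.min(np.abs(np.linalg.eigvals(M)))

def A(t, n=3, seed=0):
    """An arbitrary matrix with bounded sinusoidal entries (illustration only)."""
    rng = np.random.default_rng(seed)      # same seed => fixed A_ij, omega_ij
    amp = rng.uniform(-1.0, 1.0, (n, n))   # amplitudes A_ij
    freq = rng.uniform(0.5, 2.0, (n, n))   # frequencies omega_ij
    return amp * np.sin(freq * t)

n = 3
for t in np.linspace(0.1, 5.0, 50):
    M = A(t, n)
    det_sq = np.linalg.det(M) ** 2
    # det(A)^2 = prod_i |lambda_i|^2 >= (min_i |lambda_i|)^(2n) = f(t)^(2n)
    assert det_sq >= f(M) ** (2 * n) - 1e-9
```

(Note that $\det A$ is real, so $(\det A)^2 = |\det A|^2 = \prod_i |\lambda_i|^2$ even when eigenvalues come in complex-conjugate pairs.)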

Arastas
  • 2,329
  • Consider $A(t)=\begin{pmatrix} t &0\\ 0 &1\end{pmatrix}.$ Note that $f(t)=\begin{cases}t, &0<t<1;\\ 1, & t\ge 1;\end{cases}$ is not differentiable at $t=1.$ – mfl Aug 30 '17 at 13:21
  • I doubt that this is true, since eigenvalues are not stable under perturbation. See my answer here after "Remark:[...]". – P. Siehr Aug 30 '17 at 13:21
  • If you follow a specific eigenvalue / -vector (whatever that means) or look at the entire collection of eigenvalues / -vectors as one (whatever that means), then at least it's a little more well-behaved. – Arthur Aug 30 '17 at 13:52
  • Next time ask "what you really want to know" up front - or at least as part of the more general question. See https://meta.stackexchange.com/questions/66377/what-is-the-xy-problem – Ethan Bolker Aug 30 '17 at 14:05
  • For Lipschitz, see P. Siehr's link. The smallest eigenvalue of $\begin{bmatrix} 0 & \varepsilon \\ 1 & 0 \end{bmatrix}$ is not a Lipschitz function of $\varepsilon$. – Ian Aug 30 '17 at 14:06
  • What are $t_a$ and $t_b$? Your last question seems to be much easier than your original question. The matrix determinant is certainly a continuous function of the matrix elements, and the absence of zero eigenvalues means that at some point the determinant is nonzero, hence its square is greater than zero. By continuity of the determinant, this holds in some neighbourhood of that point. – Evgeny Aug 30 '17 at 14:17
  • @Evgeny, I have updated the question, sorry for the poor formulations. I guess I also have to restrict how fast the determinant can increase and decay. – Arastas Aug 30 '17 at 14:39
  • @Arastas Unless anything additional is said about $t_a$ and $t_b$, I still see no problem with using continuity around the point $t_0$ for proving your statement. – Evgeny Aug 30 '17 at 14:49
  • @Evgeny, yes, I think you are right. Thanks! – Arastas Aug 30 '17 at 14:57
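On the Lipschitz question in the comments: Ian's example can be checked with a short numerical sketch. The eigenvalues of his matrix are $\pm\sqrt{\varepsilon}$, so the smallest eigenvalue magnitude is $\sqrt{|\varepsilon|}$, whose difference quotient at $\varepsilon=0$ is unbounded:

```python
import numpy as np

def min_abs_eig(eps):
    """Smallest eigenvalue magnitude of [[0, eps], [1, 0]]."""
    M = np.array([[0.0, eps], [1.0, 0.0]])
    return np.min(np.abs(np.linalg.eigvals(M)))

# min |lambda| = sqrt(eps), so the quotient min_abs_eig(eps) / eps
# grows without bound as eps -> 0+: no Lipschitz constant works.
for eps in [1e-2, 1e-4, 1e-6]:
    print(eps, min_abs_eig(eps) / eps)   # ratios about 10, 100, 1000
```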

2 Answers


Differentiability is hopeless: both the minimum and the absolute value are non-differentiable operations, so this is a lot to expect. Example: $$ \begin{pmatrix} \sin t & 0 \\ 0 & \cos t \end{pmatrix} \text{.} $$

We plot the minimum-magnitude eigenvalue for this matrix

[Plot: minimum eigenvalue magnitude]

as well as the absolute values of the eigenvalues (tracking which is which by color).

[Plot: eigenvalue magnitudes, colored by branch]
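The corner is easy to reproduce (a NumPy sketch in place of the Mathematica plot): for this matrix $f(t)=\min(|\sin t|,|\cos t|)$, and the one-sided difference quotients disagree at the crossing $t=\pi/4$:

```python
import numpy as np

def f(t):
    """Minimum eigenvalue magnitude of diag(sin t, cos t)."""
    M = np.diag([np.sin(t), np.cos(t)])
    return np.min(np.abs(np.linalg.eigvals(M)))

t0 = np.pi / 4            # |sin t| and |cos t| cross here
h = 1e-6
left = (f(t0) - f(t0 - h)) / h    # slope of sin t:  cos(pi/4), about +0.707
right = (f(t0 + h) - f(t0)) / h   # slope of cos t: -sin(pi/4), about -0.707
print(left, right)   # the one-sided derivatives differ: f has a corner
```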

Update:

From Eigenvalues of matrix with entries that are continuous functions, the eigenvalues of your matrix are continuous functions of $t$. Consequently, since the $\lambda_i\{A(t_0)\}$ are simultaneously nonzero, your product $\prod_{i=1}^n |\lambda_i\{A(t)\}|^2$ is positive on a neighbourhood of $t_0$.

Alternatively, the determinant is a continuous function of the entries, which are continuous functions, so the determinant is a continuous function of $t$. Since the determinant is the product of the eigenvalues, which are simultaneously nonzero at $t_0$, the determinant is nonzero on a neighbourhood of $t_0$.

Either way, the integrand is everywhere nonnegative and we have shown that it is positive on an interval, so the integral is positive.
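For the diagonal example above, the argument can be illustrated numerically (a sketch; the interval and the Riemann-sum quadrature are arbitrary choices):

```python
import numpy as np

def det_sq(t):
    # det(diag(sin t, cos t))^2 = (sin t * cos t)^2, continuous and >= 0
    return (np.sin(t) * np.cos(t)) ** 2

# At t0 = pi/4 both eigenvalues are nonzero, so the integrand is
# positive on a neighbourhood of t0, making the integral positive.
ta, tb = 0.0, np.pi / 2
s = np.linspace(ta, tb, 10001)
integral = np.mean(det_sq(s)) * (tb - ta)   # simple Riemann approximation
print(integral > 0)   # True (exact value is pi/16)
```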

Update 2:

I have to go do other work, but this last version seems plausible. The determinant is a multilinear polynomial in the entries of the matrix (no variable appears with power $>1$ in any term), so the derivative of the determinant will be some magnificently huge polynomial in the zeroth and first derivatives of the matrix entries...

Eric Towers
  • 67,037

No. The absolute value is not differentiable at $0$. For example, take $n=1$ and $A(t)=t$.

Ron
  • 276