
Given the equation \begin{equation}\det(\lambda^2 I+ B \lambda +K )=0,\end{equation} where $I,B,K\in \mathbb R^{m \times m}$, $B$ and $K$ are symmetric matrices with no zero eigenvalues, and $B>0$. Let $n_+(A)$ and $n_-(A)$ denote the number of eigenvalues of $A$ with positive and negative real parts, respectively. Is the number of roots of the above equation with positive real part equal to $n_-(B)+n_-(K)$, and the number of roots with negative real part equal to $n_+(B)+n_+(K)$?

My trial was as follows: if $K$ is positive definite, then $$\det(\lambda^2 I+ B \lambda +K )=0 \implies \exists v\in\mathbb C^m: v^H(\lambda^2 I+ B \lambda +K)v=0 \implies \lambda^2 v^Hv+ (v^HBv) \lambda +v^HKv=0.$$ All the coefficients are real and positive, so the real part of $\lambda$ must be negative. A similar argument applies when $K<0$; the remaining case is when $K$ is indefinite.


Edit

As Ben Grossmann showed, the equation can be seen as the characteristic equation of $$ J = \pmatrix{0 & I \\ -K & -B}. $$ Since $B>0$, $J$ cannot have a nonzero purely imaginary eigenvalue: if $\lambda = ib$ with $b \neq 0$ were an eigenvalue, then $\exists v \neq 0: v^H((ib)^2 I+ B (ib) +K)v=0$, and taking imaginary parts gives $b\,v^HBv=0 \iff b=0$, a contradiction. (And $\lambda = 0$ is excluded because $K$ has no zero eigenvalues.)
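
As a numerical sanity check (not part of the proof; a minimal sketch assuming NumPy), we can verify that the eigenvalues of $J$ are precisely the roots of $\det(\lambda^2 I + B\lambda + K)=0$:

```python
import numpy as np

rng = np.random.default_rng(0)
m = 4

A = rng.standard_normal((m, m))
B = A @ A.T + np.eye(m)            # symmetric positive definite
S = rng.standard_normal((m, m))
K = (S + S.T) / 2                  # symmetric, generically indefinite

# Companion linearization: eig(J) are the roots of det(lam^2 I + lam B + K) = 0.
J = np.block([[np.zeros((m, m)), np.eye(m)],
              [-K,               -B]])

for lam in np.linalg.eigvals(J):
    residual = np.linalg.det(lam**2 * np.eye(m) + lam * B + K)
    assert abs(residual) < 1e-8    # det vanishes at every eigenvalue of J
```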

Let $n_0(A)$ denote the number of zero eigenvalues of $A$.

Lemma 1: $n_0(K)=n_0(J)$.

We observe that the reduced echelon form of $J$ is $$ J_{red} = \pmatrix{I & 0\\ 0 & K}, $$ and row reduction preserves rank, so $\operatorname{rank}(J) = m + \operatorname{rank}(K)$ and hence $n_0(K)=n_0(J)$.
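
A quick numerical illustration of Lemma 1 (a sketch assuming NumPy; this checks the numerical rank, so the usual floating-point tolerance caveats apply):

```python
import numpy as np

rng = np.random.default_rng(0)
m = 4

A = rng.standard_normal((m, m))
B = A @ A.T + np.eye(m)
S = rng.standard_normal((m, m))
K = (S + S.T) / 2
K -= np.linalg.eigvalsh(K)[0] * np.eye(m)   # shift so K has a zero eigenvalue

J = np.block([[np.zeros((m, m)), np.eye(m)],
              [-K,               -B]])

# rank(J) = m + rank(K), hence dim ker J = dim ker K, i.e. n_0(J) = n_0(K).
assert np.linalg.matrix_rank(J) == m + np.linalg.matrix_rank(K)
```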

Let $G(t)=K+t I$ and let $$ J(t)=\pmatrix{0 & I\\ -G(t) & -B}. $$ Observe that as $t\rightarrow +\infty$, $G(t)>0 \implies n_-(J(t))=2m$, and as $t\rightarrow -\infty$, $G(t)<0 \implies n_-(J(t))=n_+(J(t))=m$. Let the eigenvalues of $J(t)$ be $\{\lambda_i(t)\}$.

As $t$ goes from $-\infty$ to $+\infty$, the eigenvalues of $G(t)$ pass through zero $m$ times (counted with multiplicity), which is exactly when they change sign. By Lemma 1, the eigenvalues of $J(t)$ must also become zero $m$ times, and precisely at those values of $t$ where an eigenvalue of $G(t)$ is zero. It is not difficult to see that an eigenvalue of $J(t)$ becomes zero exactly when one of the positive eigenvalues changes sign to negative: if not, one of the positive eigenvalues of $J(t)$ would never change sign and would remain positive as $t \rightarrow +\infty$, a contradiction.

So every time an eigenvalue of $G(t)$ changes sign from negative to positive, an eigenvalue of $J(t)$ changes sign from positive to negative. Hence, for every $t$ at which $G(t)$ is nonsingular, $n_-(J(t))=m+n_+(G(t))=n_+(B)+n_+(G(t))$; taking $t=0$ yields $n_-(J)=n_+(B)+n_+(K)$, as required.
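
The conclusion can also be stress-tested numerically (a sketch assuming NumPy; with $B>0$ we have $n_+(B)=m$, so the claim reads $n_-(J) = m + n_+(K)$):

```python
import numpy as np

rng = np.random.default_rng(0)
m = 4

for _ in range(100):
    A = rng.standard_normal((m, m))
    B = A @ A.T + np.eye(m)            # B > 0, so n_+(B) = m
    S = rng.standard_normal((m, m))
    K = (S + S.T) / 2                  # symmetric, generically indefinite
    J = np.block([[np.zeros((m, m)), np.eye(m)], [-K, -B]])
    n_minus_J = int(np.sum(np.linalg.eigvals(J).real < 0))
    n_plus_K = int(np.sum(np.linalg.eigvalsh(K) > 0))
    assert n_minus_J == m + n_plus_K   # n_-(J) = n_+(B) + n_+(K)
```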

Is there something wrong with this proof?

abc1455
  • It might be helpful to note that $\det(\lambda^2 I + B\lambda + K) = 0$ holds iff $\lambda$ is an eigenvalue of the block matrix $$ M = \pmatrix{B & -K\\ I & 0} $$ – Ben Grossmann Dec 31 '20 at 22:27
  • If $K$ is positive definite, the block-matrix $M$ is similar to the potentially nicer matrix $$ \pmatrix{I \\ & K^{1/2}} \pmatrix{B&-K\\ I&0} \pmatrix{I \\ & K^{1/2}}^{-1} = \pmatrix{B & -K^{1/2}\\ K^{1/2} & 0}. $$ With that, we can at least see that $M$ will only have eigenvalues with non-negative real part when $B$ and $K$ are positive definite. – Ben Grossmann Dec 31 '20 at 22:36
  • If $K$ is negative definite, a similar process results in a new matrix that is symmetric. So if $K$ is negative definite, then all eigenvalues of $M$ must be real, and the number of negative eigenvalues depends on the definiteness of the Schur complement of $B$ within the resulting block matrix. – Ben Grossmann Dec 31 '20 at 22:40
  • What about the case when $K$ is indefinite? – abc1455 Dec 31 '20 at 22:49
  • No idea, that's all I've been able to come up with. Also, my result for negative definite $K$ seems to disagree with what you've said about the commuting case. – Ben Grossmann Dec 31 '20 at 22:58
  • you are correct – abc1455 Dec 31 '20 at 23:00
  • @BenGrossmann I edited my answer. Do you think it's a valid proof? – abc1455 Jan 02 '21 at 09:15
  • Interesting! Your proof definitely works in the case that $K$ has distinct eigenvalues. For the case of repeating eigenvalues, there aren't enough sign-changes to guarantee that we have the same number of positive/negative eigenvalues for each value of $t$, but I think this case can be handled using limits and continuity (i.e. the continuous dependence of the eigenvalues of a matrix on its entries). – Ben Grossmann Jan 02 '21 at 13:44
  • @BenGrossmann I think it also works for the case of repeated eigenvalues: notice that Lemma 1 says that if $n$ eigenvalues of $K$ are zero, then $n$ eigenvalues of $J$ are also zero, so the number of sign changes is counted with multiplicity. – abc1455 Jan 02 '21 at 19:07
  • It is possible for an eigenvalue to become zero without changing sign. However, your proof works because in order for an eigenvalue to change sign, it must become zero. – Ben Grossmann Jan 02 '21 at 19:20
  • @BenGrossmann Indeed, if one eigenvalue became zero without changing sign, then some positive eigenvalue would never change sign, contradicting the fact that all positive eigenvalues must change sign. – abc1455 Jan 02 '21 at 19:22
  • I see, we have an upper bound on the total number of sign changes because of your statement regarding the multiplicity of the zero eigenvalue. So yes, I think your proof works in the general case. – Ben Grossmann Jan 02 '21 at 19:25

1 Answer


A partial (but non-trivial) answer: if $K$ is negative definite, the number of positive/negative roots can be determined as follows.

First, observe that $\det(\lambda^2 I + \lambda B + K) = \det(\lambda I - M_1)$, where $$ M_1 = \pmatrix{-B & -K\\I & 0}. $$ This holds regardless of the definiteness of $K$. If $K$ is negative definite, then there exists a unique positive definite square root $P$ of $-K$ (i.e. $P^2 = -K$). We note that the matrix $M_1$ is similar to $$ M_2 = \pmatrix{I\\ & P} \pmatrix{-B&-K\\I & 0}\pmatrix{I \\ & P}^{-1} = \pmatrix{-B & P\\P & 0}. $$ This matrix is symmetric and thus has real eigenvalues. Furthermore, since $B$ is invertible, the Haynsworth inertia additivity formula gives $$ n_\pm(M_2) = n_{\pm}(-B) + n_{\pm}(C), $$ where $C$ denotes the Schur complement $$ C = M_2/(-B) = PB^{-1}P. $$ Note that $n_{\pm}(PB^{-1}P) = n_{\pm}(B^{-1}) = n_{\pm}(B)$ by Sylvester's law of inertia. With that, we can conclude that, regardless of the definiteness of $B$, the number of roots with positive real part and the number with negative real part are both equal to $m$.
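
This case is easy to confirm numerically (a minimal sketch assuming NumPy; here $B$ is taken symmetric indefinite to exercise the "regardless of the definiteness of $B$" claim):

```python
import numpy as np

rng = np.random.default_rng(1)
m = 4

S = rng.standard_normal((m, m))
B = (S + S.T) / 2                      # symmetric, generically indefinite and invertible
Q = rng.standard_normal((m, m))
K = -(Q @ Q.T) - np.eye(m)             # negative definite

M1 = np.block([[-B, -K], [np.eye(m), np.zeros((m, m))]])
lam = np.linalg.eigvals(M1)

assert np.allclose(lam.imag, 0, atol=1e-8)                  # all roots are real
assert np.sum(lam.real > 0) == m and np.sum(lam.real < 0) == m   # m of each sign
```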


For the case where $K$ is positive definite, there is a positive definite (and symmetric) $P$ for which $K = P^2$. Applying the same manipulation as before leaves us with the matrix $$ M_2 = \pmatrix{-B & -P\\ P & 0}. $$ We note that the symmetric part of this matrix, $$ M_S = \frac 12(M_2 + M_2^T) = \pmatrix{-B&0\\0 & 0}, $$ is negative semidefinite whenever $B$ is positive definite. It follows that $M_2$ will only have eigenvalues with non-positive real part in the case that $B$ is positive definite. In fact, the real parts are strictly negative: if $\lambda$ were purely imaginary with unit eigenvector $v = (v_1, v_2)$, then $\operatorname{Re}(v^H M_2 v) = -v_1^H B v_1 = 0$ would force $v_1 = 0$, and then $Pv_2 = 0$ would force $v_2 = 0$.
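
Numerically (same sketch style, assuming NumPy), with both $B$ and $K$ positive definite every root should land in the open left half-plane:

```python
import numpy as np

rng = np.random.default_rng(2)
m = 4

A = rng.standard_normal((m, m)); B = A @ A.T + np.eye(m)   # B > 0
C = rng.standard_normal((m, m)); K = C @ C.T + np.eye(m)   # K > 0

M1 = np.block([[-B, -K], [np.eye(m), np.zeros((m, m))]])

# All roots of det(lam^2 I + lam B + K) = 0 have strictly negative real part.
assert np.linalg.eigvals(M1).real.max() < 0
```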


Thoughts on the general case:

For the general case, suppose $B$ is positive definite, and let $P$ be such that $PKP^T$ is of the form $\operatorname{diag}(I_{n_+},-I_{n_-})$. Then

$$ M_2 = \pmatrix{P\\& P^{-T}} \pmatrix{-B & -K\\ I & 0} \pmatrix{P \\ & P^{-T}}^{-1} = \pmatrix{-PBP^{-1} & -PKP^T\\ (PP^T)^{-1} & 0} $$
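
For what it's worth, this similarity can be checked numerically (a sketch assuming NumPy; here $P$ is built from the eigendecomposition of $K$, which is one concrete choice):

```python
import numpy as np

rng = np.random.default_rng(3)
m = 4

A = rng.standard_normal((m, m)); B = A @ A.T + np.eye(m)   # B > 0
S = rng.standard_normal((m, m)); K = (S + S.T) / 2         # generically indefinite

# Build P with P K P^T = diag(I_{n+}, -I_{n-}) from K = Q diag(w) Q^T.
w, Q = np.linalg.eigh(K)
order = np.argsort(-w)                                     # positive eigenvalues first
w, Q = w[order], Q[:, order]
P = np.diag(1 / np.sqrt(np.abs(w))) @ Q.T
D = P @ K @ P.T                                            # = diag(+1, ..., -1, ...)

T = np.block([[P, np.zeros((m, m))],
              [np.zeros((m, m)), np.linalg.inv(P).T]])
M1 = np.block([[-B, -K], [np.eye(m), np.zeros((m, m))]])

expected = np.block([[-P @ B @ np.linalg.inv(P), -D],
                     [np.linalg.inv(P @ P.T), np.zeros((m, m))]])
assert np.allclose(T @ M1 @ np.linalg.inv(T), expected)
```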

Ben Grossmann
  • It seems that either the question or my answer is wrong since this result does not agree with the expected $n_-(M) = n_-(B) + n_-(K) = m+n_-(B)$. – Ben Grossmann Dec 31 '20 at 22:55
  • No, your answer is correct; my question should have been for $B$ positive definite – abc1455 Dec 31 '20 at 23:01
  • @abc1455 Thanks for verifying. For the general case, I would suspect that we could get somewhere by applying Sylvester's law of inertia to $K$ and in that way somehow combine the observations for the positive and negative definite cases. – Ben Grossmann Dec 31 '20 at 23:08
  • I tried to simultaneously diagonalize $B$ and $K$, but the problem remained. The equation would be $\det(\lambda^2 SS^T+ \lambda I + \Lambda_K)=0$ where $\Lambda_K$ is diagonal; I then applied Sylvester's law of inertia to $\Lambda_K$, but I couldn't continue because of the term $S S^T$. – abc1455 Dec 31 '20 at 23:14
  • @abc1455 It's definitely tricky. By the way, your conclusion for $K$ positive definite is the opposite of what it should be: a quadratic with positive coefficients has roots with negative real part. – Ben Grossmann Dec 31 '20 at 23:23