4

I am trying to prove the following statement:

Let $A,B\in M(n,\mathbb{R})$.

If $B$ is positive definite and $(A-B)$ is non-negative definite, then $\det(A-\lambda B)=0$ has all its roots $\lambda\geqslant1$ and conversely, if all roots $\lambda\geqslant 1$, then $(A-B)$ is non-negative definite.

If $(A-B)$ is n.n.d and $B$ is p.d, then I have $x^\top(A-B)x\geqslant0$ for all $x\in\mathbb{R}^n$.

This implies $x^\top Ax\geqslant x^\top Bx>0$ for all $x\ne0$, so that $A$ is p.d.

Moreover, as $B$ is p.d, $B$ is nonsingular.

So, $\det(A-\lambda B)=0\implies\det((AB^{-1}-\lambda I)B)=0$

$\qquad\qquad\qquad\qquad\quad\implies\det(AB^{-1}-\lambda I)=0$ , as $\det(B)\ne0$

Thus $\lambda$ is an eigenvalue of the matrix $AB^{-1}$.

Now for the eigenvector $x\ne0$ corresponding to $\lambda$ we have,

$(AB^{-1})x=\lambda x\implies(AB^{-1}-I)x=(\lambda-1)x$.

If I can show that $AB^{-1}-I$ is n.n.d given that both $A$ and $B$ are p.d, then that would possibly imply $\lambda-1\geqslant0$ and I am done. But I am not sure if this is true or not.

In a different approach using the fact that a p.d matrix can be expressed as $D^\top D$ for some nonsingular matrix $D$, I was able to show that $AB^{-1}-I=PQ$ for some n.n.d matrix $P$ and p.d matrix $Q$. Does that help me conclude that $AB^{-1}-I$ is indeed n.n.d?

Any simpler or alternate approach is welcome.
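(As a quick numerical sanity check of the statement itself, not a proof: for a $2\times2$ example of my own choosing with $B$ p.d. and $A-B$ n.n.d., the determinant $\det(A-\lambda B)$ expands into a quadratic in $\lambda$ whose roots can be computed directly. The matrices below are hypothetical.)

```python
import math

def det2(M):
    """Determinant of a 2x2 matrix."""
    return M[0][0]*M[1][1] - M[0][1]*M[1][0]

def pencil_roots(A, B):
    """Roots of det(A - t*B) = 0 for 2x2 matrices.

    Expanding the determinant gives
    det(A) - t*(a11*b22 + a22*b11 - a12*b21 - a21*b12) + t^2 * det(B),
    a quadratic in t (det(B) != 0 since B is p.d.).
    """
    c2 = det2(B)
    c1 = -(A[0][0]*B[1][1] + A[1][1]*B[0][0]
           - A[0][1]*B[1][0] - A[1][0]*B[0][1])
    c0 = det2(A)
    d = math.sqrt(c1*c1 - 4*c2*c0)  # real for this symmetric-definite pencil
    return sorted([(-c1 - d)/(2*c2), (-c1 + d)/(2*c2)])

# Hypothetical example: B p.d., A - B = [[1,1],[1,1]] n.n.d.
B = [[2.0, 0.0], [0.0, 1.0]]
A = [[3.0, 1.0], [1.0, 2.0]]
print(pencil_roots(A, B))  # both roots are >= 1
```

Here $\det(A-\lambda B)=2\lambda^2-7\lambda+5=(2\lambda-5)(\lambda-1)$, so the roots are $1$ and $2.5$, both at least $1$ as the statement predicts.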

StubbornAtom
  • 17,052

5 Answers

2

By shifting $A$ and $\lambda$, you want to show that if $B$ is positive definite, then $A$ is nonnegative definite iff all the roots of $\det(A-\lambda B)$ are nonnegative.

Since $B$ is positive definite, it admits a Cholesky factorization $B=LL^{\dagger}$ where $L$ and $L^{\dagger}$ are invertible. Now, the roots of $\det(A-\lambda B)=\det(A-\lambda LL^{\dagger})$ are precisely the roots of $\det(\lambda I-L^{-1}A(L^{\dagger})^{-1})$, and $A$ is nonnegative definite iff $C\stackrel{\text{def}}{=}L^{-1}A(L^{\dagger})^{-1}$ is. Consequently, it suffices to show that $C$ is nonnegative definite iff all the roots of its characteristic polynomial $\det(\lambda I-C)$ are nonnegative. The roots of that polynomial are exactly the eigenvalues of $C$, so we need to know that $C$ is nonnegative definite iff all its eigenvalues are nonnegative. This last equivalence is a standard result, obtained by considering the quadratic form $x^{\dagger}Cx$ in a diagonalizing basis of $C$.
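A small numerical illustration of this reduction (the matrices are my own hypothetical choices, not from the answer): with $B=LL^\top$, the eigenvalues of $C=L^{-1}A(L^\top)^{-1}$ coincide with the roots of $\det(A-\lambda B)=0$.

```python
import math

def chol2(B):
    """Cholesky factor L (lower triangular) of a 2x2 s.p.d. matrix: B = L L^T."""
    l11 = math.sqrt(B[0][0])
    l21 = B[1][0] / l11
    l22 = math.sqrt(B[1][1] - l21*l21)
    return [[l11, 0.0], [l21, l22]]

def matmul2(X, Y):
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def eig2_sym(C):
    """Eigenvalues of a symmetric 2x2 matrix from trace and determinant."""
    tr, det = C[0][0] + C[1][1], C[0][0]*C[1][1] - C[0][1]*C[1][0]
    d = math.sqrt(tr*tr - 4*det)
    return sorted([(tr - d)/2, (tr + d)/2])

# Hypothetical example: B p.d., A p.d.
B = [[2.0, 0.0], [0.0, 1.0]]
A = [[3.0, 1.0], [1.0, 2.0]]
L = chol2(B)
# Inverse of a lower-triangular 2x2 matrix, written out explicitly
Linv = [[1/L[0][0], 0.0], [-L[1][0]/(L[0][0]*L[1][1]), 1/L[1][1]]]
LinvT = [[Linv[0][0], Linv[1][0]], [Linv[0][1], Linv[1][1]]]
C = matmul2(matmul2(Linv, A), LinvT)  # C = L^{-1} A (L^T)^{-1}, symmetric
print(eig2_sym(C))  # approximately [1.0, 2.5]: the roots of det(A - t B) = 0
```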

K B Dave
  • 7,912
2

We assume that $A,B$ are real symmetric and $B>0$.

$A-B\geq 0\Leftrightarrow B^{-1/2}AB^{-1/2}\geq I\Leftrightarrow \inf(\operatorname{spectrum}(B^{-1/2}AB^{-1/2}))\geq 1$.

$\det(A-\lambda B)=0\Leftrightarrow \det(B^{-1/2}AB^{-1/2}-\lambda I)=0\Leftrightarrow \lambda\in \operatorname{spectrum}(B^{-1/2}AB^{-1/2})$.

Finally $A-B\geq 0\Leftrightarrow$ the roots of $\det(A-\lambda B)=0$ are $\geq 1$.

Note that $\operatorname{spectrum}(B^{-1/2}AB^{-1/2})=\operatorname{spectrum}(AB^{-1})$.
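(The last remark can be checked numerically on a small example of my own choosing, with $B$ diagonal so that $B^{-1/2}$ is explicit; these matrices are hypothetical, not from the answer.)

```python
import math

def eig2(M):
    """Eigenvalues of a 2x2 matrix with real spectrum, via trace/determinant."""
    tr, det = M[0][0] + M[1][1], M[0][0]*M[1][1] - M[0][1]*M[1][0]
    d = math.sqrt(tr*tr - 4*det)
    return sorted([(tr - d)/2, (tr + d)/2])

def matmul2(X, Y):
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Hypothetical example: B = diag(2, 1), so B^{-1} and B^{-1/2} are explicit
B_inv = [[0.5, 0.0], [0.0, 1.0]]
B_inv_half = [[1/math.sqrt(2), 0.0], [0.0, 1.0]]
A = [[3.0, 1.0], [1.0, 2.0]]

sym = matmul2(matmul2(B_inv_half, A), B_inv_half)  # B^{-1/2} A B^{-1/2}
nonsym = matmul2(A, B_inv)                         # A B^{-1}
print(eig2(sym), eig2(nonsym))  # the two spectra agree
```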

0

$$\begin{array}{rl} p (s) := \det \left( \mathrm A - s \mathrm B \right) &= \det \left( (\mathrm A - \mathrm B) - (s-1) \,\mathrm B \right)\\ &= \det \left( \mathrm B^{\frac 12} \left( \mathrm B^{-\frac 12} (\mathrm A - \mathrm B) \,\mathrm B^{-\frac 12} - (s-1) \,\mathrm I \right) \mathrm B^{\frac 12} \right)\\ &= \det \left( \mathrm B^{\frac 12} \right) \cdot \det \left( \mathrm B^{-\frac 12} (\mathrm A - \mathrm B) \,\mathrm B^{-\frac 12} - (s-1) \,\mathrm I \right) \cdot \det \left( \mathrm B^{\frac 12} \right)\\ &= \underbrace{\det \left( \mathrm B \right)}_{> 0} \cdot \underbrace{\det \left( \mathrm B^{-\frac 12} (\mathrm A - \mathrm B) \,\mathrm B^{-\frac 12} - (s-1) \,\mathrm I \right)}_{=: q (s)} \end{array}$$

Thus, $p$ and $q$ have the same roots, namely,

$$s = 1 + \underbrace{\lambda \left( \mathrm B^{-\frac 12} (\mathrm A - \mathrm B) \,\mathrm B^{-\frac 12} \right)}_{\geq 0} \color{blue}{\geq 1}$$

because if $\rm A - B \succeq O$ and $\rm B \succ O$, then $\mathrm B^{-\frac 12} (\mathrm A - \mathrm B) \,\mathrm B^{-\frac 12} \succeq \mathrm O$.
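Numerically, with a hypothetical $2\times2$ example of my own choosing (diagonal $\mathrm B$ so that $\mathrm B^{-\frac12}$ is explicit), the shift $s = 1 + \lambda\!\left(\mathrm B^{-\frac12}(\mathrm A - \mathrm B)\,\mathrm B^{-\frac12}\right)$ can be checked directly:

```python
import math

def eig2_sym(M):
    """Eigenvalues of a symmetric 2x2 matrix from trace and determinant."""
    tr, det = M[0][0] + M[1][1], M[0][0]*M[1][1] - M[0][1]*M[1][0]
    d = math.sqrt(max(tr*tr - 4*det, 0.0))  # guard tiny negative round-off
    return sorted([(tr - d)/2, (tr + d)/2])

def matmul2(X, Y):
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Hypothetical example: B = diag(2, 1) p.d., A - B = [[1,1],[1,1]] n.n.d.
B_inv_half = [[1/math.sqrt(2), 0.0], [0.0, 1.0]]
AmB = [[1.0, 1.0], [1.0, 1.0]]  # A - B, n.n.d. with eigenvalues 0 and 2

M = matmul2(matmul2(B_inv_half, AmB), B_inv_half)  # B^{-1/2}(A-B)B^{-1/2}
roots = [1 + ev for ev in eig2_sym(M)]  # s = 1 + lambda(M)
print(roots)  # every root of det(A - s B) = 0 is >= 1
```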

  • How do you arrive at $s=1+\lambda (B^{-1/2}(A-B)B^{-1/2})$? What is your $\lambda$? – StubbornAtom Mar 18 '18 at 19:14
  • Note that $q (s+1)$ is the characteristic polynomial of the matrix $\mathrm B^{-\frac 12} (\mathrm A - \mathrm B) \,\mathrm B^{-\frac 12}$. $\lambda$ is a function that returns any eigenvalue of the input matrix. I will think of a better notation. – Rodrigo de Azevedo Mar 18 '18 at 19:22
0

Let $\langle\cdot,\cdot\rangle$ be the usual inner product on $\mathbb{R}^n$. First assume that $B$ is positive definite and $A-B$ is non-negative definite (equivalently, positive semi-definite). This means that for all $x\in\mathbb{R}^n$ $$\langle (A-B)x,x\rangle\geqslant 0\Leftrightarrow \langle Ax,x\rangle\geqslant \langle Bx,x\rangle.$$

Next let $\lambda\in\mathbb{R}$ be such that $\det(A-\lambda B)=0$. Then there exists some vector $x^*\in\mathbb{R}^n\setminus\{0\}$ such that $(A-\lambda B)x^*=0$. This in turn yields $$0=\langle (A-\lambda B)x^*,x^*\rangle=\langle Ax^*,x^*\rangle-\lambda\langle Bx^*,x^*\rangle\\\geqslant\langle Bx^*,x^*\rangle-\lambda\langle Bx^*,x^*\rangle=(1-\lambda)\langle Bx^*,x^*\rangle.$$ Therefore we have $$(1-\lambda)\langle Bx^*,x^*\rangle\leqslant 0\Rightarrow \lambda\geqslant 1,$$ since $\langle Bx^*,x^*\rangle>0$ whenever $x^*\neq 0$. This proves one direction.

Now suppose $B$ is positive definite and every root $\lambda$ of $\det(A-\lambda B)=0$ satisfies $\lambda\geqslant 1$. Let $S:=\{\lambda\in\mathbb{R}:\det(A-\lambda B)=0\}$. For any $\lambda\in S$ the matrix $A-\lambda B$ is singular, so there exists some $x^*_{\lambda}\neq 0$ such that $(A-\lambda B)x^*_{\lambda}=0$. Therefore $$0=\langle (A-\lambda B)x_{\lambda}^*,x^*_{\lambda}\rangle\Leftrightarrow \langle Ax_{\lambda}^*,x^*_{\lambda}\rangle=\lambda\langle Bx_{\lambda}^*,x^*_{\lambda}\rangle\geqslant \langle Bx_{\lambda}^*,x^*_{\lambda}\rangle,$$ since by assumption any such $\lambda\geqslant 1$. So for all $x^*_{\lambda}$ we have $$\langle Ax_{\lambda}^*,x^*_{\lambda}\rangle\geqslant \langle Bx_{\lambda}^*,x^*_{\lambda}\rangle\Leftrightarrow \langle (A- B)x_{\lambda}^*,x^*_{\lambda}\rangle\geqslant 0.$$ This shows that $A-B$ is positive semi-definite on the convex cone $$K:=\Big\{x\in\mathbb{R}^n: x=\sum_{\lambda\in S}\alpha_{\lambda}x^*_{\lambda},\ \alpha_{\lambda}\geqslant 0\Big\}\subseteq\mathbb{R}^n.$$ I suspect this is the best you can get in this other direction: you can have a vector $x\neq 0$ with $x\notin K$ and $(A-\lambda B)x\neq 0$ for all $\lambda\in S$, and then both $\langle (A-\lambda B)x,x\rangle<0$ and $\langle (A-\lambda B)x,x\rangle>0$ are possible.

Arian
  • 6,277
0

You can finish your own proof with your expression $AB^{-1}-I=PQ$ using this answer. Since $Q$ is p.d., $PQ$ has the same eigenvalues as $Q^{1/2}PQ^{1/2}$ (conjugate by $Q^{1/2}$). Since $x^\top Q^{1/2}PQ^{1/2} x = (Q^{1/2}x)^\top P(Q^{1/2}x) \geq 0$ (because $P$ is n.n.d.), the matrix $Q^{1/2}PQ^{1/2}$ is n.n.d., so all eigenvalues of $PQ=AB^{-1}-I$ are nonnegative, which is exactly what you need to conclude $\lambda-1\geqslant 0$.
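(To illustrate the similarity argument numerically, in the question's notation, with a hypothetical n.n.d. $P$ and p.d. diagonal $Q$ of my own choosing:)

```python
import math

def eig2(M):
    """Eigenvalues of a 2x2 matrix with real spectrum, via trace/determinant."""
    tr, det = M[0][0] + M[1][1], M[0][0]*M[1][1] - M[0][1]*M[1][0]
    d = math.sqrt(tr*tr - 4*det)
    return sorted([(tr - d)/2, (tr + d)/2])

def matmul2(X, Y):
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Hypothetical P (n.n.d.) and Q (p.d., diagonal so Q^{1/2} is explicit)
P = [[1.0, 1.0], [1.0, 1.0]]
Q = [[4.0, 0.0], [0.0, 1.0]]
Q_half = [[2.0, 0.0], [0.0, 1.0]]

PQ = matmul2(P, Q)                         # generally non-symmetric
sym = matmul2(matmul2(Q_half, P), Q_half)  # Q^{1/2} P Q^{1/2}, symmetric
print(eig2(PQ), eig2(sym))  # same eigenvalues, all nonnegative
```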

LinAlg
  • 19,822