Let $A$ and $B$ be two square matrices over a field such that $A+A^2B+B=0$. Is it true that $A^2+I$ is always invertible?

3 Answers

We have $A+(A^2+I)B=0$. Multiplying on the right by $A$:

$$A^2+(A^2+I)BA=0$$

We add $I$:

$$I+A^{2}+(A^2+I)BA=I$$ Factoring the left-hand side gives $(A^2+I)(I+BA)=I$. Hence $A^2+I$ is invertible, and its inverse is $I+BA$ (for square matrices, a one-sided inverse is automatically two-sided).
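
For a quick numerical sanity check (my addition, not part of the original answer): the sketch below generates a pair $(A,B)$ satisfying the hypothesis by solving $B=-(A^2+I)^{-1}A$ for a random $A$, then confirms that $I+BA$ inverts $A^2+I$. It assumes NumPy; this choice of $B$ is just one convenient way to realize the hypothesis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
I = np.eye(n)

# Choose B so that A + A^2 B + B = 0, i.e. B = -(A^2 + I)^{-1} A.
# (A^2 + I is invertible for a generic real A.)
B = -np.linalg.solve(A @ A + I, A)

assert np.allclose(A + A @ A @ B + B, 0)          # hypothesis holds
assert np.allclose((A @ A + I) @ (I + B @ A), I)  # claimed inverse works
```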

Kelenner

By taking transposes, this is equivalent to asking whether $A + BA^2 + B = 0$ implies that $A^2+I$ is invertible (transposing reverses products, and $A^2+I$ is invertible iff $(A^\top)^2+I = (A^2+I)^\top$ is). Suppose $(A^2+I)v = 0$ for some vector $v$. Then $0 = (A + BA^2 + B)v = Av + B(A^2+I)v = Av$, so $A^2v = 0$ as well, and therefore $v = (A^2+I)v = 0$. Hence $A^2+I$ has trivial kernel and is invertible.
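
To illustrate the transpose reduction concretely (my addition, not the answerer's), a minimal sketch assuming NumPy: build $(A,B)$ satisfying the original relation, then check that the transposed pair satisfies the reversed relation and that $(A^\top)^2+I$ has full rank.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))
I = np.eye(n)
B = -np.linalg.solve(A @ A + I, A)  # one way to satisfy A + A^2 B + B = 0

At, Bt = A.T, B.T
# Transposing reverses products: the relation becomes A^T + B^T (A^T)^2 + B^T = 0.
assert np.allclose(At + Bt @ At @ At + Bt, 0)
# (A^T)^2 + I = (A^2 + I)^T has trivial kernel, i.e. full rank.
assert np.linalg.matrix_rank(At @ At + I) == n
```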

Najib Idrissi

${\bf Hint}\ \ 1\!+\!aa\mid \color{#c00}a\,\Rightarrow\, 1\!+\!aa \mid 1 = 1\!+\!aa-\color{#c00}aa,\ $ e.g. applying this to the special case at hand:

$\quad\ \ \color{#0a0}{{-}(1\!+\!aa)b} = \color{#c00}{a},\ $ hence $\ 1 = 1\!+\!aa-\color{#c00}{a}a = 1\!+\!aa\color{#0a0}{+(1\!+\!aa)b}a=(1\!+\!aa)(1\!+\!ba) $

Remark $\ $ Essentially we exploit the universality of the Bezout identity $\,1\!+\!aa-\color{#c00}aa = 1.$
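
The universality can also be checked mechanically (my addition): a minimal sketch using SymPy's noncommutative symbols verifies the ring identity $(1+aa)(1+ba) = 1 + (a+aab+b)\,a$, which holds identically in the free ring, so the hypothesis $a+aab+b=0$ immediately yields $(1+aa)(1+ba)=1$.

```python
from sympy import symbols, expand

# Noncommuting indeterminates stand in for matrices in a free ring.
a, b = symbols('a b', commutative=False)

# Identity: (1 + aa)(1 + ba) - 1 == (a + aab + b) a, identically.
lhs = expand((1 + a*a) * (1 + b*a) - 1)
rhs = expand((a + a*a*b + b) * a)
assert lhs == rhs  # so a + aab + b = 0 forces (1+aa)(1+ba) = 1
```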

Bill Dubuque
  • Why $1 + aa \mid a \implies 1 + aa \mid 1$? (I'm not the downvoter.) – azimut Jul 11 '14 at 10:56
  • @azimut Since $\,1\!+\!aa\,$ divides $\,1\!+\!aa\,$ and $\,\color{#c00}a\,$, it divides $\,1\!+\!aa-\color{#c00}aa\,$; e.g. see the explicit proof of the special case above. – Bill Dubuque Jul 11 '14 at 11:27