Let $A$ and $B$ be two square matrices over a field such that $A+A^2B+B=0$. Is it true that $A^2+I$ is always invertible?
3 Answers
We have $A+(A^2+I)B=0$. Multiplying on the right by $A$:
$$A^2+(A^2+I)BA=0$$
We add $I$:
$$I=I+A^{2}+(A^2+I)BA=(A^2+I)(I+BA)$$ Hence $A^2+I$ is invertible, and its inverse is $I+BA$ (for square matrices over a field, a one-sided inverse is automatically two-sided).
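As a quick numerical sanity check (my addition, not part of the answer): for any $A$ with $A^2+I$ invertible, setting $B = -(A^2+I)^{-1}A$ gives $A + A^2B + B = A + (A^2+I)B = 0$, so we can generate instances of the hypothesis and test the claimed inverse. The names `rng` and `n` below are just illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
I = np.eye(n)

# A generic random A almost surely has A^2 + I invertible
# (no eigenvalues equal to +/- i).
A = rng.standard_normal((n, n))
# B := -(A^2+I)^{-1} A satisfies the hypothesis A + A^2 B + B = 0,
# since A + (A^2+I)B = A - A = 0.
B = -np.linalg.solve(A @ A + I, A)

assert np.allclose(A + A @ A @ B + B, 0)          # hypothesis holds
assert np.allclose((A @ A + I) @ (I + B @ A), I)  # I + BA is a right inverse
assert np.allclose((I + B @ A) @ (A @ A + I), I)  # ... and a left inverse
```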

See my answer for a very simple way to motivate this, including discovering the inverse (if need be). – Bill Dubuque Jan 03 '15 at 16:38
By taking the transpose, this is equivalent to asking whether $A + BA^2 + B = 0$ implies $A^2+I$ invertible. Suppose $(A^2+I)v = 0$ for some vector $v$. Then $0 = (A + BA^2 + B)v = Av + B(A^2+I)v = Av$; since $Av = 0$ forces $A^2v = 0$, we get $v = (A^2+I)v = 0$. Therefore $A^2+I$ has trivial kernel and is invertible.
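To spell out the transpose step (a routine computation, not in the original answer): writing $A' = A^{\mathsf T}$ and $B' = B^{\mathsf T}$,
$$0 = (A + A^2B + B)^{\mathsf T} = A^{\mathsf T} + B^{\mathsf T}(A^{\mathsf T})^2 + B^{\mathsf T} = A' + B'A'^2 + B',$$
and $A^2+I$ is invertible if and only if its transpose $A'^2+I$ is.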

${\bf Hint}\ \ 1\!+\!aa\mid \color{#c00}a\,\Rightarrow\, 1\!+\!aa \mid 1 = 1\!+\!aa-\color{#c00}aa,\ $ e.g. applying this to the special case at hand:
$\quad\ \ \color{#0a0}{{-}(1\!+\!aa)b} = \color{#c00}{a},\ $ hence $\ 1 = 1\!+\!aa-\color{#c00}{a}a = 1\!+\!aa\color{#0a0}{+(1\!+\!aa)b}a=(1\!+\!aa)(1\!+\!ba) $
Remark $\ $ Essentially we exploit the universality of the Bezout identity $\,1\!+\!aa-\color{#c00}aa = 1.$
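To illustrate that universality concretely (my addition, not part of the hint): the computation rests on the ring identity $(1\!+\!aa)(1\!+\!ba) = 1 + \bigl(a+(1\!+\!aa)b\bigr)a$, which holds in any ring, commutative or not, and the hypothesis makes the bracketed factor vanish. A minimal SymPy sketch with noncommuting symbols standing in for $A$ and $B$:

```python
from sympy import symbols, expand

# a, b do not commute, modeling the matrices A, B.
a, b = symbols('a b', commutative=False)

# The identity behind the hint: (1+aa)(1+ba) = 1 + (a + (1+aa)b)*a.
# Under the hypothesis a + (1+aa)b = 0, the right side collapses to 1.
lhs = expand((1 + a*a) * (1 + b*a))
rhs = expand(1 + (a + (1 + a*a)*b) * a)
assert lhs == rhs
```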

@aximut Since $\,1\!+\!aa\,$ divides $\,1\!+\!aa\,$ and $\,\color{#c00}a\,$, it divides $\,1\!+\!aa-\color{#c00}aa\,$, e.g. see the explicit proof of the special case above. – Bill Dubuque Jul 11 '14 at 11:27