
Suppose $A$ is a $2\times2$ matrix and $A=(-A)^T$. Prove that $A+cI$ is invertible for all $c\in\mathbb R$.

Nick
  • 1,231

3 Answers


$A=(-A)^T$ forces the diagonal entries of $A$ to vanish ($a=-a$) and the off-diagonal entries to be negatives of each other, so $A=\pmatrix{0&b\\-b&0},$ and hence $A+cI=\pmatrix{c&b\\-b&c}$,

so $\det (A+cI)=c^2+b^2\ne0$ as long as $b\ne0$ or $c\ne0$,

so $A+cI$ is invertible.
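Spelling out the determinant step, in case it helps:
$$\det\pmatrix{c&b\\-b&c}=c\cdot c-b\cdot(-b)=c^2+b^2.$$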

J. W. Tanner
  • 60,406

If $A= \begin{pmatrix} a & b \\ e & f \end{pmatrix}$, then $(-A)^T=\begin{pmatrix} -a & -e \\ -b & -f \end{pmatrix}$. Since $A= (-A)^T$, we have $a=-a$ and $f=-f$, so that $a=f=0$. We also have $b=-e$ and $e=-b$. Then $A= \begin{pmatrix} 0 & b \\ -b & 0 \end{pmatrix}$. But then $A+cI= \begin{pmatrix} c & b \\ -b & c \end{pmatrix}$. Taking the determinant gives $c^2+b^2>0$ so long as $b$ and $c$ are not both $0$, i.e., the caveat raised in the comment by Tanner on your question. But then the determinant is not zero, so the matrix is invertible. You can even give an explicit description of the inverse using the usual $2\times2$ inverse formula!
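To make that last remark concrete (a quick sketch, assuming $b$ and $c$ are not both zero): the $2\times2$ inverse formula gives
$$(A+cI)^{-1}=\frac{1}{c^2+b^2}\begin{pmatrix} c & -b \\ b & c \end{pmatrix},$$
and multiplying this against $A+cI$ indeed returns $I$.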

  • Can it be solved using the definition that says $AA^{-1}=I$? – Alejandro Garcia Jan 28 '20 at 02:53
  • @JeoshuaAlejandroLiceaGarcia I see no immediate way of going about that. For one, you are interested in the invertibility of $A+cI$, not $A$. To work with $AA^{-1}$, you need $A^{-1}$ to exist, which you are not told. It certainly does exist if $b \neq 0$, by our work above. But that is precisely the work needed to show it, and it essentially solves the problem, so there would be no need for the detour of showing $A$ is invertible (which it is, unless it is the zero matrix). – mathematics2x2life Jan 28 '20 at 03:20
  • I tried to write $(A+cI)X=I$, where there exists some matrix $X$ satisfying the definition – Alejandro Garcia Jan 28 '20 at 03:33
  • @JeoshuaAlejandroLiceaGarcia Again, that is the whole point. To even write down $(A+cI)X=I$, such a matrix $X$ has to exist, and if it does, it is the inverse of $A+cI$. But this means $A+cI$ is invertible, which is what you are trying to show. You cannot assume what you want to prove. Insofar as I am aware, the method I and J. W. Tanner present is the simplest possible argument. – mathematics2x2life Jan 28 '20 at 04:05

Assuming $k,c\in \mathbb R$ are not both $0$, write $A=\begin{pmatrix}0&k\\-k&0\end{pmatrix}$.

The eigenvalues of $A$ are $0,0$ or $\pm ki$. Hence the eigenvalues of $A+cI$ are $c,c$ or $c\pm ki$. So $\det(A+cI)=c^2$ or $c^2+k^2$, which is nonzero in either case.
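To spell out the eigenvalue claim (a short check, with $A$ as above): the characteristic polynomial is
$$\det(A-\lambda I)=\det\begin{pmatrix}-\lambda&k\\-k&-\lambda\end{pmatrix}=\lambda^2+k^2,$$
with roots $\pm ki$ (a double root $0$ when $k=0$). Since the determinant is the product of the eigenvalues, $\det(A+cI)=(c+ki)(c-ki)=c^2+k^2$, which vanishes only when $c=k=0$.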

Nitin Uniyal
  • 7,946