
I came across an interesting question in matrix theory. I have tried it, but I cannot find any way to approach it. I would appreciate help and suggestions. Thanks in advance.

Let $A$ be a real $n\times n$ matrix. We say that $A$ is a difference of two squares if there exist real $n\times n$ matrices $B$ and $C$ with $BC = CB = 0$ and $A = B^2 - C^2$. Now, if $A$ is a diagonal matrix, I have to show that it is a difference of two squares.

monalisa

3 Answers


Perhaps this may help you.

Let $a$ be a real number. If $a\geq 0$, we can write $a = b^2 - c^2$ with $c = 0$, and if $a < 0$, we can write $a = b^2 - c^2$ with $b = 0$. Thus, in general, $a = b^2 - c^2$ with $b$ and $c$ real and $bc = 0$. Now, if $A = \operatorname{diag}(a_1, a_2, \ldots, a_n)$ is a real diagonal matrix, write each $a_i = b_i^{2} - c_i^2$ as above, with $b_i c_i = 0$, and define $B = \operatorname{diag}(b_1, b_2, \ldots, b_n)$ and $C = \operatorname{diag}(c_1, c_2, \ldots, c_n)$. Then, by the way diagonal matrices multiply, we conclude that $A = B^2 - C^2$ and also that $BC = CB = 0$.
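
If it helps to see the construction concretely, here is a small numerical sketch in Python/NumPy (my own illustration, not part of the answer itself) that builds $B$ and $C$ entrywise from a sample diagonal $A$ and checks both conditions:

```python
import numpy as np

# Sample diagonal matrix; the choice of entries is arbitrary.
a = np.array([3.0, -2.0, 0.0])

# b_i = sqrt(a_i), c_i = 0 when a_i >= 0; b_i = 0, c_i = sqrt(-a_i) when a_i < 0.
b = np.where(a >= 0, np.sqrt(np.maximum(a, 0.0)), 0.0)
c = np.where(a < 0, np.sqrt(np.maximum(-a, 0.0)), 0.0)

A, B, C = np.diag(a), np.diag(b), np.diag(c)

assert np.allclose(B @ B - C @ C, A)                     # A = B^2 - C^2
assert np.allclose(B @ C, 0) and np.allclose(C @ B, 0)   # BC = CB = 0
```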

Srijan

Write your diagonal matrix $A$ as $A^+ + A^{-}$, where $A^+$ is the diagonal matrix that keeps the non-negative diagonal entries of $A$ (with zeros elsewhere) and $A^-$ is the diagonal matrix that keeps the negative diagonal entries of $A$. Now let $B = \sqrt{A^+}$ and $C = \sqrt{-A^-}$, and you are done.
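
For concreteness, a quick NumPy check of this splitting (the names `A_plus` and `A_minus` are mine, standing for $A^+$ and $A^-$):

```python
import numpy as np

A = np.diag([5.0, -1.0, 0.0, -4.0])   # arbitrary diagonal example
d = np.diag(A)

A_plus  = np.diag(np.where(d >= 0, d, 0.0))   # keeps the non-negative diagonal entries
A_minus = np.diag(np.where(d < 0,  d, 0.0))   # keeps the negative diagonal entries

B = np.sqrt(A_plus)     # entrywise sqrt = matrix sqrt for a non-negative diagonal matrix
C = np.sqrt(-A_minus)

assert np.allclose(B @ B - C @ C, A)                     # A = B^2 - C^2
assert np.allclose(B @ C, 0) and np.allclose(C @ B, 0)   # BC = CB = 0
```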


More generally, what are the matrices $A\in M_n(\mathbb{R})$ that can be written as $A=B^2-C^2$ with $B,C\in M_n(\mathbb{R})$ and $BC=CB=0$?

Clearly, the result is true when $A$ is diagonalizable over $\mathbb{C}$ and $\operatorname{spectrum}(A)\subset \mathbb{R}$: such an $A$ is similar over $\mathbb{R}$ to a real diagonal matrix, and the property is invariant under real similarity.

The result is false for any nilpotent Jordan block of dimension $\geq 2$.

$\textbf{Proposition 1}$. The result is true when $\operatorname{spectrum}(A)\subset \mathbb{R}^*$, that is, when every eigenvalue of $A$ is real and nonzero.

$\textbf{Proof}$. Working block by block in the real Jordan form, the problem reduces to the case $A=I_n+J$, where $J$ is the nilpotent Jordan block of dimension $n$: a block $\lambda I+J$ with $\lambda>0$ equals $\lambda(I+J/\lambda)$, and a block with $\lambda<0$ is handled by taking $B=0$ and applying the same argument to $-A$.

It suffices to take $C=0$ and $B=I+\tfrac{1}{2}J-\tfrac{1}{8}J^2+\cdots$, the binomial series for $\sqrt{I+J}$, which terminates because $J$ is nilpotent.
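
As a sanity check (my own sketch, not part of the proof), one can form this truncated binomial series for $\sqrt{I+J}$ numerically; since $J^n=0$, it is a finite sum:

```python
import numpy as np
from scipy.special import binom

n = 4
J = np.diag(np.ones(n - 1), k=1)    # nilpotent Jordan block of size n
A = np.eye(n) + J                   # A = I_n + J

# B = sum_{k=0}^{n-1} binom(1/2, k) J^k = I + (1/2)J - (1/8)J^2 + ...
B = sum(binom(0.5, k) * np.linalg.matrix_power(J, k) for k in range(n))

assert np.allclose(B @ B, A)        # so A = B^2 - C^2 with C = 0
```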

$\textbf{Proposition 2}$. The result is true for any $A\in GL_n(\mathbb{R})$.

$\textbf{Proof}$. Since the property is invariant under real similarity, we may assume that $A=\operatorname{diag}(U,V)$, where $U$ is invertible with no negative eigenvalues and $\operatorname{spectrum}(V)\subset(-\infty,0)$.

The result is true for $U$ (take $C=0$), because any real invertible matrix with no negative real eigenvalues always admits at least one real square root; see

When does a real matrix have a real square root?

According to Proposition 1, the result is also true for $V$ and we are done. $\square$
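
To illustrate the first case of the proof (again only a numerical sketch with an arbitrary example matrix): for an invertible matrix with no negative real eigenvalues one can compute a real square root, for instance with `scipy.linalg.sqrtm`:

```python
import numpy as np
from scipy.linalg import sqrtm

U = np.array([[2.0, 1.0],
              [0.0, 3.0]])      # eigenvalues 2 and 3, none negative

B = np.real_if_close(sqrtm(U))  # drop numerically negligible imaginary parts

assert np.allclose(B @ B, U)    # U = B^2 - C^2 with C = 0
```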