0

The title is the question: if $AB-BA=A$, show that $A$ is singular. I tried taking the trace of both sides and got that the trace of $A$ is zero. Now, to conclude that $A$ is singular, suppose $A$ is nonsingular; then multiplying both sides by the inverse of $A$, we get that $B$ is similar to $B+I$, which is impossible, since similar matrices have the same eigenvalues, whereas each eigenvalue of $B+I$ is one more than an eigenvalue of $B$.
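In symbols (spelling out the two steps, assuming the title equation $AB-BA=A$): taking traces gives
$$\operatorname{tr}(A)=\operatorname{tr}(AB)-\operatorname{tr}(BA)=0,$$
and if $A$ were invertible, then
$$AB=(B+I)A \quad\Longrightarrow\quad ABA^{-1}=B+I,$$
so $B$ would be similar to $B+I$.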

Does my solution make sense? Thanks for helping; I am using an Android phone, so I am not able to write mathematical symbols, as it takes too much time.

Myshkin
  • 35,974
  • How do you get that the trace of $A$ is zero? This does not follow from taking the trace on both sides. Furthermore, finite- and infinite-dimensional vector spaces require a very different treatment; are your matrices finite? – Rogelio Molina Jun 06 '15 at 05:47
  • Question edited – Myshkin Jun 06 '15 at 05:56
  • 1
    Your proof is now correct, provided we are sure that $B$ has eigenvalues. – Rogelio Molina Jun 06 '15 at 05:59
  • @user7530: That is not true, it depends on which field the matrices are defined over. Consider the matrix $\left( \begin{array}{cc} 0 & 1 \\ 1 & 0 \end{array} \right)$ over $\mathbb{R}$; it has no eigenvalues. – Rogelio Molina Jun 06 '15 at 06:06
  • This question has already been answered here. – Martin Argerami Jun 06 '15 at 06:09
  • @RogelioMolina Huh? $1$ is an eigenvalue with eigenvector $(1,1)$. – user7530 Jun 06 '15 at 06:10
  • One of the two $1$'s in Rogelio's matrix should have been a $-1$. – Martin Argerami Jun 06 '15 at 06:11
  • @MartinArgerami La Belle's argument works fine even when the matrix has no real eigenvalues; just look at the eigenvalue with maximum real part of $B$ and $B+I$. You can run into trouble over finite fields but surely if La Belle intended the question in such a setting he would have specified. – user7530 Jun 06 '15 at 06:13
  • 5
    Did you just change the meaning of the question after receiving answers? That's generally a bad idea (at least, it is on Stack Overflow). – user253751 Jun 06 '15 at 06:17
  • Indeed, one was supposed to be a $-1$, my bad. – Rogelio Molina Jun 06 '15 at 06:21
  • What if I ask: "Is it true that $$ A \operatorname{singular} \Rightarrow \exists B: AB-BA=A?$$" – Ivan Di Liberti Jun 06 '15 at 06:44
  • 2
    You cannot make substantial edits to your question that change its meaning altogether and make answers from others obsolete. Please ask a new question instead. – Gabriel Romon Jun 06 '15 at 06:45
  • See also http://math.stackexchange.com/questions/284901/ab-ba-i-having-no-solutions and http://math.stackexchange.com/questions/99175/solutions-to-the-matrix-equation-mathbfab-ba-i-over-general-fields From the linked questions you can see that the problem is a bit different, depending on whether you work over $\mathbb R$ or over an arbitrary field. You should specify in your question whether you are interested only in real/complex matrices, or whether you want to discuss also more general case. – Martin Sleziak Jun 06 '15 at 08:01
  • Meta: question edited to change meaning. However, situation here is a bit more complicated. There are already several answers (and comments) about both versions of the question. – Martin Sleziak Jun 06 '15 at 08:19

5 Answers

7

Sure, your solution makes sense to me. In fact you can argue the same thing just from the trace: if $A$ is nonsingular then $B+I = ABA^{-1}$, and taking the trace of both sides gives $\mathrm{tr}(B)+n = \mathrm{tr}(B)$, which is impossible.
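(Not part of the original answer: a minimal NumPy sanity check of the two trace facts used above, namely that conjugation preserves the trace while adding $I$ shifts it by $n$.)

```python
import numpy as np

n = 4
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))   # generically invertible
B = rng.standard_normal((n, n))

# Conjugation preserves the trace: tr(A B A^{-1}) = tr(B) ...
print(np.isclose(np.trace(A @ B @ np.linalg.inv(A)), np.trace(B)))   # True
# ... while adding the identity shifts it by n: tr(B + I) = tr(B) + n
print(np.isclose(np.trace(B + np.eye(n)), np.trace(B) + n))          # True
```

So $\mathrm{tr}(B)+n=\mathrm{tr}(B)$ can never hold, which is the contradiction.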

user7530
  • 49,280
5

If $AB-BA = I$, then $\operatorname{trace}(AB-BA) = \operatorname{trace}(AB)-\operatorname{trace}(BA) = \operatorname{trace}(AB)-\operatorname{trace}(AB)=0$; yet $\operatorname{trace}(I) = n$ for an $n \times n$ matrix. In short, it is not possible to have square matrices $A,B$ for which the commutator $[A,B] = AB-BA = I$. So, I'm not sure what the question means...
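(A small illustration, not from the original answer: numerically, $\operatorname{trace}(AB-BA)$ vanishes for random matrices, while $\operatorname{trace}(I)=n$.)

```python
import numpy as np

def commutator_trace(n: int, seed: int = 0) -> float:
    """Return trace(AB - BA) for random n x n matrices A and B."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((n, n))
    B = rng.standard_normal((n, n))
    return float(np.trace(A @ B - B @ A))

print(commutator_trace(5))          # ~0 up to floating-point rounding
print(float(np.trace(np.eye(5))))   # 5.0, so AB - BA = I is impossible
```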

James S. Cook
  • 16,755
3

For two $n\times n$ matrices $A,B$ we cannot have $$ AB-BA = I, $$ since $$ \text{Tr}(AB) = \text{Tr}(BA) $$ and hence $\text{Tr}(AB-BA)=0$, while $\text{Tr}(I)=n$ (the size of the matrix).

Jolien
  • 1,645
3

In finite dimension the trace of a commutator is always zero.

$$ \operatorname{Tr}(AB -BA) = \operatorname{Tr}(AB) - \operatorname{Tr}(BA) = \sum_{i,j=1}^{n} A_{ij}B_{ji} - \sum_{j,i=1}^n B_{ji}A_{ij} = 0. $$ In an infinite-dimensional space this may not be the case: for example, consider the space of smooth square-integrable functions $f(x)$ on $\mathbb{R}$, $L_{2}(\mathbb{R})$, and the linear operators (multiplication by) $x$ and (differentiation) $\partial =\frac{\partial}{\partial x}$; then

$$ [\partial, x] = \mathrm{id}, $$ which clearly does not have zero trace.
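For concreteness, here is the one-line computation behind that identity: for any smooth $f$,
$$ (\partial x - x\partial)f = \frac{d}{dx}\big(x f(x)\big) - x f'(x) = f(x) + x f'(x) - x f'(x) = f(x), $$
so $[\partial, x] = \mathrm{id}$.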

2

I see that the question was further modified. All this is unbearable.

About the equality $[A,B]=A$.

Case 1. The dimension is finite; $A,B\in M_n(K)$. By Jacobson's lemma, $[A,B]$ (and hence $A$) is nilpotent. Moreover, the previous result remains true when $K$ is a commutative ring with unity, provided that $n!$ is not a zero-divisor.

Case 2. The dimension of $E$ is infinite and $f,g\in L(E)$ are continuous. The relation $fg-gf=f$ implies that, for every integer $k$, $f^kg-gf^k=kf^k$. Then $k\|f^k\|\leq 2\|g\|\,\|f^k\|$. Assume that $f^k\not=0$ for every $k$; then $k\leq 2\|g\|$ for every $k$, which is a contradiction. Hence $f$ is nilpotent.
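For completeness, the identity $f^kg-gf^k=kf^k$ follows by induction on $k$: if it holds for $k$, then
$$ f^{k+1}g-gf^{k+1} = f\,(f^kg-gf^k) + (fg-gf)\,f^k = k\,f^{k+1} + f^{k+1} = (k+1)f^{k+1}. $$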