I want to prove that if $A$ is a nilpotent matrix then $tr(A)=0$. I have seen some answers on MSE about this question, but none of them were clear enough for me to understand. I would appreciate it if someone explained the proof in mathematical notation rather than giving a general explanation of it.
-
http://math.stackexchange.com/questions/617500/direct-proof-that-nilpotent-matrix-has-zero-trace – darij grinberg Dec 31 '16 at 19:02
-
Also, http://math.stackexchange.com/questions/1220470/trace-of-a-nilpotent-matrix-is-zero – darij grinberg Dec 31 '16 at 19:03
-
@darijgrinberg None of the answers you posted are explaining the proof in mathematical notation. In addition, I had read all of the answers regarding the proof of this question before on MSE. I found them pretty ambiguous. – FreeMind Dec 31 '16 at 19:05
-
Try asking a more specific question then. What is "mathematical notation" to you? I haven't seen a fully formalized proof so far, although I wouldn't be surprised to find one in the MathComp (ssreflect) library for Coq. Meanwhile, you can find yet another proof (found by Peter Scholze, posted by me) at http://artofproblemsolving.com/community/c7h233169_nilpotent_matrices_have_nilpotent_traces ; this one is less elementary, but avoids arguments such as a recursively constructed basis, which might make it easier to formalize. – darij grinberg Dec 31 '16 at 19:10
-
@FreeMind I am not sure what you mean by "mathematical notation" in this context. The proof which goes via analysing the kernels of the powers of $A$ relies on the fact that you can extend the basis of a subspace of a vector space to a basis of the whole space (as can be done with any linearly independent set). It isn't unmathematical in any way. – Mark Bennet Dec 31 '16 at 19:11
-
By the way, the proof via eigenvalues requires the existence of an algebraic closure, which is a big assumption to make. – Mark Bennet Dec 31 '16 at 19:18
-
@MarkBennet Why so? As the only eigenvalue of a nilpotent matrix is zero I can't see why would we want to mess with existence of algebraic closures...or with any field extension of the given field, for that matter. – DonAntonio Dec 31 '16 at 21:05
-
@DonAntonio Mine was a tired comment, but some proposed solutions assume the existence of an eigenvalue before proving it to be equal to zero. It is difficult to understand exactly what the problem is here, but I think that assuming the existence of eigenvalues is not necessary. I noted also after making my comment that the question does not specify a matrix over a field. – Mark Bennet Dec 31 '16 at 21:39
-
@mark, in the case of nilpotent maps, there is no need to assume that there are eigenvalues: zero is an eigenvalue! – Mariano Suárez-Álvarez Dec 31 '16 at 22:05
-
@MarianoSuárez-Álvarez I think, in the context of this question, that might be one of the things which needs proof/explanation?? – Mark Bennet Dec 31 '16 at 22:20
-
@Mark Very easy to prove. Let $v$ be a vector such that $A^{n-1}v$ is not zero. Then $A^{n-1}v$ is an eigenvector with eigenvalue zero. – Matt Samuel Dec 31 '16 at 22:29
-
Delete voters beware: This question has a well-written answer by @user1131274. – darij grinberg Feb 09 '19 at 22:12
4 Answers
In general, the trace of a matrix is the sum of its eigenvalues (counted with multiplicity) over the algebraic closure. Suppose $\lambda$ is an eigenvalue of $A$ and let $v\neq 0$ be such that $$Av=\lambda v.$$ Suppose $A^n=0$ for some $n$. Then $$A^nv=\lambda^nv=0.$$ Since $v\neq 0$, we have $\lambda^n=0$, hence $\lambda=0$. Thus all eigenvalues of $A$ are zero, and hence the trace is zero.
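As a concrete illustration (my own example, not part of the original answer), take $$A=\begin{pmatrix}2&4\\-1&-2\end{pmatrix},\qquad A^2=0.$$ Its characteristic polynomial is $\det(\lambda I-A)=(\lambda-2)(\lambda+2)+4=\lambda^2$, so the only eigenvalue (even over an algebraic closure) is $0$, and indeed $tr(A)=2+(-2)=0$.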

- 58,164
If $A\in\mathbb{K}^{n\times n}$ is nilpotent, then its characteristic polynomial is $\chi_A (\lambda)=(-1)^n\lambda^n$, and in general, for any $M\in\mathbb{K}^{n\times n}$, $\chi_M (\lambda)=(-1)^n\lambda^n+(-1)^{n-1}(\text{trace }M)\lambda^{n-1}+\cdots+\det M$, so $\text{trace }A=0$.
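To fill in the step questioned in the comments below (this sketch is mine, not the answerer's): over an algebraic closure of $\mathbb{K}$, every root $\lambda$ of $\chi_A$ is an eigenvalue, so there is a $v\neq 0$ with $Av=\lambda v$; if $A^k=0$, then $0=A^kv=\lambda^kv$, so $\lambda=0$. Since $\chi_A$ splits over the algebraic closure, has degree $n$ and leading coefficient $(-1)^n$, and all of its roots are $0$, it must be $$\chi_A(\lambda)=(-1)^n\lambda^n.$$ Comparing the coefficient of $\lambda^{n-1}$ with the general expansion above gives $(-1)^{n-1}\,\text{trace }A=0$, i.e. $\text{trace }A=0$.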

- 3,537
-
How do you know that that is the characteristic polynomial? – Mariano Suárez-Álvarez Dec 31 '16 at 19:32
-
Well, the result you are trying to prove is also a well known property. That sort of explanation is not very helpful to anyone, is it? – Mariano Suárez-Álvarez Jan 01 '17 at 02:08
-
That the characteristic polynomial of a nilpotent matrix is a monomial is a much stronger result than that its trace is zero. – Mariano Suárez-Álvarez Jan 01 '17 at 02:11
-
That is not an essential question. If a student has the results I've considered in his/her list of known theorems, it is reasonable that he/she could use them. Another thing is if you consider that my approach is not sufficiently self-contained. If this is the case, well, no problem. – Fernando Revilla Jan 01 '17 at 02:27
-
My point is that it is not trivial to show that the characteristic polynomial of a nilpotent matrix has that form. Your answer would be more useful if it made explicit what reasoning leads to that conclusion, even if only sketchily, or at the very least tell the reader that you are skipping essentially all of the argument. It is your prerogative, of course, to write an answer that even possibly confounds the reader. – Mariano Suárez-Álvarez Jan 01 '17 at 04:26
-
I will try to give some explanation of the proofs given earlier. First of all, note that the trace is cyclic, which means $tr(ABC)=tr(CAB)=tr(BCA)$; this gives us the following equality: $tr(P^{-1}AP)=tr(A)$. So the trace is invariant under a change of basis.

Now, as $A$ is nilpotent, there exists some positive integer $k$ such that $A^k=0$. Because $A^k=0$, the kernel of $A^k$ is the whole space. Also note that if $v \in \ker(A^j)$ then $v\in \ker (A^{j+1})$, since $A^{j+1}v=AA^jv=0$. So we have the following chain: $$\{0\}=\ker A^0\subseteq\ker A^1\subseteq\ker A^2\subseteq\cdots \subseteq \ker A^k=V.$$

Choosing a basis of $\ker A^1$, then extending it to a basis of the next space $\ker A^2$, and so on, eventually gives a basis of the whole space, because $\ker A^k=V$. By construction, expressing $A$ in this new basis yields a strictly upper triangular matrix. To see this, note that for $v\in \ker A^j$ we have $A^jv=0 \rightarrow A^{j-1}Av=0 \rightarrow Av \in \ker A^{j-1}$. So for any $v \in \cup_{i=1}^j \ker A^i$, we have $Av \in \cup_{i=1}^{j-1} \ker A^i$, and hence $A$ is a strictly upper triangular matrix in our constructed basis. In this new basis the trace is $0$, and because the trace is invariant under a change of basis, $tr(A)=0$.
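To make the construction concrete (this worked example is mine, not part of the original answer), take $$A=\begin{pmatrix}1&1\\-1&-1\end{pmatrix},\qquad A^2=0.$$ Here $\ker A^1=\operatorname{span}\{v_1\}$ with $v_1=(1,-1)^T$, and extending by $v_2=(1,0)^T$ gives a basis of $\ker A^2=V$. Since $Av_1=0$ and $Av_2=(1,-1)^T=v_1$, the matrix of $A$ in the basis $(v_1,v_2)$ is $$\begin{pmatrix}0&1\\0&0\end{pmatrix},$$ which is strictly upper triangular, and its trace is $0=tr(A)$.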

- 1,678
-
This is the same as the answer to the other question. Why repost it? – Matt Samuel Dec 31 '16 at 19:30
-
This is essentially the construction of the Jordan canonical form for nilpotent maps... Going through that to prove what the OP wants is quite a detour. – Mariano Suárez-Álvarez Dec 31 '16 at 19:34
-
@eric, sure! You can observe that the eigenvalues of a nilpotent matrix are all zero and are the only roots of the characteristic polynomial, or you can compute the minimal polynomial and use the fact that every root of the characteristic polynomial is a root of the minimal one, and other ways. Going all the way to the Jordan normal form is too much! – Mariano Suárez-Álvarez Dec 31 '16 at 22:00
-
Another thing you can do is (over an algebraically closed field) to show that every matrix is similar to one which is upper triangular — which is much easier than Jordan (proceed by induction; if f is an endomorphism of V, pick an eigenvector for the transpose of f: its kernel is a codimension one invariant subspace and lets you use the induction hypothesis) – Mariano Suárez-Álvarez Dec 31 '16 at 22:01
Let $\lambda$ be an eigenvalue of $A$, and choose an associated eigenvector $v$. First, we show $A^kv = \lambda^kv$ for all $k\ge 1$: this is true for $k=1$, and then $A^kv = A^{k-1}(Av) = \lambda A^{k-1}v = \lambda^kv$ by induction. Now, assume our nilpotent matrix has some eigenvalue $\lambda \ne 0$. We know that $A^m = 0$ for some $m$, but then $\lambda^m \ne 0$ would be an eigenvalue of the zero matrix, which is nonsense. Thus, all eigenvalues of $A$ are $0$, and so is their sum, which is the trace.
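A note on the final step (this justification is mine, not the answerer's): the fact that the trace is the sum of the eigenvalues comes from the characteristic polynomial over an algebraic closure, $$\det(\lambda I - A)=\prod_{i=1}^{n}(\lambda-\lambda_i)=\lambda^n-\Big(\sum_{i=1}^{n}\lambda_i\Big)\lambda^{n-1}+\cdots.$$ Expanding $\det(\lambda I - A)$ directly shows that the coefficient of $\lambda^{n-1}$ is $-tr(A)$, so $tr(A)=\sum_{i}\lambda_i$, which is $0$ here.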

- 1,732