
I wish to prove the following: if $T:X\to X$ is nilpotent (and the linear space $X$ is finite-dimensional), then there exists a basis of $X$ such that the matrix representation of $T$ is upper triangular with zero diagonal entries.

I was trying to prove that $\{x_1,\dots,x_{k+1},v, Tv, \dots, T^{n-k-2}v\}$ is a basis for $X$, where $x_1,\dots,x_{k+1}$ form a basis for $N(T)$ and $v, Tv, \dots, T^{n-k-2}v$ form a basis for $R(T)$, and where $T^{n-k}=0$ with $n-k$ minimal. However, the matrix representation with respect to this basis comes out diagonal. Is there an easy way to prove this?

I know that for all $T:X\rightarrow X$, there exists a basis with respect to which the matrix representation is upper triangular. How is this useful here?

nagnag
  • You're trying to re-derive the Jordan Canonical Form. If you know this theorem, just apply it to get one or more Jordan blocks with eigenvalue $0$. Each such Jordan block is the representation with respect to a vector $x$ and its non-zero iterates under $T$: $x \overset{T}{\mapsto} Tx \overset{T}{\mapsto} T^{2}x \overset{T}{\mapsto} \cdots \overset{T}{\mapsto} 0$. – Disintegrating By Parts Nov 03 '14 at 06:23
  • I know I can use the Jordan form, but how do you show that $J$ has $0$ on the diagonal, in other words, that a nilpotent matrix has only the zero eigenvalue? – nagnag Nov 03 '14 at 06:38
  • If $Tx=\lambda x$, then apply $T$ enough times that $T^{n}x=0$ but $T^{n-1}x\ne 0$. Then $\lambda T^{n-1}x=0$ which gives $\lambda = 0$. Alternatively, $p(T)=0$ where $p(\lambda)=\lambda^{n}$ for some positive power $n$. So the minimal polynomial for $T$ divides $\lambda^{n}$, and the only root is $\lambda=0$. – Disintegrating By Parts Nov 03 '14 at 06:42
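A quick numerical sanity check of the eigenvalue argument in the comments (a minimal NumPy sketch; the particular matrix is my own illustrative choice, not from the thread):

```python
import numpy as np

# An arbitrary nilpotent matrix: the 3x3 shift, chosen only to illustrate
# the comment that a nilpotent operator has 0 as its only eigenvalue.
T = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [0., 0., 0.]])

print(np.linalg.matrix_power(T, 3))   # the zero matrix, so T is nilpotent
print(np.linalg.eigvals(T))           # [0. 0. 0.], every eigenvalue is 0
```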

3 Answers


First of all, it is not necessarily the case that the elements $T^j v$ for $j=0,\cdots, n-k-1$ span the range of $T$. You may very well need many $v_i$, but let's just say that one $v$ is enough. Then you need to prove that these elements are linearly independent (where $v$ here is an element such that $T^{n-k-1}v\neq 0$). This is not hard.
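For completeness, one standard way to fill in that step (my own spelling-out, not part of the original answer): suppose
$$c_0 v + c_1 Tv + \cdots + c_{n-k-1}T^{n-k-1}v = 0.$$
Applying $T^{n-k-1}$ kills every term with $j\ge 1$ (because $T^{n-k}=0$) and leaves $c_0 T^{n-k-1}v=0$, so $c_0=0$; then applying $T^{n-k-2}$ gives $c_1=0$, and so on, so all the coefficients vanish.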

Once you do this, you are done, because you have a basis for $X$ (along with the kernel of $T$, as you remark) and all you need to do is form the matrix of $T$ with respect to the basis $\{x_1,\cdots, x_{k+1},v, Tv, \cdots, T^{n-k-1}v\}$.

For instance, this matrix is (say $k=0$ and $n=3$) $$\begin{bmatrix} 0 & 1 & 0\\ 0 & 0 & 1\\ 0 & 0 & 0\end{bmatrix}$$ just as you wanted.

I think you mistakenly believe that the matrix you are getting is

$$\begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & 0\end{bmatrix}$$ or something similar. This is not correct; if this were true, $T$ would carry $v$ to $v$, $Tv$ to $Tv$, etc. It does not: it shifts the basis elements. Try it yourself with $v=(0,0,1)=e_3$ and $T(e_i)=e_{i-1}$ (with $Te_1=0$) to get a feel for the form.
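Here is a small NumPy sketch of that "try it yourself" computation (the particular $v$ is my own choice, and the basis is ordered $(T^2v, Tv, v)$, the ordering that produces the strictly upper triangular shift from the example earlier in this answer):

```python
import numpy as np

# The suggested map in the standard basis: T e_1 = 0, T e_2 = e_1, T e_3 = e_2.
T = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [0., 0., 0.]])

# Any v with T^2 v != 0 works; v = e_1 + e_3 is an arbitrary choice.
v = np.array([1., 0., 1.])
P = np.column_stack([T @ T @ v, T @ v, v])   # ordered basis (T^2 v, T v, v)

# Matrix of T with respect to that ordered basis: P^{-1} T P.
M = np.linalg.inv(P) @ T @ P
print(np.round(M, 10))
# [[0. 1. 0.]
#  [0. 0. 1.]
#  [0. 0. 0.]]   <- strictly upper triangular, as in the answer
```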

In general you need to consider all the $v_i$ that generate so-called cyclic subspaces of $X$ under $T$, take them all together with the kernel, and glue the transformations together in a big direct sum. Here is a set of notes I just googled which contains the details: http://www.mth.msu.edu/~shapiro/pubvit/Downloads/CycNilp/CycNilp.pdf

guest
  • How do you obtain the matrix $\begin{pmatrix} 0 & 1 & 0\\ 0 & 0 & 1\\ 0 & 0 & 0\end{pmatrix}$ above? If $\{x_1, v, Tv\}$ is a basis, so $T^3=0$, we have: $$T(x_1)=0 x_1+0v+0Tv, \quad T(v)=0x_1+ 0v+1Tv, \quad T(Tv)=T^2v\in N(T),$$ so your matrix looks something like this: $\begin{pmatrix} 0 & 0 & X\\ 0 & 0 & 0\\ 0 & 1 & 0\end{pmatrix}$. Sorry for any confusion. The matrix you gave is what I think the basis should give me – nagnag Nov 03 '14 at 15:17
  • Jordan decomposition of $A=\left( \begin{array}{ccc} 1 & -\frac{1}{2 x} & 0 \\ x & 0 & y \\ 0 & -\frac{1}{2 y} & -1 \end{array} \right)$ gives $\left( \begin{array}{ccc} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{array} \right)$. Notice that $A$ is nilpotent of degree 3 for any non-zero $x$ and $y$. – Dr. Wolfgang Hintze Feb 20 '19 at 12:53

If an upper triangular matrix is nilpotent, then its diagonal is zero and hence it is strictly upper triangular. Indeed, if $A$ and $B$ are upper triangular, then $C := A\cdot B$ is also upper triangular and the diagonal of $C$ is the component-wise product of the diagonals of $A$ and $B$. In particular the diagonal of $T^m$ consists of the $m$-th powers of the diagonal entries of $T$, so $T^m=0$ forces every diagonal entry of $T$ to be $0$.
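A quick NumPy check of the two facts used above (random upper triangular matrices of my own choosing; this only illustrates the claim, it is not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two random upper triangular matrices.
A = np.triu(rng.standard_normal((4, 4)))
B = np.triu(rng.standard_normal((4, 4)))
C = A @ B

# The product is again upper triangular ...
assert np.allclose(C, np.triu(C))
# ... and its diagonal is the entrywise product of the two diagonals.
assert np.allclose(np.diag(C), np.diag(A) * np.diag(B))

# Hence diag(T^m) = diag(T)**m; if T^m = 0, every diagonal entry of T is 0.
```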

orangeskid
  • Thanks, but this is not what I am trying to prove – nagnag Nov 03 '14 at 02:58
  • @amathnerd: You have an answer for your question: "I know that for all $T:X\to X$, there exists a basis with respect to which the matrix representation is upper triangular. How is this useful?"... or just remove it if that is not part of your question anymore ... and I will remove it too; let me know – orangeskid Nov 03 '14 at 04:30

The existence of a basis with respect to which $T$ is strictly upper triangular is equivalent to the existence of a flag $X = X_0 > X_1 > \cdots > X_n = 0$ with $T(X_i) \subset X_{i+1}$. To show the latter, note that any non-zero subspace $V$ with $T(V)\subset V$ must have $\ker (T\vert_V) \neq 0$, and thus $T(V) < V$; in particular the chain $X > T(X) > T^2(X) > \cdots > 0$ is strictly decreasing and can be refined to such a flag.
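To make the flag concrete, here is a NumPy sketch (my own construction, under the assumption that floating-point rank and null-space computations suffice for a small example). It builds a basis adapted to the increasing kernel flag $\ker T \le \ker T^2 \le \cdots$, which is the same kind of flag read in the opposite direction, and checks that the resulting matrix is strictly upper triangular:

```python
import numpy as np

def nullspace(A, tol=1e-10):
    # Orthonormal basis of the null space of A, via the SVD.
    _, s, vt = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return vt[rank:].T                       # columns span ker A

def flag_adapted_basis(T, tol=1e-10):
    # Collect basis vectors level by level: first a basis of ker T, then
    # vectors extending it to ker T^2, and so on.  Each new vector b added at
    # level i+1 satisfies T b in ker T^i, i.e. in the span of earlier vectors.
    n = T.shape[0]
    cols = []
    power = np.eye(n)
    while len(cols) < n:
        power = power @ T                    # T, T^2, T^3, ...
        for c in nullspace(power, tol).T:
            candidate = cols + [c]
            if np.linalg.matrix_rank(np.column_stack(candidate), tol) == len(candidate):
                cols = candidate             # keep only vectors that enlarge the span
    return np.column_stack(cols)

# An arbitrary nilpotent example: a strictly upper triangular matrix in disguise.
S = np.array([[0., 2., 1.],
              [0., 0., 3.],
              [0., 0., 0.]])
Q = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [1., 0., 1.]])
T = Q @ S @ np.linalg.inv(Q)                 # nilpotent, but not triangular as given

P = flag_adapted_basis(T)
print(np.round(np.linalg.inv(P) @ T @ P, 8)) # strictly upper triangular, zero diagonal
```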

anomaly