
I am trying to work through problems from *Linear Algebra* by Hoffman and Kunze, and I came across this problem in the exercises of Section 3.4. I am having difficulty solving part (c).

Problem 12

Let $V$ be an $n$-dimensional vector space over the field $F$, and let $B = \{\alpha_1,\alpha_2,\alpha_3, \dots, \alpha_n\}$ be an ordered basis for $V$.

(a) There is a unique linear operator $T$ on $V$ such that:

$$ \begin{align} T(\alpha_j) = \alpha_{j+1}, \quad j=1, \dots, n-1, \qquad T(\alpha_n)=0 \end{align} $$

What is the matrix $A$ of $T$ in the ordered basis $B$?

(b) Prove that $T^n=0$, but $T^{n-1} \neq 0$.

(c) Let $S$ be any linear operator on $V$ such that $S^n=0$, but $S^{n-1} \neq 0$. Prove that there is an ordered basis $B'$ for $V$ such that the matrix of $S$ in the ordered basis $B'$ is the matrix $A$ of part (a).

(d) Prove that if $M$ and $N$ are $n \times n$ matrices over $F$ such that $M^n = N^n = 0$, but $M^{n-1} \neq 0 \neq N^{n-1}$, then $M$ and $N$ are similar.
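For concreteness, parts (a) and (b) are easy to check numerically. Here is a quick sketch using numpy, assuming (as I computed for part (a)) that $A$ is the matrix with $1$s on the subdiagonal and zeros elsewhere:

```python
import numpy as np

n = 5  # any small n works

# Matrix of T in the basis B: column j holds the B-coordinates of T(alpha_j),
# so A has 1s on the subdiagonal and a zero last column (T(alpha_n) = 0).
A = np.zeros((n, n), dtype=int)
for j in range(n - 1):
    A[j + 1, j] = 1

# Part (b): A^n = 0 but A^(n-1) != 0.
assert not np.linalg.matrix_power(A, n).any()
assert np.linalg.matrix_power(A, n - 1).any()  # a single 1 in the bottom-left corner
```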


However, a solution to part (c) can be obtained by assuming (d) is true, in the following way:

Let $[S]_B$ denote the matrix of the linear operator $S$ with respect to some basis $B$; we want to show that there is a basis $B'$ such that $[S]_{B'} = A$. Both $[S]_B$ and $A$ are $n \times n$ matrices whose $n$-th power is zero and whose $(n-1)$-th power is not, so if (d) holds they are similar: there is an invertible matrix $P$ such that $A = P^{-1}[S]_B P$. The matrix $P$ can be used to prove the existence of another basis $B'$ such that: $$ \begin{align} [\alpha]_{B} = P[\alpha]_{B'} \end{align} $$

and therefore there indeed exists a basis $B'$ such that $$ \begin{align} [S]_{B'} = P^{-1}[S]_BP = A. \end{align} $$
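The coordinate-change identity above can be checked numerically. The following sketch (numpy, with arbitrary test matrices of my own choosing) verifies that conjugating by an invertible $P$ is the same as re-expressing the operator in the basis $B'$ whose $B$-coordinates are the columns of $P$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

S_B = rng.standard_normal((n, n))  # [S]_B: the operator expressed in the basis B
P = rng.standard_normal((n, n))    # invertible with probability 1

# The columns of P are the B-coordinates of the vectors of a new basis B',
# and coordinates transform as [alpha]_B = P [alpha]_{B'}.
S_Bprime = np.linalg.inv(P) @ S_B @ P

# Check: applying S in B'-coordinates agrees with applying S in B-coordinates.
x_Bprime = rng.standard_normal(n)
x_B = P @ x_Bprime
assert np.allclose(P @ (S_Bprime @ x_Bprime), S_B @ x_B)
```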


I am also having trouble solving part (d). However, I was wondering whether one could prove (c) without using (d) explicitly? I would be thankful for any hints to solve the problem!


3 Answers


Since $S^{n-1}\neq 0$, there exists a vector $\alpha$ (necessarily non-zero) such that $S^{n-1}(\alpha)\neq 0$. Consider the vectors $\alpha, S(\alpha), S^2(\alpha), \dots, S^{n-1}(\alpha)$. They are all non-zero: if $S^k(\alpha)=0$ for some $k\leq n-1$, then $S^{n-1}(\alpha)=S^{n-1-k}(S^k(\alpha))=0$ as well.

Moreover, they are linearly independent. Suppose there are scalars $a_k\in F$ such that $\sum_{k=0}^{n-1}a_kS^k(\alpha)=0$, where $S^0=I$. Applying $S^{n-1}$ gives $\sum_{k=0}^{n-1}a_kS^{k+n-1}(\alpha)=0$; all the terms with $k+n-1\geq n$ vanish, so we are left with $a_0S^{n-1}(\alpha)=0$. Since $S^{n-1}(\alpha)\neq 0$, we must have $a_0=0$. This leaves $\sum_{k=1}^{n-1}a_kS^k(\alpha)=0$; applying $S^{n-2}$ gives $a_1=0$, and continuing in this way we get $a_0=a_1=\cdots=a_{n-1}=0$. Thus the vectors $\alpha, S(\alpha), \dots, S^{n-1}(\alpha)$ are linearly independent, and since $V$ is $n$-dimensional, these $n$ vectors form a basis for $V$.

Finally, writing $\alpha_k=S^k(\alpha)$, the matrix of $S$ with respect to this basis has the required form: $S(\alpha_k)=\alpha_{k+1}$ for $k=0,1,\dots,n-2$, and $S(\alpha_{n-1})=S^n(\alpha)=0$, which by part (a) is exactly the defining property of the matrix $A$.
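To illustrate the construction, here is a short numerical sketch (numpy; the strictly upper triangular test matrix and the random choice of $\alpha$ are illustrative assumptions, not part of the proof). Taking the cyclic vectors as the columns of a change-of-basis matrix $P$ turns $S$ into the matrix $A$ with $1$s on the subdiagonal:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# An illustrative S with S^n = 0 but S^(n-1) != 0: strictly upper triangular,
# so its superdiagonal entries are (almost surely) all non-zero.
S = np.triu(rng.standard_normal((n, n)), k=1)

# Pick alpha with S^(n-1) alpha != 0; a random vector works almost surely.
alpha = rng.standard_normal(n)
assert np.linalg.norm(np.linalg.matrix_power(S, n - 1) @ alpha) > 1e-12

# Columns of P are the cyclic vectors alpha, S alpha, ..., S^(n-1) alpha.
P = np.column_stack([np.linalg.matrix_power(S, k) @ alpha for k in range(n)])

# The matrix of S in this basis is A: 1s on the subdiagonal, 0s elsewhere.
print(np.round(np.linalg.inv(P) @ S @ P, 10))
```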


(c) Let $S$ be any linear operator on $V$ such that $S^n=0$, but $S^{n-1} \neq 0$. Prove that there is an ordered basis $\mathfrak{B}'$ for $V$ such that the matrix of $S$ in the ordered basis $\mathfrak{B}'$ is the matrix $A$ of part (a).

Since $S^{n-1} \neq 0$, there exists $v \in V$ such that $S^{n-1}v \neq 0$. Let $$\mathfrak{B}' = \{v, Sv, S^2v, \dots, S^{n-1}v\}.$$ If we can show that $\mathfrak{B}'$ is a basis, the definition of $\mathfrak{B}'$ makes it clear that $[S]_{\mathfrak{B}'} = A$.

Since there are $n$ vectors in $\mathfrak{B}'$ and $V$ is an $n$-dimensional vector space, it is enough to show that $\mathfrak{B}'$ is linearly independent. Suppose $$c_1v + c_2Sv + \dots + c_nS^{n-1}v = 0.$$ Apply $S^{n-1}$ to both sides. Since $S^n = 0$, all the terms except the first one vanish, and we have $c_1 S^{n-1} v = 0$, hence $c_1 = 0$ because $S^{n-1} v \neq 0$. Now we can similarly apply $S^{n-2}$ to show that $c_2 = 0$, and so on (this may be formalized by induction, if desired). We conclude that all the $c_j$ are $0$, so $\mathfrak{B}'$ is linearly independent, and hence a basis.

(d) Prove that if $M$ and $N$ are $n \times n$ matrices over $F$ such that $$M^n = N^n = 0,$$ but $$M^{n-1} \neq 0 \neq N^{n-1},$$ then $M$ and $N$ are similar.

Let $V$ be an $n$-dimensional vector space over $F$ with ordered basis $\mathfrak{B}$, and let $U$ be the linear operator on $V$ whose matrix with respect to $\mathfrak{B}$ is $M$. Then $U$ satisfies the conditions of part (c), so there exists an ordered basis for which the matrix of $U$ is $A$. Hence $M$ is similar to $A$. Similarly, $N$ is similar to $A$, and since similarity is an equivalence relation, $M$ and $N$ are similar.
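The similarity in part (d) can also be exhibited explicitly: if $P_M$ and $P_N$ are the cyclic-basis matrices from part (c) for $M$ and $N$, then $P_M^{-1} M P_M = A = P_N^{-1} N P_N$, so $Q = P_M P_N^{-1}$ conjugates $N$ to $M$. A numerical sketch (numpy, with illustrative test matrices of my own choosing):

```python
import numpy as np

def cyclic_basis(M, seed=0):
    """Columns alpha, M alpha, ..., M^(n-1) alpha for a random alpha;
    invertible almost surely when M^(n-1) != 0, by part (c)."""
    n = M.shape[0]
    alpha = np.random.default_rng(seed).standard_normal(n)
    return np.column_stack([np.linalg.matrix_power(M, k) @ alpha for k in range(n)])

rng = np.random.default_rng(3)
n = 4

# Two illustrative matrices with M^n = N^n = 0 but M^(n-1), N^(n-1) != 0.
M = np.triu(rng.standard_normal((n, n)), k=1)
N = np.tril(rng.standard_normal((n, n)), k=-1)

# Both are similar to A via their cyclic bases, so Q conjugates N to M.
P_M, P_N = cyclic_basis(M), cyclic_basis(N)
Q = P_M @ np.linalg.inv(P_N)
assert np.allclose(M, Q @ N @ np.linalg.inv(Q))
```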


We have $$\{0\} = \operatorname{Range} S^n \subseteq \operatorname{Range} S^{n-1} \subseteq \dots \subseteq \operatorname{Range} S \subseteq V,$$ and in fact $\operatorname{Range} S^{i+1} \neq \operatorname{Range} S^i$ for $i < n$: applying the Rank$-$Nullity Theorem to $S$ restricted to $\operatorname{Range} S^i$, equality would force this restriction to be injective, so that $\operatorname{Range} S^j = \operatorname{Range} S^i$ for all $j \geq i$, contradicting $\operatorname{Range} S^{n-1} \neq \{0\} = \operatorname{Range} S^n$.

Hence $$0 = \dim\operatorname{Range} S^n < \dim \operatorname{Range} S^{n-1} < \dots < \dim \operatorname{Range} S < \dim V = n$$ is a strictly increasing sequence of $n+1$ integers between $0$ and $n$. By the Pigeonhole Principle, $\dim \operatorname{Range} S^i = n - i$ for each $i$.

So for each $i = 0, 1, \dots, n-1$ (where $S^0$ is the identity operator) we may choose a vector $\alpha_i$ which is in $\operatorname{Range} S^i$ but not in $\operatorname{Range} S^{i+1}$. These $n$ vectors are linearly independent: in any dependence relation, the term with the smallest index $i$ would have to lie in the span of the later ones, which is contained in $\operatorname{Range} S^{i+1}$, a contradiction. In particular, $\alpha_1, \dots, \alpha_{n-1}$ span $\operatorname{Range} S$, and $\operatorname{kernel} S = \operatorname{Range} S^{n-1}$ is spanned by $\alpha_{n-1}$ (both spaces are one-dimensional and $S(\operatorname{Range} S^{n-1}) = \operatorname{Range} S^n = \{0\}$).

Since $S\alpha_i \in \operatorname{Range} S^{i+1} = \operatorname{span}(\alpha_{i+1}, \dots, \alpha_{n-1})$, the matrix of $S$ in the basis $\alpha_0, \dots, \alpha_{n-1}$ is triangular with all diagonal entries zero. Refining the choice of the $\alpha_i$ (for instance taking $\alpha_i = S^i(\alpha)$ for a suitable $\alpha$, as in the other answers) and reversing the order of the basis reduces it to the following form, with all entries zero except the ones in the diagonal above the main diagonal, which are all $1$: $$\begin{bmatrix} 0 & 1 & 0 & \dots & 0 \\ 0 & 0 & 1 & \dots & 0 \\ \vdots & \vdots & \ddots & & \vdots\\ 0 & 0 & 0 & \dots & 1 \\ 0 & 0 & 0 & \dots & 0 \\ \end{bmatrix}$$
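The dimension count at the heart of this argument ($\dim \operatorname{Range} S^i = n - i$) is easy to check numerically. A sketch (numpy, with an illustrative strictly upper triangular $S$):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5

# An illustrative S with S^n = 0 but S^(n-1) != 0.
S = np.triu(rng.standard_normal((n, n)), k=1)

# rank(S^i) = dim Range S^i drops by exactly one at each step: n, n-1, ..., 1, 0.
ranks = [np.linalg.matrix_rank(np.linalg.matrix_power(S, i)) for i in range(n + 1)]
print(ranks)  # [5, 4, 3, 2, 1, 0]
```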