Let $\mathcal{A}$ be a linear operator on a linear space $V$ of dimension $n$. Suppose that $\mathcal{I}, \mathcal{A}, \ldots, \mathcal{A}^{n-1}$ are linearly independent. How does one prove that there exists $v \in V$ such that $$V = \langle v, \mathcal{A}v, \ldots, \mathcal{A}^{n-1}v \rangle?$$ (Please prove it without the Jordan normal form.)
1 Answer
The linear independence of $\{I, A, \ldots, A^{n-1}\}$ is equivalent to the fact that the minimal polynomial $m_A$ has degree $n$.
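This equivalence can be sanity-checked numerically: flatten $I, A, \ldots, A^{n-1}$ into vectors and compute the rank of the resulting stack. A minimal sketch in Python (the test matrix is my own illustrative example, not from the question):

```python
import numpy as np

# Example operator with minimal polynomial (x-2)^2 (x-3)(x-5), of degree 4 = n.
A = np.array([[2., 1., 0., 0.],
              [0., 2., 0., 0.],
              [0., 0., 3., 0.],
              [0., 0., 0., 5.]])
n = A.shape[0]

# Stack vec(I), vec(A), ..., vec(A^{n-1}) as rows of an n x n^2 matrix;
# the powers are linearly independent iff this matrix has rank n.
powers = np.stack([np.linalg.matrix_power(A, j).ravel() for j in range(n)])
print(np.linalg.matrix_rank(powers))  # 4, so deg m_A = n
```

A polynomial relation $c_0 I + c_1 A + \cdots + c_{n-1}A^{n-1} = 0$ is exactly a linear dependence among these rows, which is why the rank test detects whether $\deg m_A < n$.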
Assuming the base field is algebraically closed (see the comments below for the general case), write $m_A(x) = (x - \lambda_1)^{p_1}\cdots (x - \lambda_k)^{p_k}$ with the $\lambda_i$ distinct and $\sum_{i=1}^k p_i = n$, and consider the generalized eigenspace decomposition
$$V = \ker (A-\lambda_1I)^{p_1} \dot+ \cdots \dot+ \ker (A-\lambda_kI)^{p_k}$$
Define $v = v_1 + \cdots + v_k$ with $v_i \in \ker (A-\lambda_iI)^{p_i} \setminus \ker (A-\lambda_iI)^{p_{i}-1}$. Note that $\{v_1, \ldots, v_k\}$ is linearly independent.
For any polynomial $q \ne 0$ of degree $\le n-1$ there exists $j \in \{1, \ldots, k\}$ such that $(x-\lambda_j)^{p_j}$ does not divide $q$ (otherwise we would have $m_A \mid q$ so $\deg q \ge n$).
Since each generalized eigenspace is $A$-invariant, we have $q(A)v_i \in \ker (A - \lambda_i I)^{p_i}$, so the vectors $q(A)v_1, \ldots, q(A)v_k$ lie in distinct summands of the direct sum above.
Therefore, if we assume $q(A)v = 0$, we get
$$0 = q(A)v = \sum_{i=1}^k q(A)v_i \implies q(A)v_i = 0, \quad \forall i = 1, \ldots, k,$$
because the only decomposition of $0$ in a direct sum has all components zero.
In particular $q(A)v_j = 0$. Write $q(x) = (x - \lambda_j)^{r_j}g(x)$ with $g(\lambda_j) \ne 0$; by the choice of $j$ we have $r_j < p_j$.
We have $(A - \lambda_j I)^{r_j}v_j \ne 0$ and $(A - \lambda_j I)^{r_j}v_j \in \ker(A - \lambda_j I)^{p_j}$.
If $\mu \ne \lambda_j$ and $(A - \mu I)x = 0$ for some $x \in \ker(A - \lambda_j I)^{p_j}$, then necessarily $x = 0$, because $\ker(A - \lambda_j I)^{p_j} \cap \ker (A - \mu I) = \{0\}$. Every linear factor $A - \mu I$ of $g(A)$ has $\mu \ne \lambda_j$ since $g(\lambda_j) \ne 0$, so no such factor can send a nonzero vector of $\ker(A - \lambda_j I)^{p_j}$ to zero; and after each application the result remains in $\ker(A - \lambda_j I)^{p_j}$ by invariance. Applying the factors of $g(A)$ one at a time to $(A - \lambda_j I)^{r_j}v_j$, we conclude inductively that $g(A)(A - \lambda_j I)^{r_j}v_j \ne 0$.
Hence $$q(A)v_j = g(A) (A - \lambda_j I)^{r_j}v_j \ne 0$$ which is a contradiction.
Hence for every nonzero polynomial $q$ of degree $\le n-1$ we have $q(A)v \ne 0$. Therefore, the set $\{v, Av, \ldots, A^{n-1}v\}$ is linearly independent, hence a basis for $V$.
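As a concrete sanity check of the construction (a sketch with an illustrative matrix of my own choosing, not part of the answer): take $n = 4$ and minimal polynomial $(x-2)^2(x-3)(x-5)$. Picking each $v_i$ in $\ker(A-\lambda_iI)^{p_i} \setminus \ker(A-\lambda_iI)^{p_i - 1}$ gives a Krylov matrix $[\,v \mid Av \mid A^2v \mid A^3v\,]$ of full rank, while choosing $v_1$ one level too shallow (in $\ker(A-2I)$ rather than outside $\ker(A-2I)^{1}$) drops the rank:

```python
import numpy as np

# Generalized eigenspaces: span{e1,e2} for lambda=2 (p=2),
# span{e3} for lambda=3, span{e4} for lambda=5; minimal polynomial degree 4.
A = np.array([[2., 1., 0., 0.],
              [0., 2., 0., 0.],
              [0., 0., 3., 0.],
              [0., 0., 0., 5.]])
n = A.shape[0]

def krylov(A, v, n):
    """Matrix with columns v, Av, ..., A^{n-1} v."""
    return np.column_stack([np.linalg.matrix_power(A, j) @ v for j in range(n)])

e1, e2, e3, e4 = np.eye(n)
v_good = e2 + e3 + e4   # e2 lies in ker(A-2I)^2 but not in ker(A-2I)
v_bad  = e1 + e3 + e4   # e1 is already killed by (A-2I): annihilated by a degree-3 polynomial

print(np.linalg.matrix_rank(krylov(A, v_good, n)))  # 4: cyclic vector
print(np.linalg.matrix_rank(krylov(A, v_bad, n)))   # 3: not cyclic
```

The failure mode matches the proof: $v_{\text{bad}}$ is annihilated by $(x-2)(x-3)(x-5)$, a nonzero polynomial of degree $3 \le n-1$, so its Krylov vectors cannot span $V$.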

- Aha, I think you assume that the field is algebraically closed here. But what about a general field $k$, i.e., $V$ a linear space over $k$? I don't even know whether the result is valid for a general field $k$ or not. – SWalker May 24 '18 at 10:39
- @SWalker I believe the statement holds in general. Namely, have a look at this answer. It says that a matrix $A$ possesses a cyclic vector (a vector $v$ such that $\{v, Av, \ldots, A^{n-1}v\}$ is a basis) if and only if the degree of the minimal polynomial is equal to $n = \dim V$, which is equivalent to linear independence of $\{I, A, \ldots, A^{n-1}\}$. However, I don't know the proof, and the source of the claim is the French Wikipedia, also without proof. – mechanodroid May 24 '18 at 21:17
- Now I'm sure that the result is also valid in general, by considering the Frobenius normal form (or rational canonical form). And we can obtain a direct proof by modifying your method. – SWalker May 25 '18 at 07:27