
I have been going over different characterizations of when a matrix's characteristic polynomial equals its minimal polynomial, and I came across a result that I was not aware of and that seems a little strange at first glance. After trying a couple of computations I am stuck on how to prove the following.

Let $M$ be an $n \times n$ matrix over a field $F$. Suppose that for every vector $a = {}^t(a_1, \ldots, a_n)$ there exist column vectors $b, c$ such that $a_k = {}^t b \, M^k c$ for each $1 \leq k \leq n$.

How do we show the characteristic polynomial of $M$ is equal to its minimal polynomial?
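To fix notation: ${}^t b \, M^k c$ here is the scalar obtained by sandwiching $M^k$ between a row vector and a column vector,
$$
{}^t b \, M^k c \;=\; \sum_{i=1}^{n} \sum_{j=1}^{n} b_i \, (M^k)_{ij} \, c_j \;\in\; F,
$$
so the hypothesis asks that every prescribed list of $n$ scalars be realized by a single pair $(b, c)$ as $k$ varies.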

user7980
  • I think what you mean is: "Suppose that for every vector $a = {}^t(a_1, \ldots, a_n)$ there exist column vectors $b_k, c_k$ such that $a_k = {}^t b_k (M) c_k$ for each $1 \leq k \leq n$." – Paul Nov 17 '11 at 11:46
  • @Paul: no, that wouldn't make sense, the right-hand side being scalar · matrix · scalar while the left-hand side is a scalar. – Marc van Leeuwen Nov 17 '11 at 12:54

1 Answer


I think $k$ should range from $0$ to $n-1$ (everywhere), rather than from $1$ to $n$. I'll suppose that for now and come back to the original question at the end. By the Cayley–Hamilton theorem, the minimal polynomial equals the characteristic polynomial if and only if it has degree $n$ (assuming the characteristic polynomial is taken to be monic). This in turn is equivalent to $M^0, \ldots, M^{n-1}$ being linearly independent in the vector space of matrices, since that independence is exactly what prevents the minimal polynomial from having degree less than $n$.
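Spelled out (this display is just my rewording of the step above): if the minimal polynomial $\mu(x) = x^d + e_{d-1} x^{d-1} + \cdots + e_0$ had degree $d < n$, then
$$
M^d \;=\; -\bigl(e_{d-1} M^{d-1} + \cdots + e_1 M + e_0 M^0\bigr)
$$
would be a nontrivial linear relation among $M^0, \ldots, M^{n-1}$; conversely, any nontrivial relation $\sum_{k=0}^{n-1} d_k M^k = 0$ gives a nonzero polynomial of degree less than $n$ annihilating $M$, forcing $\deg \mu < n$.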

Now apply the hypothesis with $a$ a standard basis vector: all entries zero except one, say $a_i = 1$, and let $b, c$ be vectors valid for this $a$. Write a hypothetical linear dependency $\mathbf 0 = d_0 M^0 + \cdots + d_{n-1} M^{n-1}$ between the matrices, and sandwich both sides between $^tb$ on the left and $c$ on the right. On the right-hand side all terms vanish except the one coming from $d_i M^i$, which becomes $d_i$. Thus that coefficient must be $0$, and since $i$ was arbitrary the same goes for all the other coefficients, so the linear combination is trivial. Whence the independence of the first $n$ powers of $M$.
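Explicitly, the sandwich computation reads
$$
0 \;=\; {}^t b \Bigl(\sum_{k=0}^{n-1} d_k M^k\Bigr) c
\;=\; \sum_{k=0}^{n-1} d_k \bigl({}^t b \, M^k c\bigr)
\;=\; \sum_{k=0}^{n-1} d_k \, a_k
\;=\; d_i,
$$
since ${}^t b \, M^k c = a_k$ equals $1$ for $k = i$ and $0$ otherwise.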

Finally, with the hypothesis given for $k$ running from $1$ to $n$, the same argument shows that $M^1$, ... , $M^n$ are linearly independent. This is not equivalent to the independence of the first $n$ powers (think of a nilpotent matrix of nilpotency index $n$), but it is stronger: if $M^0$, ... , $M^{n-1}$ were linearly dependent, then multiplying the dependency by $M$ would show that $M^1$, ... , $M^n$ are also linearly dependent.
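For a concrete instance of the parenthetical remark (a standard example, not from the question): take $N$ the $n \times n$ nilpotent Jordan block
$$
N = \begin{pmatrix} 0 & 1 & & \\ & 0 & \ddots & \\ & & \ddots & 1 \\ & & & 0 \end{pmatrix},
\qquad N^n = 0, \quad N^{n-1} \neq 0 .
$$
Here $N^0, \ldots, N^{n-1}$ are linearly independent (its minimal polynomial is $x^n$), yet the list $N^1, \ldots, N^n$ contains the zero matrix and is therefore dependent. The multiplication step at the end is just: from $\sum_{k=0}^{n-1} d_k M^k = 0$ with some $d_k \neq 0$, multiplying by $M$ gives $\sum_{k=0}^{n-1} d_k M^{k+1} = 0$, a nontrivial dependency among $M^1, \ldots, M^n$.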