
I want to find the minimal polynomial (the monic polynomial of least positive degree that annihilates the matrix) of the following matrix: $$\begin{pmatrix} 0 & 1 & 1 & 0\\ -1 & 0 & 0 & 1\\ -1 & 0 & 0 & 1\\ 0 & -1 & -1 & 0 \end{pmatrix} $$

I know how to use the characteristic polynomial to find the minimal polynomial, but is there a way to get around that? It's not quite symmetric, so I can't use the trick that applies to symmetric matrices.

Thank you.

geana
  • Since all eigenvalues are $0$ the minimal polynomial is $x^k$ for some $k\le 4$. Clearly, $k>1$. Try $2, 3$. – markvs Apr 11 '22 at 20:48
  • No, all the eigenvalues are not zero. Only two of them are zero. The idea here is to compute $M$, $M^2$, etc. and see if you can find a simple polynomial that annihilates $M$. – KBS Apr 11 '22 at 21:05
  • @KBS I just checked that you're right. I'll delete that portion in my post. Is that the only way? – geana Apr 11 '22 at 21:17
  • @geana The matrix has a very simple structure and is skew-symmetric, which can be exploited there. It actually looks like a crystal. – KBS Apr 11 '22 at 21:27

2 Answers


The matrix is skew-symmetric and one can write it as

$$M=\begin{bmatrix}r_1\\r_2\\r_2\\-r_1\end{bmatrix}$$

where the rows $r_1$ and $r_2$ are immediate from the definition. Since $M$ is skew-symmetric, we also have

$$M=\begin{bmatrix}-r_1^T & -r_2^T&-r_2^T&r_1^T\end{bmatrix}.$$

We have $r_1r_2^T=0$ and $r_1r_1^T=r_2r_2^T=2$, together with $r_1M=2r_2$ and $r_2M=-2r_1$.
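These identities are quick to verify numerically. The following sketch (plain Python, treating $r_1$ and $r_2$ as lists, with helper names of my own choosing) checks the inner products and the products $r_1M$ and $r_2M$:

```python
# r1, r2 are the first two rows of M; the last two rows are r2 and -r1.
r1 = [0, 1, 1, 0]
r2 = [-1, 0, 0, 1]
M = [r1, r2, r2, [-x for x in r1]]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def row_times_matrix(r, A):
    # (row vector r) * A
    return [sum(r[i] * A[i][j] for i in range(len(r))) for j in range(len(A[0]))]

assert dot(r1, r2) == 0
assert dot(r1, r1) == 2 and dot(r2, r2) == 2
assert row_times_matrix(r1, M) == [2 * x for x in r2]   # r1 M = 2 r2
assert row_times_matrix(r2, M) == [-2 * x for x in r1]  # r2 M = -2 r1
```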

Therefore,

$$M^2=2\begin{bmatrix}r_2\\-r_1\\-r_1\\-r_2\end{bmatrix}\ \mathrm{and}\ M^3=-4\begin{bmatrix}r_1\\r_2\\r_2\\-r_1\end{bmatrix}=-4M.$$

Therefore $M^3+4M=0$. Since $M^2$ is visibly not a linear combination of $M$ and $I$ (compare the $(1,4)$ entries: $2$ for $M^2$, $0$ for both $M$ and $I$), no monic polynomial of degree $\le 2$ annihilates $M$, so the minimal polynomial is $p(x)=x^3+4x=x(x^2+4)$.
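A quick sanity check (a plain-Python sketch using only the matrix given in the question) confirms both that $p$ annihilates $M$ and that no lower-degree monic polynomial does:

```python
# M as given in the question; verify M^3 + 4M = 0 with exact integer arithmetic.
M = [[0, 1, 1, 0],
     [-1, 0, 0, 1],
     [-1, 0, 0, 1],
     [0, -1, -1, 0]]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

M2 = matmul(M, M)
M3 = matmul(M2, M)

# p(M) = M^3 + 4M is the zero matrix, so p annihilates M.
assert all(M3[i][j] + 4 * M[i][j] == 0 for i in range(4) for j in range(4))

# Minimality: M^2 = aM + bI is impossible, since the (1,4) entry of M^2 is 2
# while the (1,4) entries of M and I are both 0.
assert M2[0][3] == 2 and M[0][3] == 0
```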

KBS

In this answer I've explained a fairly easy method to compute the minimal polynomial of a square matrix$~A$ without having to compute the successive powers of$~A$ and test them for linear dependency. Instead, choose any nonzero vector $v$ and compute the successive images $A^0v=v$, $Av$, $A^2v, \ldots$ until the sequence of vectors becomes linearly dependent. If the first dependency is $c_0v+c_1Av+\cdots+c_dA^dv=0$, where one can arrange $c_d=1$ (by scaling the relation if needed), then the polynomial $C=c_0 +c_1X+\cdots+c_{d-1}X^{d-1}+X^d$ will certainly divide the minimal polynomial $\mu_A$. The (yet unknown) remaining factor $\mu_A/C$ can now be found as the minimal polynomial of the restriction of (left-multiplication by)$~A$ to the subspace $\operatorname{Im}C[A]$. (This sounds scarier than it is. That subspace is often quite low-dimensional, often zero; in the latter case the remaining factor is just $1$.)
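The search for the first dependency is mechanical, so it can be automated. Here is a sketch in Python (exact arithmetic via the standard `fractions` module; the function names are mine, not from any library) that computes the successive images $v, Av, A^2v, \dots$ and returns the coefficients $c_0,\dots,c_d$ of the divisor $C$ of $\mu_A$:

```python
from fractions import Fraction

def matvec(A, v):
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

def solve(cols, target):
    """Solve sum_j x_j * cols[j] = target exactly; return x, or None if no solution."""
    n, m = len(target), len(cols)
    aug = [[Fraction(cols[j][i]) for j in range(m)] + [Fraction(target[i])]
           for i in range(n)]
    row, pivots = 0, []
    for col in range(m):  # Gauss-Jordan elimination, column by column
        piv = next((r for r in range(row, n) if aug[r][col] != 0), None)
        if piv is None:
            continue
        aug[row], aug[piv] = aug[piv], aug[row]
        p = aug[row][col]
        aug[row] = [x / p for x in aug[row]]
        for r in range(n):
            if r != row and aug[r][col] != 0:
                f = aug[r][col]
                aug[r] = [a - f * b for a, b in zip(aug[r], aug[row])]
        pivots.append(col)
        row += 1
    if any(aug[r][m] != 0 for r in range(row, n)):
        return None  # target is not in the span of the columns
    x = [Fraction(0)] * m
    for r, c in enumerate(pivots):
        x[c] = aug[r][m]
    return x

def krylov_factor(A, v):
    """Coefficients c_0..c_d (c_d = 1) of the first dependency among v, Av, A^2 v, ..."""
    vecs = [v]
    while True:
        nxt = matvec(A, vecs[-1])
        x = solve(vecs, nxt)
        if x is not None:
            return [-c for c in x] + [Fraction(1)]
        vecs.append(nxt)

A = [[0, 1, 1, 0], [-1, 0, 0, 1], [-1, 0, 0, 1], [0, -1, -1, 0]]
print([int(c) for c in krylov_factor(A, [1, 0, 0, 0])])  # [0, 4, 0, 1]
print([int(c) for c in krylov_factor(A, [0, 1, 1, 0])])  # [4, 0, 1]
```

For the matrix of the question this reproduces both computations below: starting from $(1,0,0,0)$ it returns $[0,4,0,1]$, i.e. $C=4X+X^3$, and starting from $(0,1,1,0)$ it returns $[4,0,1]$, i.e. $D=4+X^2$.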

In this example, the simplest choice $v=(1,0,0,0)$ will in fact give you the minimal polynomial right away. The successive images are $Av=(0,-1,-1,0)$, $A^2v=A(Av)=(-2,0,0,2)$, $A^3v=(0,4,4,0)$, which is where the sequence becomes linearly dependent, with the obvious relation $4Av+A^3v=0$. This gives the polynomial $C=4X+X^3$, which satisfies $C[A]v=0$ by construction; it also satisfies $C[A](A^kv)=0$ for $k=1,2,3$ (because the factor $A^k$ commutes with $C[A]=4A+A^3$). So to find the image subspace of $C[A]$ we can apply it to a basis of a complementary subspace to the span $W$ of $[v,Av,A^2v]$; here $W$ is $3$-dimensional, any complement of $W$ is therefore $1$-dimensional, and any vector not in $W$ will generate such a complement. Choosing such a vector $v'=(0,1,0,0)$, one finds $Av'=(1,0,0,-1)$ and $A^3v'=(-4,0,0,4)$, so that $C[A]v'=4Av'+A^3v'=0$ and the image of $C[A]$ turns out to be $0$-dimensional. The remaining factor to get from $C$ to $\mu_A$ is therefore $1$, and $\mu_A=C=4X+X^3$.

It would have been a bit more fun to start with a more "clever" choice, taking $v=(0,1,1,0)$ instead (since the matrix seems to have some relation to vectors with equal second and third components); then $v$, $Av=(2,0,0,-2)$ and $A^2v=(0,-4,-4,0)$ are already linearly dependent, giving the polynomial $D=4+X^2$; this divides $\mu_A$ but is not yet $\mu_A$ itself. Now the span of $[v,Av,A^2v]$ is $2$-dimensional and a complement is for instance spanned by $[(1,0,0,0),(0,1,0,0)]$. Computing $4v'+A^2v'$, where $v'$ runs through those two vectors, gives $(2,0,0,2)$ respectively $(0,2,-2,0)$, and the remaining factor of $\mu_A$ will be the minimal polynomial of $A$ restricted to the $2$-dimensional subspace spanned by $[(1,0,0,1),(0,1,-1,0)]$. But the restriction of $A$ to that subspace is zero, and the minimal polynomial of the restriction is therefore $X$ (note that the minimal polynomial of the zero operator on any space of positive dimension is $X$), leading to $\mu_A=(4+X^2)X$, indeed the same answer as before.
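The bookkeeping in this second route can also be checked mechanically (plain Python, with helper names of my own choosing):

```python
A = [[0, 1, 1, 0],
     [-1, 0, 0, 1],
     [-1, 0, 0, 1],
     [0, -1, -1, 0]]

def matvec(A, v):
    return [sum(A[i][j] * v[j] for j in range(4)) for i in range(4)]

# First dependency for v = (0,1,1,0): 4v + A^2 v = 0, so D = 4 + X^2 divides mu_A.
v = [0, 1, 1, 0]
A2v = matvec(A, matvec(A, v))
assert [4 * x + y for x, y in zip(v, A2v)] == [0, 0, 0, 0]

# Apply D[A] = 4I + A^2 to a basis of a complement of span(v, Av), e.g. e1 and e2.
def D_of_A(w):
    return [4 * a + b for a, b in zip(w, matvec(A, matvec(A, w)))]

im1 = D_of_A([1, 0, 0, 0])   # (2, 0, 0, 2)
im2 = D_of_A([0, 1, 0, 0])   # (0, 2, -2, 0)
assert im1 == [2, 0, 0, 2] and im2 == [0, 2, -2, 0]

# A restricted to the image of D[A] is zero, so the remaining factor is X.
assert matvec(A, im1) == [0, 0, 0, 0]
assert matvec(A, im2) == [0, 0, 0, 0]
```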