1

$\textbf{Question:}$ Find a basis for the vector space of all $2\times 2$ matrices that commute with the matrix $B=\begin{bmatrix}3&2\\4&1\end{bmatrix}$. You must find two ways of completing this problem for full credit.

$\textbf{My Attempt:}$ I found that $B$ is diagonalizable, and so any other diagonalizable $2\times2$ matrix $A$ will satisfy $AB=BA$. However, I cannot think of a way to form a basis for all $2\times2$ diagonalizable matrices. I tried to start with a diagonal matrix with distinct entries on its diagonal, but ended up running into a lot of dead ends.

Does anyone else have any ideas on how I might find this basis? Does anyone have any other potential methods of solving this problem?

PinkyWay
  • 4,565
  • $AB = BA$ only if $A$ and $B$ have the same eigenvectors. So you should find these. – Hans Engler Nov 20 '16 at 22:29
  • Another approach is to write out the condition $AB - BA = 0$ as a system of homogeneous linear equations for the entries of $A$. Then solve this system (see the sketch after these comments). – Hans Engler Nov 20 '16 at 22:31
  • Yeah, you had it the wrong way around: $AB=BA$ and $B$ diagonalizable means the eigenspaces of $B$ are stable under $A$. As those eigenspaces are one-dimensional, it means $A$ is also diagonalizable, in the same basis. – Nicolas FRANCOIS Nov 20 '16 at 22:32
  • No, not every diagonalizable $2 \times 2$ matrix will commute with $B$. Moreover, the diagonalizable matrices do not form a vector space, so there is no such thing as a basis for them. – Robert Israel Nov 20 '16 at 22:33
  • You need to change your title; it makes no sense (by the previous comment). – Marc van Leeuwen Nov 22 '16 at 07:44
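
The second comment's suggestion can be carried out directly. Here is a rough sketch, assuming a Python session with sympy available; the symbols $a,b,c,d$ below are just illustrative names for the entries of $A$:

```python
# Sketch of the "solve AB - BA = 0" approach suggested in the comments (sympy assumed).
import sympy as sp

a, b, c, d = sp.symbols('a b c d')
B = sp.Matrix([[3, 2], [4, 1]])
A = sp.Matrix([[a, b], [c, d]])

# The four entries of AB - BA give a homogeneous linear system in a, b, c, d.
sols = sp.solve(list(A * B - B * A), [a, b, c, d], dict=True)
print(sols)  # expect a two-parameter family, e.g. {a: b + d, c: 2*b}
```

Setting the two free parameters to $(1,0)$ and $(0,1)$ in turn then gives two matrices spanning the solution space.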

3 Answers

3

As you noted, the matrix $B$ is diagonalizable, and we have: $$ B=\begin{bmatrix} 3 & 2\\ 4 & 1 \end{bmatrix}=SDS^{-1}= \begin{bmatrix} -1 & 1\\ 2 & 1 \end{bmatrix} \begin{bmatrix} -1 & 0\\ 0 & 5 \end{bmatrix} \begin{bmatrix} -1/3 & 1/3\\ 2/3 & 1/3 \end{bmatrix} $$

Since the eigenvalues of $B$ are distinct, a matrix $A$ commutes with $B$ iff it is diagonalized by the same matrix $S$, and this means that $A$ has the form:

$$ A=\begin{bmatrix} -1 & 1\\ 2 & 1 \end{bmatrix} \begin{bmatrix} a & 0\\ 0 & b \end{bmatrix} \begin{bmatrix} -1/3 & 1/3\\ 2/3 & 1/3 \end{bmatrix} =\frac{1}{3}\left\{ a\begin{bmatrix} 1 & -1\\ -2 & 2 \end{bmatrix} +b\begin{bmatrix} 2 & 1\\ 2 & 1 \end{bmatrix} \right\} $$ so the matrices $$ \begin{bmatrix} 1 & -1\\ -2 & 2 \end{bmatrix} \qquad\begin{bmatrix} 2 & 1\\ 2 & 1 \end{bmatrix} $$ are a basis for the space of the matrices that commute with $B$.
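
As a quick sanity check (assuming sympy is at hand), one can verify that these two matrices commute with $B$, and that $B$ itself is the combination with $a=-1$, $b=5$ (its eigenvalues):

```python
# Verification sketch for the basis above (sympy assumed).
import sympy as sp

B  = sp.Matrix([[3, 2], [4, 1]])
M1 = sp.Matrix([[1, -1], [-2, 2]])   # the a-part, times 3
M2 = sp.Matrix([[2, 1], [2, 1]])     # the b-part, times 3

# Both basis matrices commute with B.
assert M1 * B == B * M1 and M2 * B == B * M2
# B itself corresponds to a = -1, b = 5 in the parametrization A = S diag(a,b) S^{-1}.
assert sp.Rational(1, 3) * (-1 * M1 + 5 * M2) == B
```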

Emilio Novati
  • 62,675
2

Adapted from this answer to a very similar question.

That matrix $B$ is clearly not a multiple of the identity matrix, so its minimal polynomial is of degree${}>1$, hence equal to its characteristic polynomial (which you do not have to compute). Then by the result of this question, matrices that commute with $B$ are just the polynomials in$~B$. Given that the minimal polynomial has degree$~2$, the polynomials in $B$ are just the linear combinations of $B$ and the $2\times2$ identity matrix (filling a $2$-dimensional subspace of matrices).
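
A small sketch (assuming sympy) of the degree-$2$ collapse this answer relies on: by Cayley–Hamilton, $B^2 = 4B + 5I$, so every polynomial in $B$ reduces to a linear combination of $I$ and $B$.

```python
# Illustration of the "polynomials in B = span{I, B}" step (sympy assumed).
import sympy as sp

B = sp.Matrix([[3, 2], [4, 1]])
I = sp.eye(2)

print(B.charpoly().as_expr())     # lambda**2 - 4*lambda - 5
assert B**2 == 4 * B + 5 * I      # Cayley-Hamilton for this B
assert B**3 == 4 * B**2 + 5 * B   # higher powers keep collapsing into span{I, B}
```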

1

Here is a way of finding one basis:

Let $L(A) = AB-BA$; then $A$ commutes with $B$ iff $A \in \ker L$. Using the standard basis $E_{11}, E_{12}, E_{21}, E_{22}$, write $L$ as a $4\times 4$ matrix and compute its null space; this gives a basis of $\ker L$.
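
A sketch of that computation, assuming sympy (the helper `L` below just mirrors the definition above):

```python
# Represent L(A) = AB - BA as a 4x4 matrix in the standard basis and read off ker L.
import sympy as sp

B = sp.Matrix([[3, 2], [4, 1]])

def L(A):
    return A * B - B * A

# Standard basis E11, E12, E21, E22 of 2x2 matrices.
std = [sp.Matrix(2, 2, lambda i, j: 1 if (i, j) == pos else 0)
       for pos in [(0, 0), (0, 1), (1, 0), (1, 1)]]
# Columns of the matrix of L: images of the basis matrices, flattened row-major.
M = sp.Matrix.hstack(*[sp.Matrix(4, 1, list(L(E))) for E in std])

for v in M.nullspace():              # each null vector reshapes to a commuting matrix
    print(sp.Matrix(2, 2, list(v)))
```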

This can be simplified a little since $B$ has a full set of eigenvectors.

Suppose $v_k,u_k$ are the left and right eigenvectors of $B$ corresponding to $\lambda_k$. Show that the matrices $u_i v_j^T$ form a basis and that $L(u_i v_j^T) = (\lambda_j - \lambda_i) u_i v_j^T$. In particular, this shows that $\ker L = \operatorname{sp} \{ u_1 v_1^T, u_2 v_2^T \} $.

By inspection, we can choose $v_1 = (2,1)^T, v_2 = (-1,1)^T$ and $u_1 =(1,1)^T, u_2 = (-1,2)^T$ to get a basis $\begin{bmatrix} 2 & 1 \\ 2 & 1 \end{bmatrix}$, $\begin{bmatrix} 1 & -1 \\ -2 & 2 \end{bmatrix}$.
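
As a quick check (assuming sympy), one can confirm the eigenvector relations for these choices and the action of $L$ on the outer products:

```python
# Verify the left/right eigenvectors chosen by inspection and L(u_i v_j^T) = (lam_j - lam_i) u_i v_j^T.
import sympy as sp

B = sp.Matrix([[3, 2], [4, 1]])
lam = [5, -1]
v = [sp.Matrix([2, 1]), sp.Matrix([-1, 1])]   # left eigenvectors:  v_k^T B = lam_k v_k^T
u = [sp.Matrix([1, 1]), sp.Matrix([-1, 2])]   # right eigenvectors: B u_k = lam_k u_k

for k in range(2):
    assert v[k].T * B == lam[k] * v[k].T
    assert B * u[k] == lam[k] * u[k]

# The outer products diagonalize L; the i = j ones are exactly the kernel.
for i in range(2):
    for j in range(2):
        E = u[i] * v[j].T
        assert E * B - B * E == (lam[j] - lam[i]) * E
```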

Here is another way: Suppose $V^{-1} B V = \Lambda$, where $\Lambda$ is diagonal (with different entries). Then $AB=BA$ iff $ V^{-1} A V V^{-1} B V = V^{-1} B V V^{-1} A V$ iff $V^{-1} A V \Lambda = \Lambda V^{-1} A V$.

In particular, $C$ commutes with $\Lambda$ iff $V C V^{-1}$ commutes with $B$. Since $\Lambda$ is diagonal with distinct eigenvalues, we see that $C$ commutes with $\Lambda$ iff $C$ is diagonal.

Hence a basis for the space of matrices that commute with $B$ is $V \operatorname{diag}(1,0) V^{-1}$, $V \operatorname{diag}(0,1) V^{-1}$.
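
The same recipe, sketched with sympy (its `diagonalize` routine may order the eigenvalues differently, which does not affect the span):

```python
# Push the diagonal basis matrices diag(1,0), diag(0,1) back through the eigenvector matrix V.
import sympy as sp

B = sp.Matrix([[3, 2], [4, 1]])
V, Lam = B.diagonalize()            # B = V * Lam * V**-1, with distinct diagonal entries

E1 = V * sp.diag(1, 0) * V.inv()
E2 = V * sp.diag(0, 1) * V.inv()
print(E1, E2)                       # a basis for the commutant of B
assert E1 * B == B * E1 and E2 * B == B * E2
```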

copper.hat
  • 172,524