72

Let $A, B$ be two square matrices of order $n$. Do $AB$ and $BA$ have the same minimal and characteristic polynomials?

I have a proof only if $ A$ or $ B $ is invertible. Is it true for all cases?

Andy
  • 2,246
  • 5
    The coefficients of the characteristic polynomial are continuous functions in the entries of a matrix, so if the characteristic polynomials of $AB$ and $BA$ coincide for a dense set of $A$ (or a dense set of $B$) then they always coincide. The coefficients of the minimal polynomial, on the other hand... – Qiaochu Yuan Feb 22 '13 at 21:12
  • @cmi obviously not. Try to figure out 2 different polynomials with the same set of roots. It is not hard. – Gaston Burrull Jul 04 '19 at 07:41
  • 1
    http://people.math.sc.edu/howard/Classes/700/charAB.pdf – Bach Aug 16 '19 at 19:32

12 Answers

67

Before proving that $AB$ and $BA$ have the same characteristic polynomial, first show that for $A_{m\times n}$ and $B_{n\times m}$ the characteristic polynomials of $AB$ and $BA$ satisfy $$x^n|xI_m-AB|=x^m|xI_n-BA|;$$ from this one easily concludes that if $m=n$ then $AB$ and $BA$ have the same characteristic polynomial.

Define $$C = \begin{bmatrix} xI_m & A \\B & I_n \end{bmatrix},\ D = \begin{bmatrix} I_m & 0 \\-B & xI_n \end{bmatrix}.$$ We have $$ \begin{align*} \det CD &= x^n|xI_m-AB|,\\ \det DC &= x^m|xI_n-BA|. \end{align*} $$ Since $\det CD=\det C\,\det D=\det DC$, this gives $x^n|xI_m-AB|=x^m|xI_n-BA|$; in particular, if $m=n$ then $AB$ and $BA$ have the same characteristic polynomial.
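
For the record, multiplying the blocks out (this is also where the intuition asked about in the comments below shows up: $D$ eliminates $B$ by block row or column operations):
$$CD = \begin{bmatrix} xI_m & A \\ B & I_n \end{bmatrix}\begin{bmatrix} I_m & 0 \\ -B & xI_n \end{bmatrix} = \begin{bmatrix} xI_m - AB & xA \\ 0 & xI_n \end{bmatrix},\qquad DC = \begin{bmatrix} I_m & 0 \\ -B & xI_n \end{bmatrix}\begin{bmatrix} xI_m & A \\ B & I_n \end{bmatrix} = \begin{bmatrix} xI_m & A \\ 0 & xI_n - BA \end{bmatrix},$$
and both right-hand sides are block upper triangular, so their determinants are the two expressions displayed above.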

user26857
  • 52,094
M.H
  • 11,498
  • 3
  • 30
  • 66
  • 5
    I understood the proof, it's nice.. but is there any intuition abt why we consider C and D in such way? – ogirkar May 08 '21 at 07:00
  • @Believer Maybe some intuition is that right multiplication gives column operations, and left multiplication gives row operations. So $CD$ is related to giving the thing related to $AB$, and $DC$ gives $BA$. This also means that $E=\begin{pmatrix} I_m & -A\\ 0 & xI_n\end{pmatrix}$ would also work for the purpose of the proof. (Also you are trying to do row or column reduction to make $C$ triangular, hence the choice of $-B$ or $-A$ at the corresponding positions.) – Three aggies Feb 23 '24 at 16:22
61

If $A$ is invertible then $A^{-1}(AB)A= BA$, so $AB$ and $BA$ are similar, which implies (but is stronger than) $AB$ and $BA$ have the same minimal polynomial and the same characteristic polynomial. The same goes if $B$ is invertible.

In general, starting from the above observation, it is not too difficult to show that $AB$ and $BA$ have the same characteristic polynomial, though the type of proof can depend on the field over which the matrices are defined. If the matrices are in $\mathcal{M}_n(\mathbb C)$, you use the fact that $\operatorname{GL}_n(\mathbb C)$ is dense in $\mathcal{M}_n(\mathbb C)$ and the continuity of the function which maps a matrix to its characteristic polynomial. There are at least 5 other ways to proceed (especially over fields other than $\mathbb C$).
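
As a quick sanity check (not part of the argument), the characteristic-polynomial claim is easy to verify symbolically, for instance with SymPy; the matrices below are arbitrary, with $A$ chosen singular so that the similarity trick above does not apply directly:

```python
import sympy as sp

x = sp.symbols('x')

# A is deliberately singular, so A^{-1}(AB)A = BA is not available,
# yet the characteristic polynomials of AB and BA still agree.
A = sp.Matrix([[1, 2, 3],
               [2, 4, 6],   # row 2 = 2 * row 1, so det(A) = 0
               [0, 1, 5]])
B = sp.Matrix([[0, 1, 0],
               [7, 0, 2],
               [1, 1, 1]])

p_AB = (A * B).charpoly(x).as_expr()
p_BA = (B * A).charpoly(x).as_expr()
print(sp.expand(p_AB - p_BA))   # 0: same characteristic polynomial

# The minimal polynomials, however, need not coincide (see the next paragraph).
```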

In general, $AB$ and $BA$ do not have the same minimal polynomial. I'll let you search a bit for a counterexample.

user26857
  • 52,094
  • 15
    5 other ways? I'm quite curious what those ways are. I only know of the continuity argument and an argument involving determinant identities on block matrices. Would it be possible to provide a reference to some other methods? – EuYu Feb 22 '13 at 17:37
  • 1
    @EuYu, I have no reference, sorry. – Nathan Portland Feb 22 '13 at 22:11
  • @EuYu one other method is the argument that matrices $A=(a_{ij})$ and $B=(b_{ij})$ are invertible over the field $K(a_{ij}, b_{ij})$ – Bananach Feb 15 '20 at 17:58
  • 1
    See this short note by JH Williamson from 1953 https://www.cambridge.org/core/services/aop-cambridge-core/content/view/S0950184300003104 – Bananach Feb 15 '20 at 18:09
30

Hint: Consider $A = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}$ and $B = \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}$. What do you get in that case?
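
(Working the hint out, for reference — skip this if you want to try it yourself.) With these matrices,
$$AB = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}, \qquad BA = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix},$$
so the minimal polynomial of $AB$ is $x^2$ while that of $BA$ is $x$, even though both characteristic polynomials equal $x^2$.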

Jim
  • 30,682
  • 12
    This shows that $AB$ and $BA$ have different minimal polynomials. But the characteristic polynomials are the same, right? –  Nov 25 '17 at 01:39
15

Yes, $AB$ and $BA$ have the same characteristic polynomial.

Basic facts: $\det(A^T) = \det(A)$, $\det(AB) = \det(A) \det(B)$

  1. $A$ and $A^T$ share the same characteristic polynomial.

\begin{align*} \det(xI-A) = \det((xI-A)^T) = \det(xI-A^T) \end{align*}

  2. Similar matrices have the same characteristic polynomial. If $B = PAP^{-1}$,

\begin{align*} \det(xI - B) &= \det(xI - PAP^{-1}) \\ &= \det(P(xI - A)P^{-1}) \\ &= \det(P)\det(xI - A)\det(P^{-1}) \\ &= \det(xI - A) \end{align*}

  3. Determinant of a block triangular matrix (a special case of Schur's formula):

\begin{align*} \det \begin{pmatrix}A & B \\0 & C\end{pmatrix} = \det(A) \det(C) \end{align*}

Using block multiplication, please verify that $\begin{pmatrix}I & -A \\0 & I\end{pmatrix} \begin{pmatrix}AB & 0 \\B & 0\end{pmatrix} = \begin{pmatrix}0 & 0 \\B & BA\end{pmatrix} \begin{pmatrix}I & -A \\0 & I\end{pmatrix}$.
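
Writing the requested verification out, both products equal the same matrix:
\begin{align*} \begin{pmatrix}I & -A \\0 & I\end{pmatrix} \begin{pmatrix}AB & 0 \\B & 0\end{pmatrix} &= \begin{pmatrix}AB - AB & 0 \\B & 0\end{pmatrix} = \begin{pmatrix}0 & 0 \\B & 0\end{pmatrix},\\ \begin{pmatrix}0 & 0 \\B & BA\end{pmatrix} \begin{pmatrix}I & -A \\0 & I\end{pmatrix} &= \begin{pmatrix}0 & 0 \\B & -BA+BA\end{pmatrix} = \begin{pmatrix}0 & 0 \\B & 0\end{pmatrix}. \end{align*}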

Therefore, the matrices $\begin{pmatrix}AB & 0 \\B & 0\end{pmatrix}$ and $\begin{pmatrix}0 & 0 \\B & BA\end{pmatrix}$ are similar, and have the same characteristic polynomial.

\begin{align*} \det\left[x\begin{pmatrix}I & 0 \\0 & I\end{pmatrix} - \begin{pmatrix}AB & 0 \\B & 0\end{pmatrix}\right] &= \det(xI - AB) \det(xI) \end{align*} \begin{align*} \det\left[x\begin{pmatrix}I & 0 \\0 & I\end{pmatrix} - \begin{pmatrix}0 & 0 \\B & BA\end{pmatrix}\right] &= \det(xI) \det(xI - BA) \end{align*}

And there it is. But $AB$ and $BA$ do not need to have the same minimal polynomial. See Jim's answer for a counterexample.

Bio
  • 835
  • 6
  • 13
10

For square matrices the characteristic polynomials are the same, but for $A$ a matrix of size $m \times n$ and $B$ a matrix of size $n \times m$ we have $x^{m}C_{BA}(x)=x^{n}C_{AB}(x)$. This implies that the nonzero eigenvalues of $AB$, counted with multiplicity, are the same as the nonzero eigenvalues of $BA$.

For example, suppose $A$ is of size $7\times 4$ and $B$ is of size $4\times 7$, and assume the $4\times 4$ matrix $BA$ has nonzero eigenvalues $1,1,3$ (so the fourth eigenvalue of $BA$ is $0$). Then the $7\times 7$ matrix $AB$ will also have nonzero eigenvalues $1,1,3$, and the remaining four eigenvalues of $AB$ are zero.
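
A quick numerical illustration (the sizes match the $7\times 4$ / $4\times 7$ example above; the random matrices and the tolerance are only for demonstration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((7, 4))
B = rng.standard_normal((4, 7))

ev_AB = np.linalg.eigvals(A @ B)   # 7 eigenvalues
ev_BA = np.linalg.eigvals(B @ A)   # 4 eigenvalues

# Keep only the eigenvalues that are not (numerically) zero and compare.
nz_AB = np.sort_complex(ev_AB[np.abs(ev_AB) > 1e-8])
nz_BA = np.sort_complex(ev_BA[np.abs(ev_BA) > 1e-8])
print(np.allclose(nz_AB, nz_BA))   # True for a generic draw: nonzero spectra agree
print(len(ev_AB) - len(nz_AB))     # generically 3: the extra eigenvalues of AB are ~0
```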

user26857
  • 52,094
10

In the general rectangular case it is not true that the characteristic polynomials are the same. The best result in this general vein is the following.

Let $A\in\mathbb{F}^{m \times n}$ and $B\in\mathbb{F}^{n \times m}$, and let $m_{AB}(x)$ and $m_{BA}(x)$ be the minimal polynomials (over $\mathbb{F}$) of $AB$ and $BA$ respectively. Then one of the following holds:

$m_{AB}(x) = m_{BA}(x)$, or $m_{AB}(x) = x \cdot m_{BA}(x)$, or $x\cdot m_{AB}(x) = m_{BA}(x)$.

It's easy: just use the fact that $(BA)^k=B(AB)^{k-1}A$.
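
Spelling the hint out: write $m_{AB}(x)=\sum_k c_k x^k$. Since $(BA)^{k+1}=B(AB)^{k}A$ for all $k\ge 0$, $$\big(x\,m_{AB}\big)(BA)=\sum_k c_k (BA)^{k+1}=B\Big(\sum_k c_k (AB)^{k}\Big)A=B\,m_{AB}(AB)\,A=0,$$ so $m_{BA}\mid x\,m_{AB}$, and by symmetry $m_{AB}\mid x\,m_{BA}$. Since both polynomials are monic and their degrees can differ by at most $1$, a short degree comparison leaves exactly the three possibilities above.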

user26857
  • 52,094
  • in general means for m not equal to n?? – Ri-Li Sep 16 '14 at 21:16
  • 7
    The question says (and always has said) square matrices $A,B$. If you are answering a more general question, then you should announce this. Also, answering a more general question is only useful if this is no more difficult than the actual question, or if the more general solution sheds more light on the solution. – Marc van Leeuwen Dec 23 '15 at 21:10
9

There are a lot of proofs that the characteristic polynomials are the same. I want to provide mine. It may be more complicated, but it relies less on "consider this magic product of matrices".

Let $\chi_M(x)$ denote the characteristic polynomial $\chi_M(x) = \det(xI - M)$; below we abbreviate $xI$ to $x$ inside determinants, and likewise write $A - s$ for $A - sI$.

For square matrices $A$ and $B$ we have $\det(AB - x) = \det(BA - x) \Leftrightarrow \chi_{AB}(x) = \chi_{BA}(x)$. We prove this by considering separately the two cases $\det(A)=0$ and $\det(A)\neq0$:

  1. If $\det(A) \neq 0$, then the statement follows from $$\det(AB - x) = \det(A^{-1}A)\det(AB - x) \\= \det(A^{-1})\det(AB - x)\det(A) = \det(A^{-1}(AB - x)A) = \det(BA - x).$$
  2. If $\det(A) = 0$, there are only finitely many $s$ such that $\chi_A(s)=0$, because $\chi_A$ is a nonzero polynomial of finite degree. Hence (over an infinite field such as $\mathbb R$; see the comments below) there are infinitely many $s$ such that $\chi_A(s) \neq 0$, i.e. such that $A - s$ is invertible. For all such $s$ we know $\chi_{(A-s)B}(x) = \chi_{B(A-s)}(x)$ by the previous case. Now fix $x$ and view $\chi_{(A-s)B}(x)$ and $\chi_{B(A-s)}(x)$ as polynomials in the variable $s$: they have finite degree and agree at infinitely many points, so they are equal for every $s$. Taking $s = 0$ gives $\chi_{AB}(x) = \chi_{BA}(x)$ for every $x$.

This proves the statement for square matrices.

Key fact (proof below): If $A$ is $m\times n$, $B$ is $n\times m$ and $n \geq m$ then $\chi_{BA}(x) = x^{n-m}\chi_{AB}(x)$.

Consider $n\times n$ matrices $A' = \left(\dfrac{A}{0}\right)$ and $B' = (B\mid0)$. We just put zero rows and columns to make matrices $n\times n$.

First, $B'A' = BA \Rightarrow x - B'A' = x - BA \Rightarrow \chi_{B'A'}(x) = \chi_{BA}(x)$

Second, $A'$ and $B'$ are square matrices, so by the result just proved, $\chi_{B'A'}(x) = \chi_{A'B'}(x)$.

Third, $\chi_{A'B'}(x) = \det(x - A'B') = \det\begin{pmatrix}x - AB & 0 \\ 0 & xI_{n-m}\end{pmatrix} = \det(x - AB)\,x^{n - m} = x^{n-m}\chi_{AB}(x).$

So, we see $\chi_{BA}(x) = \chi_{B'A'}(x) = \chi_{A'B'}(x) = x^{n-m}\chi_{AB}(x)$

glS
  • 6,818
  • 1
    @user264745 Александр Тряпицын's proof of 2. is correct: For every fixed $x$, he said he views $\chi_{(A-s)B}(x)$ and $\chi_{B(A-s)}(x)$ as polynomials in $s$ which coincide on an infinite set, and concludes they are equal. The only trouble with this proof is that is works only on an infinite field. But this can be repaired by considering the infinite field "universal" for this situation: the field $K$ of rational fractions with rational coefficients and $2n^2$ indeterminates $X_{i,j},Y_{i,j}$, and the universal matrices $A,B\in M_n(K)$ whose entries are these indeterminates. – Anne Bauval Jan 12 '23 at 11:21
  • 1
    I was about to type that "universal proof" as a new answer, but I discovered it was already given here by @anon – Anne Bauval Jan 12 '23 at 11:44
  • @AnneBauval Unfortunately “universal proof” is far from my reach. I don’t know anything about field extension. – user264745 Jan 12 '23 at 12:03
  • @AnneBauval Thank you for explicitly stating lemma used in proof and took your time to reply my comments. Should I delete my speculative first comment? – user264745 Jan 12 '23 at 14:06
  • @AnneBauval Why did you delete statement lemma? Lol I wanted it. Your phrasing is better than mine. So can you please rewrite it? – user264745 Jan 12 '23 at 15:30
  • 1
    @user264745 I wanted to clean up this section and I thought my previous comment was sufficient. But if you take me by flattery, ok I shall rewrite it. And most other comments can be deleted please – Anne Bauval Jan 12 '23 at 15:39
  • 1
    If $P(X),Q(X)\in\Bbb R[X]$ are such that $P(s)=Q(s)$ for infinitely many values of $s$ (or only for more than $\deg(P-Q)$ values) then $P(X)=Q(X)$ (i.e. their coefficients are equal) hence $P(s)=Q(s)$ for every $s.$ Александр Тряпицын applies this to $P(X)=\chi_{(A-X)B}(x),Q(X)=\chi_{B(A-X)}(x)$ for $x$ fixed. – Anne Bauval Jan 12 '23 at 15:45
4

$\newcommand{\tr}{\operatorname{Tr}}$I’d like to add a more computational perspective to the elegant, more analytic proofs given here.

Because we can reconstruct the characteristic polynomial from the traces, it actually suffices to show that $\tr(AB)=\tr(BA),\tr((AB)^2)=\tr((BA)^2),\cdots,\tr((AB)^n)=\tr((BA)^n)$ to say that $\chi_{AB}=\chi_{BA}$ for two $n$-square matrices $A,B$.

But those trace equalities are easy and well known. $\tr(AB)=\tr(BA)$ follows from straightforward computation, and then (say) to show: $$\tr((AB)^4)=\tr(ABABABAB)=\tr(BABABABA)=\tr((BA)^4)$$All you have to do is use a transposition: if $X:=A$ and $Y:=BABABAB$ then we want to show $\tr(XY)=\tr(YX)$, which is true!
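
Here is a small symbolic sketch of the "reconstruct the characteristic polynomial from the traces" step, assuming the entries live in a ring containing $\mathbb Q$ (so dividing by $k$ in Newton's identities is allowed); the helper function and the matrices are only illustrative:

```python
import sympy as sp

def charpoly_from_traces(M, x):
    """Rebuild det(xI - M) from the power sums p_k = Tr(M^k) using
    Newton's identities: k*e_k = sum_{i=1}^{k} (-1)^(i-1) * e_{k-i} * p_i."""
    n = M.shape[0]
    p = [None] + [(M**k).trace() for k in range(1, n + 1)]
    e = [sp.Integer(1)]                      # e_0 = 1
    for k in range(1, n + 1):
        e_k = sp.Rational(1, k) * sum((-1)**(i - 1) * e[k - i] * p[i]
                                      for i in range(1, k + 1))
        e.append(sp.simplify(e_k))
    # char poly = x^n - e_1 x^{n-1} + e_2 x^{n-2} - ... + (-1)^n e_n
    return sum((-1)**k * e[k] * x**(n - k) for k in range(n + 1))

x = sp.symbols('x')
A = sp.Matrix([[1, 2, 0], [0, 3, 1], [4, 0, 1]])
B = sp.Matrix([[2, 0, 1], [1, 1, 0], [0, 5, 3]])

lhs = sp.expand(charpoly_from_traces(A * B, x))
rhs = sp.expand(charpoly_from_traces(B * A, x))
print(lhs - rhs)                                       # 0
print(lhs - sp.expand((A * B).charpoly(x).as_expr()))  # 0: matches det(xI - AB)
```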

This should work for matrices over any integral domain of characteristic zero (see the comments below for why the characteristic matters).

FShrike
  • 40,125
  • No, this argument does not work for matrices over any ring. You need $1,2,\ldots,n$ to be non-zero-divisors in order to be able to reconstruct the characteristic polynomial from the traces. – darij grinberg Aug 16 '23 at 12:16
  • You did in the last sentence of this answer, even though the question indeed didn't :) – darij grinberg Aug 16 '23 at 12:21
  • oh my apologies I forgot I ever wrote that – FShrike Aug 16 '23 at 12:25
  • @darijgrinberg my bad. I have made a hopefully acceptable fix – FShrike Aug 16 '23 at 12:25
  • Actually, nope, "integral domain" is not enough (finite fields are integral domains!). But you can say "commutative $\mathbb{Q}$-algebra" for example. – darij grinberg Aug 16 '23 at 12:48
  • shame. My abstract algebra is quite weak. Would "integral domain of characteristic zero" work? @darijgrinberg – FShrike Aug 16 '23 at 12:54
  • The transposition trick does work if you use generic matrices and determinant instead of trace: just cancel out $\det(A)$ on both sides of $$\det(xI-AB)\det(A)=\det(xA-ABA)=\det(A)\det(xI-BA).$$ If you don’t want to use generic matrices, you may also use $A-tI$ in place of $A$ in the previous argument, where $t$ is an indeterminate that is algebraically independent of $x$. I remember seeing this proof on this site (in an answer/comment by Bill Dubuque) and also in a printed article (probably in AMM), but I cannot locate the sources. – user1551 Aug 16 '23 at 13:02
  • Ooh, the article was not from AMM. It’s a short paper written by Williamson. I actually first learnt about this from a user comment on this page! – user1551 Aug 16 '23 at 13:12
  • @FShrike: Yes, that should do. (Though any integral domain can be embedded into a field, so this is only mildly more general than the case of a field of characteristic $0$.) – darij grinberg Aug 16 '23 at 13:17
  • 1
    With small modification this argument gives the result over any commutative ring. The argument should be: work over polynomial ring $\mathbb Z[\mathbf x]$-- Newton's Identities work fine here. We conclude that the sum of $r\times r$ principal minors of $(AB)$ minus the sum of $r\times r$ principal minors of $(BA)$ is the zero polynomial. This survives when we do a substitution homomorphism $\phi: \mathbb Z[\mathbf x]\longrightarrow R$ for arbitrary commutative ring $R$, and allows us to conclude $AB$ and $BA$ with components in $R$ have the same characteristic polynomials. – user8675309 Jan 08 '24 at 17:44
3

Here is another proof, based on exterior algebras. (This should go through for any finitely generated free modules $M, N$ over a commutative ring $A$, but I am not entirely sure.)

Lemma. Let $T$ be an endomorphism of $M$, with $\dim_A M = m$. $$ \det (xI - T) = \sum_{k = 0}^\infty \text{Tr} (\bigwedge^k(T)) (-1)^k x^{m-k}$$
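
(To make the lemma concrete: for $m = 2$ it reads $\det(xI - T) = x^2 - \text{Tr}(T)\,x + \det(T)$, since $\text{Tr}\bigwedge^0(T)=1$, $\text{Tr}\bigwedge^1(T)=\text{Tr}(T)$ and $\text{Tr}\bigwedge^2(T)=\det(T)$.)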

This can be found in e.g. N. Bourbaki, Algebra I, Chapter 3, $\S$8, no. 5, Proposition 11. Now $\bigwedge^k$ commutes with composition (i.e. a functor), so we may say for $A$-linear maps $T: M \to N$ and $S : N \to M$,

$$\text{Tr}\bigwedge^k(TS) = \text{Tr}\bigwedge^k(T) \bigwedge^k(S) = \text{Tr}\bigwedge^k(S)\bigwedge^k(T) = \text{Tr} \bigwedge^k(ST) $$

Now we can prove the theorem: we have $$ \det (xI - ST) = \sum_{k = 0}^\infty \text{Tr}(\bigwedge^k(TS))(-1)^k x^{m - k} $$

Consider first the case $\dim_A N = n \geq m$: multiplying by $x^{n-m}$ and applying the lemma to $TS$ (an endomorphism of $N$) gives $$ \det (xI - ST)x^{n-m} = \sum_{k = 0}^\infty \text{Tr}(\bigwedge^k(TS))(-1)^k x^{n - k} = \det(xI - TS).$$ When $n < m$, all terms with $k > n$ vanish since $\bigwedge^k (N) = \{0\}$. Thus we have

$$ \det (xI - ST) = x^{m - n} \sum_{k = 0}^n\text{Tr}(\bigwedge^k(TS))(-1)^k x^{n - k} = x^{m - n} \det (xI - TS)$$

In all cases, we have proved $x^n \det (xI- ST) = x^m \det (x I - TS)$.

l3jfej
  • 31
  • 2
  • "whence we have" --> "whence we need to show". – darij grinberg Aug 16 '23 at 13:19
  • Nice proof, although it doesn't quite generalize to finitely generated modules. Finitely generated projective modules of constant rank should do, probably. Maybe we can do away with the constant rank condition if we pass to the "reverse characteristic polynomial" $\det\left(I-xA\right)$. – darij grinberg Aug 16 '23 at 13:21
0

[The result follows from the ${ m = n }$ case of the argument here]


Additional lensdump link: https://lensdump.com/i/sChcBM


[Same argument expanded]

Let ${ A \in \mathbb{F} ^{m \times n}, B \in \mathbb{F} ^{n \times m} }$ with ${ m \leq n }.$ The goal is to relate the characteristic polynomials of ${ AB }$ and ${ BA }.$

Consider the composite matrix ${ \begin{pmatrix} I _m &A \\ B &I _n \end{pmatrix}. }$

Deleting ${ B }$ using row operations gives $${ \begin{pmatrix} I _m &0 \\ -B &I _n\end{pmatrix} \begin{pmatrix} I _m &A \\ B &I _n \end{pmatrix} = \begin{pmatrix} I _m &A \\ 0 &{\color{green}{I _n - BA}}\end{pmatrix} }.$$Deleting ${ B }$ using column operations gives $${ \begin{pmatrix} I _m &A \\ B &I _n \end{pmatrix} \begin{pmatrix} I _m &0 \\ -B &I _n \end{pmatrix} = \begin{pmatrix} {\color{purple}{I _m - AB}} &A \\ 0 &I _n \end{pmatrix}. }$$ To compare characteristic polynomials, one would want the green entry to be ${ {\color{green}{x I _n - BA}} }$ and purple entry to be ${ {\color{purple}{x I _m - AB}} }$ instead.
So the following modification of the above two equations works: $${ \begin{pmatrix} I _m &0 \\ -B &{\color{green}{x}}I _n \end{pmatrix} \begin{pmatrix} {\color{purple}{x}}I _m &A \\ B &I _n \end{pmatrix} = \begin{pmatrix} x I _m &A \\ 0 &{\color{green}{x I _n - BA}}\end{pmatrix}, }$$ and $${ \begin{pmatrix} {\color{purple}{x}}I _m &A \\ B &I _n \end{pmatrix} \begin{pmatrix} I _m &0 \\ -B &{\color{green}{x}}I _n \end{pmatrix} = \begin{pmatrix} {\color{purple}{x I _m - AB}} &xA \\ 0 &xI _n \end{pmatrix} .}$$ Taking determinants in both equations and noting that each left-hand side has determinant ${ x^n \det\begin{pmatrix} xI _m &A \\ B &I _n\end{pmatrix} }$, the two right-hand determinants must be equal, which gives $${ x ^m \det({\color{green}{x I _n - BA}}) = x ^n \det({\color{purple}{x I _m - AB}}) }$$ as needed.

0

The following are two proofs by Ichiro Satake:

[image: scans of two proofs by Ichiro Satake, not transcribed here]

0

I intend to present a slightly modified, dumbed-down version of the highly abstract proof presented by @Anon here.

Let $\mathbb F$ be an arbitrary field and we pick matrices $A$ and $B$ from $\mathbb F^{n\times n}$.

Case 1: $B$ is invertible.
Check that $AB=B^{-1}(BA)B$ and hence $AB\sim BA$. Now use the fact that similar matrices have the same characteristic polynomial.

Case 2: $B$ is not invertible.
What do we do? We will make $B$ invertible in one way or the other :)
Consider the polynomial ring $\mathbb F[x_1, x_2, \ldots, x_{n^2}]$. It's an integral domain so we can embed it in its quotient field $\mathbb K=\mathbb F(x_1, x_2, \ldots, x_{n^2})$ i.e., the field of rational functions in $n^2$ indeterminates (variables) over $\mathbb F$.
Now, we will construct an invertible matrix $B' \in \mathbb K^{n\times n}$.
We define the $(i,j)$-th entry of $B'$ to be the indeterminate $x_{(i-1)n+j}$: $B'=\begin{pmatrix}x_1 & x_2 &\ldots & x_n\\ x_{n+1} & x_{n+2} & \ldots & x_{2n}\\ \vdots&\vdots&&\vdots\\ x_{(n-1)n+1} &\ldots&\ldots&x_{n^2}\end{pmatrix}$ Then $\det(B')$ is a nonzero polynomial in the indeterminates (its expansion contains, for instance, the diagonal monomial $x_1x_{n+2}\cdots x_{n^2}$, which no other term of the Leibniz expansion can cancel). In particular, $\det(B')\neq 0$ in $\mathbb K$ and $B'$ is invertible.
We can treat $A$ as a matrix in $\mathbb K^{n\times n}$ where each entry is a constant polynomial.
Using the result proven in Case 1 — now over the field $\mathbb K$ instead of $\mathbb F$ — we see that $AB'$ and $B'A$ have the same characteristic polynomial.
Thus, we have $\det(xI-AB')=\det(xI-B'A)\tag{01}$ where $x$ is an indeterminate and $I$ is the identity matrix in $\mathbb K^{n\times n}$.
Both sides of $(01)$ are polynomials in $x$ whose coefficients are polynomials in the indeterminates $x_1,\ldots,x_{n^2}$, so we may substitute elements of $\mathbb F$ for the indeterminates and the equality persists.
If we evaluate $(01)$ at $x_1=B_{11}$, $x_2=B_{12}$, $\ldots$, $x_{n}=B_{1n}$, $\ldots$, $x_{n^2}=B_{nn}$, then $B'$ is replaced by $B$. This gives us: $\det(xI-AB)=\det(xI-BA)\tag{02}$

Hence, proved. $\blacksquare$
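
A tiny symbolic illustration of this substitution argument for $n=2$ (just a sanity check of the idea; the concrete matrices below are mine):

```python
import sympy as sp

x = sp.symbols('x')
x1, x2, x3, x4 = sp.symbols('x1 x2 x3 x4')      # the indeterminates

A  = sp.Matrix([[1, 2], [3, 4]])                 # any concrete A over F = Q
Bp = sp.Matrix([[x1, x2], [x3, x4]])             # the "generic" matrix B'

lhs = (x * sp.eye(2) - A * Bp).det()             # det(xI - AB')
rhs = (x * sp.eye(2) - Bp * A).det()             # det(xI - B'A)
print(sp.expand(lhs - rhs))                      # 0: identity (01) over K

# Substitute the entries of a genuinely singular B for the indeterminates:
B = sp.Matrix([[1, 1], [1, 1]])                  # det(B) = 0
subs = {x1: B[0, 0], x2: B[0, 1], x3: B[1, 0], x4: B[1, 1]}
print(sp.expand(lhs.subs(subs) - rhs.subs(subs)))  # 0: identity (02)
```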