
I'm trying to solve this problem.

Let $A$ and $B$ be square $n \times n$ matrices. Suppose that for three consecutive integers $k$ we have $(AB)^k = A^k B^k$. Show that $AB = BA$.

My thinking: suppose there exist three consecutive integers $k$, $k+1$, $k+2$ such that $(AB)^k=A^kB^k$, $(AB)^{k+1}=A^{k+1}B^{k+1}$ and $(AB)^{k+2}=A^{k+2}B^{k+2}$. By adding these equalities and subtracting common factors I can get

$(AB)^k(I+AB+(AB)^2)=A^k(I+AB+(AB)^2)B^k$, which I'm trying to interpret. I also tried some reverse engineering by aiming for $(AB)^k=(BA)^k$, but that didn't get me very far.
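
To spell out the one rearrangement I'm sure of (just a sketch, using nothing beyond the hypotheses except invertibility where indicated): for any $m\ge 1$,

$$(AB)^m = \underbrace{(AB)(AB)\cdots(AB)}_{m\text{ factors}} = A(BA)^{m-1}B,$$

so taking $m=k+1$ and using $(AB)^{k+1}=A^{k+1}B^{k+1}$ gives

$$A\bigl[(BA)^k - A^kB^k\bigr]B = 0.$$

If $A$ and $B$ were additionally assumed invertible, this would force $(BA)^k = A^kB^k = (AB)^k$, which is exactly the $(AB)^k=(BA)^k$ identity I was reverse engineering toward.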

Any hints on how to tackle this problem?

Thanks!

  • There's a version of this question for arbitrary groups, that's a notorious exercise in Herstein's Algebra text. See http://math.stackexchange.com/questions/40996/prove-that-if-abi-aibi-forall-a-b-in-g-for-three-consecutive-integers – Gerry Myerson Nov 27 '16 at 05:01
  • @GerryMyerson Notably, the version for groups assumes that the elements have inverses, which we cannot necessarily assume in this context. – Ben Grossmann Nov 27 '16 at 05:55
  • Note: We need some assumptions on $A$ and $B$ here. For instance, $$ A = \pmatrix{1&0\\0&0}, \quad B = \pmatrix{0&1\\0&0} $$ satisfy the equation for $k=2,3,4,\dots$ but not $k=1$ (a quick numerical check of this example is sketched after the thread). – Ben Grossmann Nov 27 '16 at 05:56
  • @Omn, right. There's a theorem of Ligh and Richoux to the effect that if $R$ is a ring with unity satisfying the identities $(xy)^n=x^ny^n$ for three consecutive positive integers $n$, then $R$ is commutative. I believe the reference is: A commutativity theorem for rings, Bull. Austral. Math. Soc. 16 (1977) 75-77. It may be available at https://www.cambridge.org/core/services/aop-cambridge-core/content/view/S0004972700023029 – Gerry Myerson Nov 27 '16 at 06:30
  • @GerryMyerson What about the example I gave above? (In that comment I meant to write: but $A$ and $B$ don't commute.) – Ben Grossmann Nov 27 '16 at 06:43
  • @Omn, it's a matter of interpretation. If there exist three consecutive integers such that for all matrices $(AB)^k=A^kB^k$, then the theorem of Ligh and Richoux applies. But if all that's being asserted is that there exist matrices such that etc., etc., then your example shows that those particular matrices need not commute. Are you reading all this, César? – Gerry Myerson Nov 27 '16 at 06:51
  • @GerryMyerson It seems he hasn't been around for 2 hours. Anyway, my interpretation is that $A$ and $B$ are supposed to be fixed matrices. This is fine if we assume that the matrices are invertible, and I'm pretty sure that's what's missing from the question. Interesting idea, though. – Ben Grossmann Nov 27 '16 at 07:03
  • There is only one example, César, the one given by @Omn, a few comments up. – Gerry Myerson Nov 27 '16 at 21:35
  • @GerryMyerson $(AB)^n=A^nB^n\Rightarrow (AB)^n-A^nB^n=0\Rightarrow A(BA)^{n-1}B-A(A)^{n-1}(B)^{n-1}B=0\Rightarrow [A[(BA)^{n-1}-(AB)^{n-1}]]B=0.$ – César Rosendo Nov 28 '16 at 21:59
  • @GerryMyerson Because $A\neq B$ and $A,B\neq 0$, does this mean that $(BA)^{n-1}=(AB)^{n-1}$, and thus that $BA=AB$? What do you think? – César Rosendo Nov 28 '16 at 22:02
  • $AXB=0$ doesn't imply $X=0$, even with the added assumptions $A\ne B$, $A\ne0$, $B\ne0$. But why are you trying to prove something that @Omn has already proved to be false? – Gerry Myerson Nov 28 '16 at 22:12
  • @GerryMyerson Sorry, I hadn't checked the counterexample. Thanks. Probably, as Omn said, more conditions on $A$ and $B$ are needed for that statement to hold true. – César Rosendo Nov 28 '16 at 22:28
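
For concreteness, a minimal numerical check of Omn's counterexample above and of the remark that $AXB=0$ does not force $X=0$. This is only an illustrative sketch (it assumes NumPy is available) and is not part of the original thread:

    import numpy as np

    # Omn's counterexample: A and B do not commute, yet (AB)^k = A^k B^k for every k >= 2.
    A = np.array([[1, 0],
                  [0, 0]])
    B = np.array([[0, 1],
                  [0, 0]])

    print("AB == BA ?", np.array_equal(A @ B, B @ A))  # False: AB = [[0,1],[0,0]], BA = 0

    for k in range(2, 6):
        lhs = np.linalg.matrix_power(A @ B, k)                             # (AB)^k
        rhs = np.linalg.matrix_power(A, k) @ np.linalg.matrix_power(B, k)  # A^k B^k
        print(k, np.array_equal(lhs, rhs))                                 # True: both sides are zero

    # Gerry Myerson's remark: AXB = 0 does not imply X = 0,
    # even with A != B, A != 0, B != 0.
    X = np.array([[0, 0],
                  [1, 0]])
    print("AXB =\n", A @ X @ B)  # the zero matrix, although X != 0

The check simply reflects that $(AB)^2=0$ while $A^2=A$ and $B^2=0$, so both sides vanish for every $k\ge 2$.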

0 Answers