
I am wondering whether every algebraic identity involving only addition and multiplication that holds for the real numbers also holds for elements of an arbitrary ring, provided the elements involved commute under multiplication.

For example, it is well known that $(a+b)^2=a^2+2ab+b^2$ for all $a,b\in\mathbb{R}$. This holds because the terms $ab$ and $ba$ can be combined, thanks to the commutativity of multiplication in the real numbers.

Now, for the $n\times n$ identity matrix $I$ and an arbitrary matrix $A\in GL_n(\mathbb{R})$, multiplication between them is commutative, and therefore the identity $(A+I)^2=A^2+2AI+I^2$ also holds.
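
As a quick numerical sanity check (not a proof), one can test this identity on a random matrix. The following is a minimal sketch using NumPy; the size $n=4$ and the random seed are chosen arbitrarily for illustration.

```python
import numpy as np

# Sanity check (not a proof): A commutes with the identity I, so the
# binomial expansion (A + I)^2 = A^2 + 2AI + I^2 should hold exactly.
n = 4                                  # arbitrary size, for illustration only
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))        # a generic real n x n matrix
I = np.eye(n)

lhs = (A + I) @ (A + I)
rhs = A @ A + 2 * (A @ I) + I @ I

print(np.allclose(lhs, rhs))           # True (up to floating-point rounding)
```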

I am curious whether every algebraic identity can be generalised in this way. I have no way to give a proof, but I also cannot find a counterexample.

  • Multiplication of matrices is not commutative. – Tim Jul 28 '23 at 09:36
  • @Tim But the case where one factor is the identity matrix is commutative. I’m thinking about the commutative cases. – Jason Lee Jul 28 '23 at 09:38
  • @Dietrich Burde That case works because $2ab=0$ there, but $(a+b)^2=a^2+2ab+b^2$ still holds. – Jason Lee Jul 28 '23 at 09:41
  • Yes, you can prove them all mechanically. Such identities will be polynomials with integer coefficients (natural coefficients initially, but integer once we subtract the right side from both sides). Next, distribute all multiplications and fix a monomial order (and an order on the variables in particular). Commutativity allows the variables in every monomial to be sorted. Sum similar monomials. If it is an identity, all coefficients are zero. The same computation over the integers works in every commutative ring. – NDB Jul 28 '23 at 09:50
  • @JasonLee Your question title and body are misleading: you say arbitrary ring. – Tim Jul 28 '23 at 09:52
  • @Tim Sorry about that. I have edited it for clarity. – Jason Lee Jul 28 '23 at 09:56
  • You should delete the matrix part. If you only allow one value $a=I$, it is no longer a binomial identity in $a$ and $b$. – Dietrich Burde Jul 28 '23 at 09:57
  • @DietrichBurde I limited the case to commutative multiplication only. – Jason Lee Jul 28 '23 at 09:59
  • @NDB Thanks for your idea. But the point is: how do I prove that “all coefficients are zero”? It is easy for a specific case, but I have no way to prove it for the general case of arbitrary algebraic identities. – Jason Lee Jul 28 '23 at 10:03
  • Yes, this is what I mean. Matrix multiplication is not commutative, so if you limit yourself to the commutative case, you should not write about matrices. – Dietrich Burde Jul 28 '23 at 10:03
  • @DietrichBurde But I have specified commutative multiplication. I know that not all pairs of matrices commute, but commuting pairs do exist. That’s why I used it as an example. Sorry that I couldn’t give a better one. – Jason Lee Jul 28 '23 at 10:06
  • If you "specify" a formula, it is no longer an identity. To specify $(A+B)^2$ to $B=I$ is "cheating". A binomial identity for matrix rings simply doesn't hold true. – Dietrich Burde Jul 28 '23 at 10:07
  • Maybe I have expressed it poorly. What I want is merely a statement like: $(A+B)^2=A^2+2AB+B^2$ holds true if the multiplication between $A$ and $B$ is commutative. Would you mind telling me how I can edit the terminology to make this clear? – Jason Lee Jul 28 '23 at 10:12
  • On a side note, you might be interested to learn about polynomial identity algebras – RougeSegwayUser Jul 28 '23 at 10:40
  • You are assuming that the identity is an identity. You must define what you mean by that, but the reasonable meanings are: (1) All coefficients are zero, when the polynomials are put in canonical form. (2) For all rational, or integer values of the variables the evaluation of the identity is $0=0$. In the first case (1) there is nothing else to prove. From (2) you can also prove (1) in many ways. You could substitute sufficient values to get a huge system of equations on the coefficients, which will have only solution $0$. – NDB Jul 28 '23 at 10:53
  • Or note that polynomials are analytic and that they are identically zero on all rationals. Again you get all coefficients zero. – NDB Jul 28 '23 at 10:57
  • This is just the universal property of polynomial rings, e.g. see here. – Bill Dubuque Jul 28 '23 at 18:09

1 Answer


Here is how I understand your question: let $S(x,y)$ and $T(x,y)$ be two terms in the language of rings with free variables $x$ and $y$, and let $P$ and $Q$ be the closed formulas $$ P: \forall x,\forall y, S(x,y)=T(x,y)$$ and $$ Q: \forall x,\forall y, (xy=yx) \implies S(x,y)=T(x,y).$$

Then your question is: if $P$ is satisfied in $\mathbb{R}$, is $Q$ satisfied in all rings?

And the answer is then yes. More precisely, to the term $S$ (resp. $T$) we can associate a unique polynomial $A(x,y)\in \mathbb{Z}[x,y]$ (resp. $B(x,y)\in \mathbb{Z}[x,y]$) such that in any ring $R$, for any commuting $a,b\in R$, the evaluation of the term $S(a,b)$ (resp. $T(a,b)$) is equal to $A(a,b)$ (resp. $B(a,b)$). Then formula $P$ holds in $\mathbb{R}$ if and only if $A=B$ as polynomials (because $\mathbb{R}$ is an infinite field), and in that case formula $Q$ holds in all rings.
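
One way to see the normalisation concretely is to let a computer algebra system put both terms into canonical form; the following is a minimal sketch using SymPy for the binomial example from the question.

```python
from sympy import symbols, expand

# Canonical form in Z[x, y]: with commuting symbols, expand() collects
# like monomials, so S = T as an identity in every commutative ring
# exactly when the difference expands to 0.
x, y = symbols('x y')                        # commuting indeterminates
S = (x + y)**2                               # term S(x, y)
T = x**2 + 2*x*y + y**2                      # term T(x, y)
print(expand(S - T))                         # 0

# With non-commuting symbols the cancellation fails, which is why the
# hypothesis xy = yx is needed in formula Q.
X, Y = symbols('X Y', commutative=False)
print(expand((X + Y)**2 - (X**2 + 2*X*Y + Y**2)))   # -X*Y + Y*X, nonzero
```

This mirrors the statement above: equality of the associated polynomials over $\mathbb{Z}$ is exactly what transfers the identity to commuting elements of an arbitrary ring.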

Captain Lama
  • That’s exactly what I meant. Thank you so much sir. And it can easily be generalised to any number of free variables by the same approach, right? – Jason Lee Jul 28 '23 at 10:57