
I felt that my basics in elementary number theory were getting weak, so I started studying it again. While reading the introduction of one book, I noticed that the author presents the associative, distributive and commutative properties at the very start of the chapter. I picked up another book and found the same thing in its first chapter.

I assumed they must be of great use, but looking them over, my reaction was: isn't this obvious?

Commutative property:

$a+b=b+a$ and $a \cdot b = b \cdot a$

Associative property:

$(a+b)+c=a+(b+c)$ and $(a \cdot b) \cdot c = a \cdot (b \cdot c)$

Distributive properties:

$a \cdot (b+c) = a \cdot b + a \cdot c$

The only exception I know of is the vector (cross) product, where $a\times b \ne b\times a$.

Now, I have two questions in my mind.

$1$. What is the significance of the associative, distributive and commutative properties in mathematics? (Please forgive me if I am asking a stupid question and there are in fact many uses.)

$2$. How can we prove these properties? I am asking this because if someone can ask for a proof of $1+1=2$, then why not a proof of $a+b=b+a$ or $a \cdot (b+c) = a \cdot b + a \cdot c$?

Thanks.

I know this kind of question is often targeted and heavily downvoted, so please leave a comment after downvoting so that I can fix the problems in my post.

  • I think you'd have to refer back to basic axioms and the definition of addition/multiplication. – Simply Beautiful Art Dec 26 '16 at 17:34
  • There are books that answer question 2, e.g., Edmund Landau's "Foundations of Analysis." He starts with (a version of) Peano's axioms and derives the properties of the complex numbers. There are likely more modern books that do the same thing. – B. Goddard Dec 26 '16 at 17:43
  • If by "uses", you mean deviation from the properties among different groups (I figured this out from "The only use I know is that $ a \times b \ne b \times a $ "), there are a lot of other "uses" too. From the top of my head (and very popular): Matrices. $A \times B \ne B \times A$. – Shraddheya Shendre Dec 26 '16 at 17:45
  • I also think this question deserves a group-theory tag. It seems to be going in that direction. – Shraddheya Shendre Dec 26 '16 at 17:46
  • Okay @ShraddheyaShendre, setting aside cross products, what else can we say? – Vidyanshu Mishra Dec 26 '16 at 17:47
  • @ShraddheyaShendre, I don't know anything about group theory, so add the tag if you think it is suitable. – Vidyanshu Mishra Dec 26 '16 at 17:47
  • Rotations are not commutative, the vector product is not associative, and $(n,a) \mapsto a^n$ is not distributive with respect to addition, to list a few examples. Those properties are not true in general. Being good properties which often appear in structures, they deserve special names. – Aloizio Macedo Dec 26 '16 at 17:48
  • If you want to prove those properties for foundational structures such as $\mathbb{Z}$ or $\mathbb{N}$, you will probably come to some philosophical inquiries if you are new to mathematics. Maybe it would be more worthwhile to see those properties (or lack thereof) being proved to other algebraic structures, such as matrices etc. – Aloizio Macedo Dec 26 '16 at 17:50
  • @AloizioMacedo, so this time the Peano axioms do not suffice? – Vidyanshu Mishra Dec 26 '16 at 17:52
  • @THELONEWOLF. PA proves commutativity and associativity for $+$ and $\times$. It says nothing a priori about other operations. – Noah Schweber Dec 26 '16 at 18:06
  • @NoahSchweber, so what is the solution? Should I convince myself that what is needed to prove it is beyond my current knowledge? – Vidyanshu Mishra Dec 26 '16 at 18:08
  • @THELONEWOLF. Prove what? You can't prove that every operation is commutative/associative, because they aren't. What specific operation(s) are you interested in? – Noah Schweber Dec 26 '16 at 18:10
  • @NoahSchweber, the operations on the integers. – Vidyanshu Mishra Dec 26 '16 at 18:11
  • @THELONEWOLF. Which operation on the integers? Exponentiation is neither commutative nor associative. I think you're trying to prove something that isn't true. – Noah Schweber Dec 26 '16 at 18:12
  • @NoahSchweber, addition and multiplication suffice – Vidyanshu Mishra Dec 26 '16 at 19:20
  • @THELONEWOLF. Alright, what axioms do you want to use? Different axioms will require different proofs. For example, in Peano arithmetic the proofs are by induction, and are short; in ZFC, you need to first build the natural numbers (via the finite ordinals), and so this adds some complexity to the task. Incidentally, this might be useful to you. – Noah Schweber Dec 26 '16 at 19:26
  • I don't know what problem people have. I clearly asked in the question for downvoters to mention the reason for the downvote, but I suppose these people do not read the whole question. – Vidyanshu Mishra Dec 26 '16 at 19:36

4 Answers


It has since been clarified in the comments that the OP is specifically interested in proving the relevant properties for addition and multiplication.

Let me show how to prove that addition is commutative; hopefully you'll get the idea of how to do the other ones.

First, we need to specify our axioms. Proofs don't take place in a vacuum, and specifying one's axioms is especially important when (a) one is proving something somewhat arcane (e.g. that might depend on the axiom of choice), or (b) when one is proving something so simple that it is generally taken for granted (as here). The axioms I'll be using are those of (first-order) Peano arithmetic.

Our proof will consist of three proofs by induction: the first on its own, and the third inside the second. This is the price of doing everything from scratch!

First, we establish that $0+x=x$ for all $x$. (That $x+0=x$ for all $x$ is one of the Peano axioms.) To do this, let $\psi(x)$ be the formula "$0+x=x$". We have that $\psi(0)$ holds, that is, $0+0=0$, since $x+0=x$ for all $x$ (including $x=0$). Now suppose $\psi(n)$ holds; we want to calculate $0+S(n)$. Well, one of our axioms is $$a+S(b)=S(a+b),$$ so applying that here gives $$0+S(n)=S(0+n);$$ but our induction hypothesis means that $S(0+n)=S(n)$. So we've shown that $\psi(n)\implies \psi(S(n))$; by the induction scheme, then, this gives $\forall n\psi(n)$.

Now we'll use that to prove the commutativity of addition. Let $\varphi(x)$ be the formula "For all $y$, $x+y=y+x$". By the above result, for all $y$ we have $y+0=0+y=y$, so $\varphi(0)$ holds. Now suppose $\varphi(n)$ holds; we want to compare $S(n)+y$ and $y+S(n)$ for arbitrary $y$.

Well, a previously-mentioned axiom allows us to compute the latter: $y+S(n)=S(y+n)$, and by our induction hypothesis this is just $S(n+y)$. So now we just need to compute the former. This will need one last proof by induction, this time with $n$ as a parameter:

Let $\chi(x)$ be the formula "$S(n)+x=S(n+x)$". Clearly we have $\chi(0)$, since $S(n)+0=S(n)=S(n+0)$ (since $a+0=a$ is an axiom). Now suppose $\chi(m)$ holds, and let's look at $S(n)+S(m)$. By the previously-mentioned axiom relating successor and addition, this is $S(S(n)+m)$; by the induction hypothesis, this is just $S(S(n+m))$. But $S(n+m)=n+S(m)$, so this is just $S(n+S(m))$. So $\chi(S(m))$ holds. By the induction scheme, this shows $\forall x\chi(x)$.

In particular, $S(n)+y=S(n+y)$, and we already showed that $y+S(n)=S(n+y)$; so we've shown that $\varphi(S(n))$ holds. So by the induction scheme, this shows $\forall x\varphi(x)$; that is, addition is commutative!

Hopefully you can see how similar arguments can be used to prove the other basic properties about addition and multiplication.
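
If you want to experiment alongside the proof, here is a minimal Python sketch (my addition, not part of the answer) of Peano-style numerals in which addition is defined by exactly the two axioms used above, $a+0=a$ and $a+S(b)=S(a+b)$. Spot-checking small cases is of course no substitute for the induction arguments, but it lets you watch the definitions at work; the names `Nat`, `ZERO`, and `succ` are just my own choices.

```python
# Peano-style numerals: a numeral is either ZERO or succ(n) for some numeral n.
class Nat:
    def __init__(self, pred=None):
        self.pred = pred              # None marks the numeral zero

    def is_zero(self):
        return self.pred is None

    def __eq__(self, other):
        if self.is_zero() or other.is_zero():
            return self.is_zero() and other.is_zero()
        return self.pred == other.pred

ZERO = Nat()

def succ(n):
    return Nat(n)

# Addition, defined only by the two axioms used in the proof:
#   a + 0    = a
#   a + S(b) = S(a + b)
def add(a, b):
    if b.is_zero():
        return a
    return succ(add(a, b.pred))

def from_int(k):
    return ZERO if k == 0 else succ(from_int(k - 1))

# Spot-check 0 + x = x and x + y = y + x on small numerals.
for i in range(6):
    assert add(ZERO, from_int(i)) == from_int(i)
    for j in range(6):
        assert add(from_int(i), from_int(j)) == add(from_int(j), from_int(i))
print("0 + x = x and x + y = y + x hold on all checked numerals")
```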


Associativity, commutativity, distributivity, and the like are not properties of general functions, or even of functions on the integers (or even "nice" functions!). Rather, they are properties a function may or may not have. Although many functions do have these properties, the general picture is more complicated.

I think considering some natural examples of non-commutative and non-associative operations will be helpful:

  • The most basic example is exponentiation! This is neither commutative nor associative, and does not distribute nicely over multiplication or addition:

    • $2^3=8\not=9=3^2$.

    • $(2^2)^3=4^3=64\not=256=2^8=2^{2^3}$.

    • $2^{3\times 3}=2^9=512\not=64=8\times 8=2^3\times 2^3$.

    • Etc. (Note that exponentiation does satisfy some laws that look vaguely like distributivity: namely right distributivity $(xy)^z=x^zy^z$, as well as the laws $x^{y+z}=x^yx^z$ and $(x^y)^z=x^{yz}$. And these laws, of course, are extremely useful.)

  • Composition of linear transformations (or matrix multiplication) is not commutative. A linear transformation $T$ is a map from a vector space to itself (actually, linear maps can go from one vector space to another, but then "swapping the order of composition" may not even make sense: given $f: U\rightarrow V$ and $g: V\rightarrow W$, what should $f\circ g$ be?), with certain nice properties. Think rotation, reflection, dilation, etc. (Actually, affine transformations will more closely match your intuition - linear transformations have to fix the origin - but whatever.) It's a good exercise to come up with an example of this; a small numerical sketch appears after this list.

    • In general, if I have two functions $f, g$ from a set to itself, $f\circ g$ need not equal $g\circ f$.
  • Multiplication of quaternions is not commutative.

    • And multiplication of octonions isn't even associative! But octonions are less useful than quaternions.
  • Although Boolean algebras satisfy a distributive law, general lattices need not; and there are lots of important non-distributive lattices (e.g. the lattice of normal subgroups of a group need not be distributive, although it will always satisfy the weaker modularity property).
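
To make the non-commutativity of composition concrete, here is a small Python check (my own numbers, not part of the answer) with two $2\times 2$ matrices, a rotation and a reflection of the plane:

```python
# Two 2x2 matrices (as nested lists); matmul is ordinary matrix multiplication.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[0, -1],
     [1,  0]]   # rotation by 90 degrees
B = [[1,  0],
     [0, -1]]   # reflection across the x-axis

print(matmul(A, B))   # [[0, 1], [1, 0]]
print(matmul(B, A))   # [[0, -1], [-1, 0]]
# The two products differ, so A*B != B*A: composing these linear
# transformations in different orders gives different maps.
```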


You also ask how you prove that something is commutative or associative. Well, you prove that (assuming it's true!) the same way you prove anything else: using the definitions, together with axioms and things you've already proved.

For example, consider the binary operation $*$ on $\mathbb{R}^2$ defined as $$(a, b)*(c, d)=(ac-bd, bc+ad)$$ (you might recognize this as complex multiplication in disguise). Then we can show that $*$ is commutative as follows: $$(a, b)*(c, d)=(ac-bd, bc+ad)=(ca-db, cb+da)=(c, d)*(a, b).$$ The first and third equalities are just the definition of $*$, and the middle equality uses the commutativity of standard multiplication of real numbers (which you either take for granted, or prove first).
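
As a purely numerical sanity check (my own addition; the algebraic argument above is what actually proves the claim for all real inputs), you can also test the definition of $*$ on random integer points in Python:

```python
import random

# The operation from the answer: (a, b) * (c, d) = (ac - bd, bc + ad),
# i.e. complex multiplication written in coordinates.
def star(p, q):
    a, b = p
    c, d = q
    return (a * c - b * d, b * c + a * d)

random.seed(0)
for _ in range(1000):
    p = (random.randint(-10, 10), random.randint(-10, 10))
    q = (random.randint(-10, 10), random.randint(-10, 10))
    assert star(p, q) == star(q, p)   # commutativity on these samples
print("star(p, q) == star(q, p) on all sampled pairs")
```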

Noah Schweber

Let me try to give as clear an answer as I can.

The fact that multiplication and addition (of real numbers) are both associative and commutative is extremely important. There are many algebraic structures with operations that are not commutative or not associative; those structures have different properties than the set of real numbers. Indeed, it would be almost impossible to prove any properties of the set of real numbers without using associativity and commutativity. One way of characterizing the set of real numbers is that it is a complete ordered field (and, up to isomorphism, there is only one complete ordered field). There's a lot of technical detail packed inside the phrase "complete ordered field," but part of it is "field," which means an algebraic structure with two operations, both commutative and associative (and satisfying several other properties, like distributivity, the existence of inverses, etc.).

You mentioned cross product as an example of an operation that is not commutative; you could also add matrix multiplication (not commutative), multiplication of quaternions (not commutative), and multiplication of octonions (not commutative or associative). These structures have very different properties from the set of real numbers; the theorems that one can prove in them are different theorems, because they differ at a fundamental level.

To your second question: these properties can be proven, but to do so you have to dig down to a more fundamental level. Any mathematical theory requires some postulates; without them there is nothing to build a proof out of. One approach to building up the reals from scratch starts with the Peano axioms for the natural numbers, defines addition and multiplication recursively, and then proves that the operations so defined are associative and commutative. Then one defines the integers (as equivalence classes of ordered pairs of natural numbers), shows that the naturals can be identified with a subset of the integers, defines addition and multiplication of integers (which requires a lot of detail, because you have to show that the operations are well-defined and extend the definitions for natural numbers), and proves that in this new setting they are still associative and commutative. Then one bootstraps up to the rationals using a similar process. Finally one defines real numbers as Dedekind cuts, defines what addition and multiplication of Dedekind cuts mean, and proves that addition and multiplication are still associative and commutative.
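
To make one step of that construction concrete, here is a small Python sketch (my own illustration, with made-up helper names) of the integers as pairs $(a, b)$ of naturals read as $a - b$: two pairs are equivalent when $a + d = c + b$, addition is componentwise, and commutativity of the new addition is inherited from commutativity of addition on the naturals.

```python
# An "integer" is a pair (a, b) of naturals, read as a - b.
# (a, b) ~ (c, d)  iff  a + d == c + b   (same difference)
def equiv(x, y):
    a, b = x
    c, d = y
    return a + d == c + b

# Addition of pairs is componentwise; showing that it is well defined
# on equivalence classes is part of the "lots of detail" mentioned above.
def add_pairs(x, y):
    a, b = x
    c, d = y
    return (a + c, b + d)

# Commutativity of the new addition follows from commutativity on the
# naturals; here we just spot-check it on small pairs.
pairs = [(a, b) for a in range(5) for b in range(5)]
for x in pairs:
    for y in pairs:
        assert equiv(add_pairs(x, y), add_pairs(y, x))
print("addition of integer-pairs is commutative on all checked pairs")
```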

It's a long haul. If you prefer a shorter journey, you can simply start by postulating that the set of rational numbers is a field, and work your way up from there. Or you can start by simply postulating the existence of a complete ordered field, which bakes the properties into your axioms.

mweiss

Your two questions require two independent answers.

Significance of associative, distributive and commutative properties in mathematics. These elementary properties of the natural numbers led mathematicians to define new algebraic structures (notably rings and semirings) of considerable influence in algebra and in mathematics in general.

To give you an elementary (but useful!) example, consider the set $B = \{0, 1\}$ equipped with the following operations: $0+0= 0$, $0 + 1 = 1 + 0 = 1 + 1 = 1$, $1 * 1 = 1,$ $1 * 0 = 0 *1 = 0 * 0 = 0.$ You can verify by hand that both $+$ and $*$ are associative and commutative and that $*$ distributes over $+$. This set governs the basics of logic: just think of $0$ as "false", $1$ as "true", $+$ as "or" and $*$ as "and".
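
Since $B$ has only two elements, "verify by hand" can even be delegated to an exhaustive search; here is a short Python sketch of that check (mine, not the answerer's):

```python
# The two-element structure from the answer: + is "or", * is "and".
B = (0, 1)
plus = lambda x, y: x | y    # 0+0=0, otherwise 1
times = lambda x, y: x & y   # 1*1=1, otherwise 0

for a in B:
    for b in B:
        assert plus(a, b) == plus(b, a)      # + is commutative
        assert times(a, b) == times(b, a)    # * is commutative
        for c in B:
            assert plus(plus(a, b), c) == plus(a, plus(b, c))      # + associative
            assert times(times(a, b), c) == times(a, times(b, c))  # * associative
            assert times(a, plus(b, c)) == plus(times(a, b), times(a, c))  # * over +
print("all cases check out")
```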

How can we prove these properties?

It depends where you start from. Sometimes, just as in my example, you only have to verify the properties by hand. In other cases, for instance if you take polynomials or matrices with integer coefficients, you need some form of mathematical argument to prove your properties. To come back to your example, before proving that $a \cdot (b + c) = a \cdot b + a \cdot c$, you need to think about how you define the sum and the product. Logicians have developed sophisticated methods to do so, but it is not an elementary task. If you want to get a flavour of the kind of techniques you need for that, look at how addition is defined in the Wikipedia entry on recursive functions.
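
For a taste of what those recursive definitions look like, here is a rough Python transcription (my own sketch, not the linked article's notation) of the primitive-recursive clauses for addition and multiplication, together with a spot-check of distributivity; an actual proof would proceed by induction rather than by testing cases.

```python
# Recursive definitions on the naturals, in the spirit of the
# primitive-recursive clauses:
#   add(a, 0) = a          add(a, b + 1) = S(add(a, b))
#   mul(a, 0) = 0          mul(a, b + 1) = add(mul(a, b), a)
def add(a, b):
    return a if b == 0 else add(a, b - 1) + 1   # "+ 1" plays the role of the successor

def mul(a, b):
    return 0 if b == 0 else add(mul(a, b - 1), a)

# A real proof of a*(b+c) = a*b + a*c goes by induction on c;
# here we only spot-check small triples.
for a in range(6):
    for b in range(6):
        for c in range(6):
            assert mul(a, add(b, c)) == add(mul(a, b), mul(a, c))
print("a*(b+c) == a*b + a*c on all checked triples")
```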

J.-E. Pin

As others have noted, these are properties that some functions have and others don't. For example: addition and multiplication of real numbers are commutative and associative, while subtraction, division, exponentiation, and root extraction are not. While these facts can be proven from more basic assumptions (with some extensive labor), most basic algebra texts find it useful to assume them as a starting point, and so take them axiomatically.

They are useful because they establish many common writing rules that allow us to simplify and standardize how expressions are written (and know that what we have written is, in fact, equivalent). A small subset of examples in real numbers:

  • We can use commutativity of addition to write polynomials in standard form with decreasing exponents: $3x + 5x^2 = 5x^2 + 3x$.
  • We can use commutativity of multiplication to rewrite factors in convenient alphabetical order: $ba = ab$.
  • Distribution allows us to remove parentheses, effectively getting rid of one phase in the order of operations: $3(a + b) = 3a + 3b$.
  • Distribution and commutativity are also used to establish how we combine like terms: $3x + 5x = 8x$ because $8x = (3+5)x = x(3+5) = 3x + 5x$.
  • More generally, combining many like terms on sight is so easy because it is justified by a combination of these properties: $3x + 5y + 6x = (3x + 5y) + 6x = 3x + (5y + 6x) = 3x + (6x + 5y) = (3x + 6x) + 5y = 9x + 5y$.
  • Likewise: $(a+b)(a-b) = a^2 - b^2$ because: $(a+b)(a-b) = (a+b)a + (a+b)(-b) = a(a+b) + (-b)(a+b) = a^2 +ab - ba - b^2 = a^2 + ab - ab - b^2 = a^2 + 1ab - 1ab - b^2 = a^2 + (1-1)ab - b^2 = a^2 + 0ab - b^2 = a^2 + 0 - b^2 = a^2 - b^2$. (A quick symbolic check of these last two examples follows this list.)
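
If you want a machine to confirm the bookkeeping in the last two examples (a side note of mine, assuming the SymPy library is available), symbolic expansion performs exactly these distributive and commutative rewrites:

```python
from sympy import symbols, expand, simplify

a, b, x, y = symbols('a b x y')

# Combining like terms and the difference-of-squares identity from the list:
assert simplify(3*x + 5*y + 6*x - (9*x + 5*y)) == 0
assert expand((a + b) * (a - b)) == a**2 - b**2
print("both identities check out symbolically")
```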

For many of us, these operations have been drilled in during our school years so that they seem visually "obvious" (which is good and convenient), but when you dig into the specific details they're all justified by specific combinations of the commutative, associative, and distributive properties on real numbers. So it's a fairly convenient starting point for a basic algebra text to boil it down to just these three starting assumptions and be able to build everything else from that foundation.