
To take a very simple example: consider the sum 1 + 2 + 3 + 4. You can compute it one step at a time: 1 + 2 = 3, then 3 + 3 = 6, then 6 + 4 = 10. It does not matter how long you take to move from one step to the next; the answer is always the same. The computation always consists of 3 operations, each involving no more than two operands.
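
To make the "one step at a time" reading concrete, here is a small illustrative Python sketch (not part of the original question) that carries out the sum as a chain of applications of a single binary operation:

```python
# Illustrative sketch: the sum 1 + 2 + 3 + 4 computed as a chain of binary
# additions -- exactly three applications of a two-operand operation.
from functools import reduce

def add(a, b):            # one binary operation: two operands in, one result out
    return a + b

total = reduce(add, [1, 2, 3, 4])   # ((1 + 2) + 3) + 4
print(total)                        # 10
```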

Are there any operations in mathematics that must be applied at the same time to more than two operands? Or is mathematics intrinsically a linear sequence of (unary and) binary operations?

1 Answer


A classical result of Sierpinski is that every $n$-ary operation on a set is a finite composition of binary operations on the set; see W. Sierpinski, Sur les fonctions de plusieurs variables, Fund. Math. $\,33\, (1945),\, 169\!-\!173.$ See this answer for further references.

The proof is quite simple for operations on a finite set $\rm\,A.\,$ Namely, if $\rm\,|A| = k\,$ then we may encode $\rm\,A\,$ by $\rm\,\mathbb Z/k,\,$ the ring of integers $\rm\,mod\ k,\,$ allowing us to employ Lagrange interpolation to represent any finitary operation as a finite composition of the binary operations $\rm\, +,\ *,\,$ and $\rm\, \delta(a,b) = 1\,\ if\,\ a=b\,\ else\,\ 0,\ $ namely

$$\rm f(x_1,\ldots,x_n)\ = \sum_{(a_1,\ldots,a_n)\ \in\ A^n}\ f(a_1,\ldots,a_n)\ \prod_{i\ =\ 1}^n\ \delta(x_i,a_i) $$
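
As a concrete illustration of the formula (an illustrative sketch, not part of the original answer; the choice $\rm\,k = 4\,$ and the sample operation are arbitrary), the following Python snippet reconstructs a ternary operation on $\rm\,\mathbb Z/4\,$ using only the binary operations $\rm\, +,\ *,\ \delta$:

```python
# Illustrative sketch: rebuilding an arbitrary ternary operation on Z/k from
# the binary operations +, * and delta alone, following the displayed formula.
# The choice k = 4 and the sample operation f are arbitrary assumptions.
from itertools import product

k = 4                                        # |A| = k, encoded as Z/k

def add(a, b):   return (a + b) % k          # binary +
def mul(a, b):   return (a * b) % k          # binary *
def delta(a, b): return 1 if a == b else 0   # binary equality indicator

def f(x, y, z):                              # an arbitrary ternary operation on Z/k
    return (x * y + 2 * z + 1) % k

def f_via_binary_ops(x, y, z):
    """Evaluate f using only add, mul, delta, as in the displayed formula."""
    total = 0
    for a in product(range(k), repeat=3):
        term = f(*a)                         # the tabulated value f(a_1, a_2, a_3)
        for x_i, a_i in zip((x, y, z), a):
            term = mul(term, delta(x_i, a_i))
        total = add(total, term)
    return total

# The composition of binary operations reproduces f at every input.
assert all(f(*v) == f_via_binary_ops(*v) for v in product(range(k), repeat=3))
```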

When $\rm\,|A|\,$ is infinite one may instead proceed by employing pairing functions $\rm\,A^2\to A.$
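
To make the last remark concrete, here is an illustrative sketch (the Cantor pairing function and the sample operation are my own choices, not taken from the answer) showing how a pairing bijection $\rm\,\mathbb N^2\to \mathbb N\,$ lets a ternary operation factor through binary ones:

```python
# Illustrative sketch of the infinite case: with a pairing bijection
# p : N^2 -> N, a ternary operation g on N can be written as a composition
# of binary operations, e.g. g(x, y, z) = h(p(x, y), z).
import math

def pair(x, y):
    """Cantor pairing function, a bijection N^2 -> N."""
    return (x + y) * (x + y + 1) // 2 + y

def unpair(z):
    """Inverse of the Cantor pairing function."""
    w = (math.isqrt(8 * z + 1) - 1) // 2
    y = z - w * (w + 1) // 2
    return w - y, y

def g(x, y, z):                    # an arbitrary ternary operation on N
    return x * y + z

def h(p_xy, z):                    # a *binary* operation absorbing the pairing
    x, y = unpair(p_xy)
    return g(x, y, z)

# g agrees with the binary composition h(pair(x, y), z) on a sample range.
assert all(g(x, y, z) == h(pair(x, y), z)
           for x in range(10) for y in range(10) for z in range(10))
```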

Bill Dubuque
  • Thanks for the link to the earlier question. I see that @ZhenLin asks "So one might ask: does every algebraic theory admit a presentation using only operations of arity at most 2? Unfortunately, I do not know the answer to this." If this were indeed the case, then would it be meaningful to say that mathematics exists outside any concept of time? – James Newton Apr 07 '15 at 16:00