4

We know that we can't define division by zero "in any mathematical system that obeys the axioms of a field", because it would be inconsistent with those axioms.

(1) Why can we define $a^0$ ($a\neq 0$) to be $1$? Is it possible to prove that such a definition is consistent with every rule of arithmetic? How can we conclude that, in defining $a^0$ ($a\neq 0$), we don't need to abolish any other basic rule of arithmetic?
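For concreteness, the familiar motivating calculation, assuming we want to keep the law $a^{m+n}=a^m\cdot a^n$, is
$$a^0\cdot a^n = a^{0+n} = a^n \quad\Longrightarrow\quad a^0=\frac{a^n}{a^n}=1\qquad(a\neq 0),$$
and what I'd like to understand is why adopting this as a definition cannot clash with anything else.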

(2) More generally, how can we tell whether a definition is consistent with a given mathematical theory?

Pedro

2 Answers

4

There is no general algorithm for determining when a theory is consistent. That is a huge topic, which includes Gödel's incompleteness theorems. But your specific question is easier.

In Peano arithmetic (with axioms stated using $+,\times$) an exponential function $x^y$ can be defined by recursion: $x^0=1$ and $x^{s(y)}=x\times x^{y}$. The axioms prove that functions can be defined by recursion. So if you believe (as nearly everyone does) that Peano arithmetic (with axioms stated using $+,\times$) is consistent, then you must believe the extension with that exponential function is consistent.
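As a concrete illustration only (a minimal sketch, not part of the formal argument; the name `pow_rec` is just a placeholder), the recursion transcribes directly:

```python
def pow_rec(x: int, y: int) -> int:
    """Exponentiation by primitive recursion: x^0 = 1 and x^(s(y)) = x * x^y."""
    if y == 0:                      # base case: x^0 = 1
        return 1
    return x * pow_rec(x, y - 1)    # successor case: x^(y+1) = x * x^y

# e.g. pow_rec(2, 5) == 32, pow_rec(2, 0) == 1, pow_rec(0, 0) == 1
```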

Since your question mentions basic rules of arithmetic, I answered in terms of Peano Arithmetic. If you merely want consistency with the field axioms, the question is simpler yet: the field of integers modulo $2$ shows the consistency of those axioms plus $x^1=x$ and $x^0=1$ by giving a finite model. But this includes very little of arithmetic, and notably does not include $x^{(y+z)}=x^y\times x^z$. See "finite field" on Wikipedia.
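To make the finite-model remark concrete, here is a small check (a sketch under the assumption that exponents also range over $\{0,1\}$, with exponentiation given only by $x^0=1$ and $x^1=x$); it exhibits the failure of $x^{(y+z)}=x^y\times x^z$ at $x=0$, $y=z=1$:

```python
# The two-element field GF(2): arithmetic modulo 2.
F = [0, 1]
add = lambda a, b: (a + b) % 2
mul = lambda a, b: (a * b) % 2

# Exponentiation defined only by x^0 = 1 and x^1 = x (exponents in F).
def exp2(x, y):
    return 1 if y == 0 else x

# Search for failures of the law x^(y+z) = x^y * x^z in this model.
for x in F:
    for y in F:
        for z in F:
            lhs = exp2(x, add(y, z))
            rhs = mul(exp2(x, y), exp2(x, z))
            if lhs != rhs:
                print(f"fails at x={x}, y={y}, z={z}: {lhs} != {rhs}")
# Output: fails at x=0, y=1, z=1: 1 != 0
```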

  • If there is no obvious problem with a definition (as there is with $0/0=1$), can't we say anything about its consistency with respect to a given set of axioms? Is there no mathematical reason (rather than belief) to argue that the definition $x^0=1$ is consistent with the field axioms? – Pedro Aug 17 '14 at 05:04
  • 1
    @Pedro Of course there is a great deal to say about consistency of theories that have no obvious problems; logic journals annually publish thousands of pages of research on such questions. I mentioned that this is a huge topic. And I gave you a mathematical proof that if Peano Arithmetic with $+,\times$ is consistent, then so is its extension with exponentiation, including $x^0=1$. What more do you want to know? Sadly, I must say that if you do not believe PA is consistent -- then you should not believe PA with exponentiation is either. – Colin McLarty Aug 17 '14 at 06:16
-3

We don't actually define $1^0$ to be $1$; that's its value. Likewise $0^0=1$ is a derived value: the supposed indeterminate values all rely on the same $0/0$ argument that lets $0=1$. If you take the limit of $x^{ax}$ as $x\to 0$, the limit is $1$ for all $a$.
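A quick numerical check of the limit just claimed (a sketch only; the values of $a$ and $x$ are arbitrary choices):

```python
import math

# x^(a*x) = exp(a*x*log x) -> exp(0) = 1 as x -> 0+, for any fixed a.
for a in (1.0, 2.0, -3.0):
    for x in (1e-2, 1e-4, 1e-8):
        print(f"a={a}, x={x:g}: {math.exp(a * x * math.log(x)):.8f}")
# Each printed value approaches 1 as x shrinks toward 0.
```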

When you ask how one plans to go towards zero, a step in that direction is taken either by way of a root, e.g. a square root, or by way of division.

Roots are a matter of dividing a positive number by a positive number, and this never goes to zero.

Division implies $0/0$ is being used; i.e., to suppose $0^0=0$ implies either that you can reach $0$ by division of non-zero numbers, or that you can reach $0$ by division by zero. Since the first is not accepted in maths, it implies that $0^0=0$ arises from division by zero.

  • 1
    Wait--what? You can't consistently define $0^0$. The problem comes when you consider $\lim_{x\rightarrow 0+}x^0$ versus $\lim_{x\rightarrow 0+}0^x$. The first limit is $1$ but the second limit is $0$. – MPW Aug 17 '14 at 01:26
  • 1
    Well, if you require that an exponential function $y^x$ for real numbers extend the usual notion of natural number powers $x^2,x^3,\dots,$ and satisfy the multiplication rule $y^{(x+z)}=y^x\times y^z$, then you must say $1^0=1$. And if you require it to be continuous for real arguments $x,y$ then you cannot define it for $x=y=0$. – Colin McLarty Aug 17 '14 at 01:38
  • @MPW demonstrates the effect of $0/0$ in his answer. By taking the limit along the line $y=0$ for $y^x$, we see in effect $0/0$, from which you can prove anything, really. But there are lots of other lines that pass through $x=0$, $y=0$, and following any of these universally points to $0^0=1$. For more, you can show that $a\cdot 0^0 = a$, because there are zero numbers equal to zero on the LHS, and thus $0^0$ is the identity element of multiplication. Since there can only be one of these, $0^0=1$. – wendy.krieger Aug 17 '14 at 02:31
  • 1
    @wendy.krieger if the argument is that $x^y \to 1$ along any "reasonable" path such that $x \to 0$ and $y \to 0$, I'd argue $x(t) = e^{-1/t}$ and $y(t) = t$ (shamelessly stolen from wikipedia) contradicts this. I don't think arguing for $0^0 = 1$ strictly from limiting behavior is really very appealing; combinatoric arguments seem more convincing, but many would argue it is better to just give $0^0$ a context-dependent definition, and in most contexts $0^0 = 1$ is appropriate. (A numerical sketch of these limiting paths appears after this thread.) – guy Aug 17 '14 at 03:00
  • @guy $0^0$ is not the same as $0/0$, and you can easily prove $0^0=1$ without resorting to division. The simple tactic: let $a=a$. There are $0$ values of $b$ on the LHS, so write $a\cdot b^0 = a$. We are free to set $b$ to any value, so let $b=0$, i.e. $a\cdot 0^0 = a$. Unless you specifically acknowledge $0^0 = 1$, the equation can be written as $2a=a$; therefore allowing $0^0$ to be anything other than $1$ would completely upset all mathematics. It's just that some folk have not mastered $0$. – wendy.krieger Aug 17 '14 at 03:10
  • 2
    @wendy.krieger you are begging the question when you state that you can multiply both sides by $b^0$ for all $b$ and retain equality; this assumes that $b^0$ is a well defined number for all $b$, which is precisely the matter under dispute. If you are arguing that any reasonable definition of $x^y$ where $x$ and $y$ are reals, together with the axioms of the real field, logically implies $0^0 = 1$, then this simply isn't true. For example, the definition $x^y = \sup_{y > q \in \mathbb Q_+} x^q$ clearly doesn't apply, nor does $x^y = \exp(y \log x)$. – guy Aug 17 '14 at 15:15
  • @guy you are begging the question in supposing that 0 changes nature when you go from counting numbers to reals. The argument goes, that 0 is the result of a count, there are no values b on the LHS, and therefore we can set b to anything we want, and still get $b^0=1$. This is the empty bag. $a^1= £(a)$, $a^0 = £()$, and since the product operator can join bags by putting all the factors in the same bag, we have $£() = IE() = 1$. – wendy.krieger Aug 20 '14 at 08:43
  • @wendy.krieger You haven't given a formal definition of exponentiation, but depending on the actual definition one uses, we can get different answers. For example, the definitions above for the reals don't apply to $0^0$. In the context of calculus, the argument is that the appropriate definition is to regard $0^0$ as undefined, due to its status as an indeterminate form - in context, this will lead you less astray. If, on the other hand, I define $a^b = |\{f \mid f\colon b \to a\}|$ where $a$ and $b$ are natural numbers with their set-theoretic definitions, then $0^0 = 1$ by definition. – guy Aug 20 '14 at 18:18
  • @guy I have spent a good deal of time with people who look for "formal definitions", especially in polytopes. Exponentiation has a formal definition in N, being $a^b = £*(b@a)$, which means $b$ copies of $a$ in product. When you extend a set, then you may have to rewrite definitions so that the original is preserved for the reduced set. When $b=0$, the product is an empty list, but because lists can be joined, an empty list is the IE of the product it is used under. It's pretty straightforward. Your proposition is that $0$ in N is different to $0$ in R, or your logic or definition is faulty. – wendy.krieger Aug 21 '14 at 23:18
  • @guy Of your two definitions, $x^y = \exp(y \log x)$ does in fact imply $0^0=1$, since with $0=1/n$, and $n \gg \log n$, these show that $(\log n)/n \to 0$ as $n$ goes large. There is more than one infinity. The definition with "sup" makes an arbitrary decision with the '>' in the subscript, which equates to $0/0$. – wendy.krieger Aug 21 '14 at 23:26
  • A better argument for $0^0=1$ is $$\lim_{x\rightarrow 0^+} x^x=1$$ but not all mathematicians will accept this argument. It is also a definition consistent with many important theorems, like the binomial theorem. – Peter Oct 18 '20 at 12:13
  • When you ask how they intend to evaluate $\lim 0^x$ as $x$ goes to $0$, there are only two ways of approaching zero. One is division, and the other is to take roots. Taking roots implies $n^{-x}=0$ as $x$ goes large. The other implies division by zero. But since the first is not accepted, the equation $0^0=1$ supposes either division by $0$ or some non-zero $a^b=0$. – wendy.krieger Oct 19 '20 at 14:09
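The numerical sketch referenced above (an illustrative script, not from the original discussion): it evaluates $x^y$ along the paths mentioned in this thread and shows that no single value at $(0,0)$ is forced by limits alone.

```python
import math

def pow_from_log(log_x: float, y: float) -> float:
    """Compute x**y as exp(y * log x), taking log x as input to avoid underflow for tiny x."""
    return math.exp(y * log_x)

for t in (1e-1, 1e-3, 1e-6):
    print(f"t={t:g}:",
          pow_from_log(math.log(t), 0.0),   # x^0 along y = 0:         -> 1
          0.0 ** t,                         # 0^x along x = 0:         -> 0
          pow_from_log(math.log(t), t),     # x^x along the diagonal:  -> 1
          pow_from_log(-1.0 / t, t))        # x = e^{-1/t}, y = t:     -> e^{-1} ≈ 0.368
```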