
Background: Let me quote a passage from the book *Zero: The Biography of a Dangerous Idea* (as some don't seem to get why I intentionally divide by zero).

"Dividing by zero once—just one time—allows you to prove, mathematically, anything at all in the universe. You can prove that 1 + 1 = 42, and from there you can prove that J. Edgar Hoover was a space alien, that William Shakespeare came from Uzbekistan, or even that the sky is polka-dotted. [...] Multiplying by zero collapses the number line. But dividing by zero destroys the entire framework of mathematics."


Proof

Let $a$ and $b$ equal $1$. Then

$$b^2=ab \tag{1}$$

As $a=a$,

$$a^2=a^2 \tag{2}$$

$(2)-(1)$ $\rightarrow$

$$a^2-b^2=a^2-ab \tag{3}$$

Factor $(3)$,

$$(a-b)(a+b)=a(a-b) \tag{4}$$

Divide by $(a-b)$ and simplify,

$$b=0\tag{5}$$

As $b=1$,

$$0=1\tag{6}$$

Then, supposedly, $0=1 \rightarrow A \land \lnot A$

$$\ldots$$

$$A\equiv B$$

And from here you can prove any statement you want.

("$\ldots$" is signifying missing steps).


I don't know how to logically prove that $0=1 \rightarrow A \land \lnot A$ (the missing steps that the "$\ldots$" represents) in order to get that $A\equiv B$. I know too little math to do such a proof. Can I get help with this part of the proof?

Andreas

2 Answers


The passage you quoted is saying two things:

  1. If we allow division by zero then we can always reach a contradiction and the system becomes inconsistent.

  2. From a contradiction we can prove anything.

A formal system is a set of axioms and rules (of inference) over a language. A proof of $A$ from $\Gamma$ in a formal system (where $A$ is a formula, i.e., a sentence, and $\Gamma$ is a set of formulas) is a sequence whose last element is $A$ and in which each element is either a member of $\Gamma$, an axiom of the system, or deduced by the rules from previous members of the sequence (see an example of a proof at the end). If it is possible to prove $A$ from $\Gamma$ in the formal system $L$, we write $\Gamma\vdash_L A$ (the subscript is dropped when there is no confusion). So we can formalize the second point as $A\land \neg A\vdash B$.

Note that your formalization of the problem (i.e., getting from $A\land \neg A$ to $A\equiv B$) is not correct. When you say "Dracula is a flying pig", being a pig and being able to fly are properties that describe Dracula (whether truly or not). In first-order logic terms, this can be formalized as $Flies(Dracula)\land IsaPig(Dracula)$.

How about the first point? It is saying that if we add a "division by zero" rule to our set of rules (let's call the new logic $L_0$), then we can reach a contradiction, i.e., $\vdash_{L_0} A\land \neg A$. Of course, for this rule to be applicable, our formal system must extend a system of numbers (i.e., it must include axioms and rules for manipulating numbers, etc.).

So the question is this: if we have a formal system that includes axioms and rules governing numbers, and we add the rule of "division by zero" to our set of rules of inference, can we prove every sentence in the language? The answer is yes, provided our formal system has commonsense axioms and rules (like modus ponens, substitution of equivalent formulas, ...). As you showed above, using the division-by-zero rule we can reach a contradiction. Now I show how to prove every sentence (in the language of the formal system) from a contradiction (i.e., $A\land \neg A\vdash B$):

\begin{align}
&1)\ \neg A && \text{premise}\\
&2)\ \neg A\lor B && \text{apply the } \lor \text{ rule to 1}\\
&3)\ A\rightarrow B && \text{replace 2 by an equivalent formula}\\
&4)\ A && \text{premise}\\
&5)\ B && \text{apply } \textit{modus ponens} \text{ to 3 and 4}
\end{align}

This was a purely syntactic approach. A semantic approach is possible as well.
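For what it's worth, the same reasoning can be machine-checked. Here is a minimal sketch in Lean 4 (not part of the original argument; `absurd` is Lean's built-in ex falso principle, and the numbered comments mirror the five steps above):

```lean
-- The five-step derivation: from premises 1) ¬A and 4) A, conclude B.
example (A B : Prop) (h1 : ¬A) (h4 : A) : B :=
  have h2 : ¬A ∨ B := Or.inl h1            -- 2) apply the ∨ rule to 1
  have h3 : A → B :=                       -- 3) ¬A ∨ B is equivalent to A → B
    fun a => h2.elim (fun na => absurd a na) id
  h3 h4                                    -- 5) modus ponens on 3 and 4

-- The principle A ∧ ¬A ⊢ B in one step:
example (A B : Prop) (h : A ∧ ¬A) : B := absurd h.left h.right

-- And 0 = 1 contradicts the provable fact 0 ≠ 1, so it proves anything,
-- which is the missing step the question asks about:
example (h : (0 : Nat) = 1) (B : Prop) : B := absurd h (by decide)
```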

LoMaPh
  • (For readers of this answer: I edited my question; this explains why "Dracula" is part of this answer). – Andreas Nov 12 '17 at 07:37

"Dividing by zero once—just one time—allows you to prove, mathematically, anything at all in the universe."

That is very wrong. It is perfectly consistent to assign a value to division by zero; your microprocessor does it, and the universe is still going.
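To make that concrete, here is a minimal C sketch (assuming an IEEE 754 floating-point environment, which is what essentially all modern processors provide): the divisions by zero below produce well-defined values, not an inconsistency.

```c
#include <stdio.h>

int main(void) {
    /* volatile keeps the compiler from folding these at compile time */
    volatile double zero = 0.0;

    printf("%f\n",  1.0 / zero);   /* inf:  IEEE 754 positive infinity */
    printf("%f\n", -1.0 / zero);   /* -inf: IEEE 754 negative infinity */
    printf("%f\n", zero / zero);   /* nan:  "not a number"             */
    return 0;
}
```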

What is problematic is to remove that $x \ne 0$ condition from statements involving division, such as:

$$x \ne 0 \implies (ax = b \iff a = b/x)$$

But otherwise you can divide by zero all day long. You just have to realize you can't infer the things you would normally want to infer from such expressions.
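Proof assistants make the same trade-off explicit. As an illustration (a sketch in Lean 4, whose core library defines natural-number division totally, with $n / 0 = 0$): you can divide by zero everywhere and nothing breaks; what you lose is exactly the inferences that need an $x \ne 0$ side condition.

```lean
#eval 5 / 0                       -- 0: division by zero is simply defined
example (n : Nat) : n / 0 = 0 := Nat.div_zero n

-- The usual inference is unavailable without the side condition:
-- (a / b) * b = a needs b ≠ 0 (and b ∣ a); with b = 0 it is false.
#eval (5 / 0) * 0                 -- 0, not 5
```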

DanielV