First off, I won't claim to be a math expert: it was arguably my least favorite subject in school, and Pre-Calculus was where I reached the limit of my mathematical capabilities. However, I recently saw Alan Becker's new video Animation vs. Math, which helped me visualize some of the concepts I struggled with in school. One novel way the video helped me conceptualize division was as simply another means of doing subtraction: you take the dividend and the divisor, then subtract the divisor from the dividend repeatedly until you reach zero, and the number of times you subtracted the divisor is the quotient. So, if we follow this logic, why does dividing by zero not result in an infinitely large number, since you would subtract zero infinitely many times?
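To make the repeated-subtraction picture concrete, here is a minimal Python sketch of it (the function name and the `max_steps` guard are my own additions, so that a divisor of zero doesn't loop forever):

```python
def divide_by_repeated_subtraction(dividend, divisor, max_steps=1_000_000):
    """Count how many times `divisor` can be subtracted from `dividend`
    before reaching zero. Assumes a non-negative dividend and a
    non-negative divisor; `max_steps` is a guard against non-termination.
    """
    quotient = 0
    while dividend > 0:
        if quotient >= max_steps:
            raise RuntimeError("gave up: the dividend is not shrinking")
        dividend -= divisor  # with divisor == 0 this changes nothing
        quotient += 1
    return quotient

print(divide_by_repeated_subtraction(12, 3))  # 4
# divide_by_repeated_subtraction(12, 0) makes no progress: 12 - 0 is
# still 12, so no finite number of subtractions ever reaches zero.
```

Note that with a divisor of zero the loop doesn't count up to "infinity"; it simply never finishes, which already hints that "infinitely many subtractions" doesn't pin down a quotient.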
- It is sometimes possible to think of division as repeated subtraction (in the same way that it is sometimes possible to think of multiplication as repeated addition), but neither multiplication nor division is actually defined in that way (at least, not typically). https://www.maa.org/external_archive/devlin/devlin_06_08.html – Xander Henderson Jul 01 '23 at 01:14
- Usually, we define division as the "inverse" of multiplication. That is, when we write $a \div b = c$, what we are actually saying is that $c$ is the magical number which fills in the blank $a = b \times \underline{\qquad}$. If $b = 0$, there is no real number which fills in this blank. (See the sketch after these comments.) – Xander Henderson Jul 01 '23 at 01:16
- I think this is actually an argument for the answer not being infinity. No matter how many times you subtract 0, finite or infinite, the number should remain the same. – Nick Alger Jul 01 '23 at 01:23
- Related: https://math.stackexchange.com/q/125186/42969 – Martin R Jul 01 '23 at 01:23
- Regarding arithmetic involving "infinity": I find it important to mention that we do things the way we do for convenience. When you say "why does dividing by zero not result in an infinitely large number", what do you actually mean by an "infinitely large number"? There does not exist an infinitely large real number. You can try to introduce one (and that's not completely outlandish), but it causes a real headache, as pointed out in the recent answer. It is simply more inconvenient to have division by zero than to not have it. – Charlie Jul 01 '23 at 01:27
- You have to specify what you mean by subtracting an infinite number of times. If you use some limiting process, you'd never change the dividend, and even in the limit you would never reach zero. – Robearz Jul 01 '23 at 02:01
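Here is a brute-force illustration of the inverse-of-multiplication definition from the comment above; `divide_as_inverse` and the small search range are my own, purely for demonstration:

```python
def divide_as_inverse(a, b, candidates=range(-100, 101)):
    """Find every integer c in `candidates` with a == b * c.
    The quotient a / b is well defined exactly when one such c exists.
    """
    return [c for c in candidates if a == b * c]

print(divide_as_inverse(6, 3))       # [2]: exactly one solution, 6 / 3 = 2
print(divide_as_inverse(1, 0))       # []: no c satisfies 1 = 0 * c
print(len(divide_as_inverse(0, 0)))  # 201: every c satisfies 0 = 0 * c
```

So $1/0$ fails because there is no solution at all, while $0/0$ fails because every number is a solution; neither picks out a single well-defined quotient.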
1 Answer
Writing $\frac{1}{0} = \infty$ does make a certain kind of sense, and there are contexts in mathematics (the projectively extended real line, or the Riemann sphere in complex analysis) where we adopt exactly this convention. Still, this definition can be problematic.
For example, $0 = 1-1$, so $\frac{0}{0} = \frac{1}{0}-\frac{1}{0}$ by the ordinary rules of arithmetic. Based on your intuition for how division works, I'm guessing you'll say that $\frac{0}{0} = 0$, so we end up concluding that $0 = \infty - \infty$. But now consider $1 = 2-1$. Dividing by zero again, we get $\frac{1}{0} = \frac{2}{0}-\frac{1}{0}$, i.e. $\infty = \infty - \infty$. Now we're forced to conclude that $0 = \infty - \infty = \infty$, so $0$ and $\infty$ are the same number. This is clearly absurd.
Another thing you can think about is how to define $\frac{1}{0} \times 0$. Do the zeros cancel out and we get $1$? What about $\frac{2}{0} \times 0$? Is this $2$? Does this mean that $1 = \infty \times 0 = 2$?
These examples capture the intuition for why dividing by $0$ isn't straightforward: whatever convention you adopt, the simple rules of arithmetic force you into absurdities of this kind.
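As an aside, IEEE-754 floating-point arithmetic (the `float` type in most languages) adopts a convention close to $\frac{1}{0} = \infty$, and it pays for it with a special "not a number" value, `nan`, reserved for exactly the ambiguous expressions above. A quick Python check:

```python
import math

inf = math.inf  # Python itself raises ZeroDivisionError on 1 / 0.0, but
                # the underlying IEEE-754 format defines 1 / 0 as +inf

print(inf - inf)            # nan: no single consistent value exists
print(inf * 0)              # nan: 1 and 2 would be equally "justified"
print(math.isnan(inf * 0))  # True
```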

- Therefore we should simply forbid dividing by zero, and not claim that there are artificial workarounds that make it somewhat meaningful. Those workarounds do not actually solve the issues. Even if we consider $1/0$ as the limit of $1/x$ for $x \to 0$, we run into a problem, since the limits from the left and from the right do not coincide. – Peter Jul 01 '23 at 08:07
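A quick numerical look at those one-sided limits, tabulating $1/x$ for small positive and negative $x$ (the step sizes are arbitrary):

```python
# 1/x as x -> 0 from the right blows up toward +infinity, while from the
# left it blows up toward -infinity, so no single value works at x = 0.
for k in range(1, 6):
    x = 10 ** -k
    print(f"1/{x:g} = {1/x:g}    1/{-x:g} = {1/-x:g}")
```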