4

Why is the use of infinities generally accepted, but not the use of infinitesimals?

I know a little about non-standard analysis, but that seems to be the exception? From what I've seen, usually infinitesimals are avoided elsewhere (and are replaced by the notion of a limit, which again is based on infinities) even when they are presented rigorously and clearly.

Is there a valid reason for this (beyond simple preference)?

Stephen
  • 3,682
  • It is very easy to see that if you claim to have a largest number, I can add one to it to show that it cannot be. It is very difficult to imagine a smallest positive number for a similar reason: I can take half. – John Douma Sep 10 '20 at 16:42
  • Infinitesimals make an ordered field non-Archimedean, but the Archimedean property of the real numbers is vital for calculating limits. – CyclotomicField Sep 10 '20 at 16:46
  • 1
    In real analysis, infinity comes up in the sense of things tending to infinity, which is just a convenient shortcut for describing that they grow arbitrarily large. On the other hand, when something grows arbitrarily small (in the sense of absolute value), that means it tends to $0$, precisely because the real numbers are Archimedean. So the question really is what the "use of infinitesimals" would be, since there doesn't seem to be a natural thing in real analysis that could be described as infinitesimal, and if you start to axiomatize infinitesimals, you end up doing non-standard analysis. – Thorgott Sep 10 '20 at 17:44
  • @Thorgott I see, so there is no useful way of rigorously using infinitesimals outside of non-standard analysis? – Stephen Sep 10 '20 at 18:04
  • Because "adding" $\pm\infty$ to ${\mathbb R}$ deals with two natural special "processes", while inventing "infinitesimals" adds more artificial elements to ${\mathbb R}$ than there are atoms in the universe, just to avoid $\epsilon/\delta$-thinking. – Christian Blatter Sep 10 '20 at 18:11
  • I don't feel qualified to make such an absolute claim, but I certainly don't know of any way and the most natural interpretation as above just reduces to a trivial case. I'm rather trying to say that the "use of infinitesimals" isn't "not accepted", but that there just hardly seems to be any (in real analysis, that is). – Thorgott Sep 10 '20 at 18:20
  • 2
    Infinities appear in much more basic parts of mathematics (set theory) as the size of certain sets. It is really difficult to avoid. Infinitesimals are entirely possible to avoid. Maybe this difference in necessity that makes people more comfortable dealing with infinities than infinitesimals? – Arthur Sep 10 '20 at 19:10
  • 1
    I'm not sure I understand why infinitesimals are avoided. Differentials are used in calculus all the time. You can rigorously define how infinitesimals work, it's just that people abuse them in contexts where rigor fails. – Alex R. Sep 10 '20 at 20:43
  • Actually "infinity" is not defined in real analysis. "Infinite set" is defined, certainly. But things like $\lim_{x \to \infty} f(x) = L$ do not use a previously defined notion of infinity. It's just notation. – zhw. Sep 10 '20 at 22:25
  • @zhw. How do you use the concept of infinite sets without the concept of infinity? – Stephen Sep 10 '20 at 23:10
  • @AlexR. Hum, the way I learnt calculus was with the idea that differentials were "sufficiently small" quantities, but not "infinitesimally" so. – Stephen Sep 10 '20 at 23:13
  • @zhw. I don't think it's just notation, since at least one definition of this expression involves infinite sequences. – Stephen Sep 10 '20 at 23:15
  • Thanks for these comments, I think I'm starting to understand why. – Stephen Sep 10 '20 at 23:24
  • You can work with the definition $x\to \infty$ without defining $\infty$ as a stand alone definition. – zhw. Sep 11 '20 at 00:51
  • @zhw. That's beside the point, though. If you're using notions such as infinite sets, then you've at least accepted the idea of infinity, I'd say (without necessarily having a formal definition of it). This is not the case for infinitesimals, which were, at least in my undergraduate and postgraduate studies, avoided, and only mentioned when ridiculing the idea. – Stephen Sep 11 '20 at 07:04
  • Accepting infinite sets is a far cry from accepting infinity as a numerical concept. We go to great pains in Calculus to explain to students that "infinity arithmetic" is fraught with dangers like $\frac{\infty}{\infty}$ or $\infty-\infty$. So I would not say that the use of infinities is generally accepted in real analysis. – Lee Mosher Sep 11 '20 at 13:53

1 Answer

4

This is a tough question that's hard to answer definitively because of the different "infinities" (for an overview, see Understanding infinity), the history and popularity of different branches of math involved, etc.

I think a large portion of the reason comes down to the fact that there are more contexts in which "infinities" are useful than "infinitesimals". This has knock-on effects on, say, how math curricula are designed in universities, the level of general awareness among mathematicians (which affects their ability to spread ideas), etc.


Infinitesimals don't arise in common contexts

Ordinals were discovered when Cantor was working on real analysis, and cardinals (especially the countable-uncountable distinction) are often useful when dealing with infinite sets, both in and outside of analysis. And $\pm\infty$ in the extended reals help give a tidy account of limits and measure. And if we broaden our view to complex analysis, the Riemann sphere is fundamental and has a point labeled $\infty$. But none of these contexts directly lend themselves to an infinitesimal.

For ordinals and cardinals, there isn't even anything positive but less than $1$, let alone infinitesimal. And for the others, adjoining an infinitesimal would break the (Dedekind) completeness property of the reals that is critical for usual analysis to work.
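To make the clash with completeness concrete, here is a short sketch of the standard argument (my own addition, not part of the original answer):

```latex
% A positive infinitesimal is incompatible with Dedekind completeness:
% suppose $0 < \varepsilon < \tfrac{1}{n}$ for every $n \in \mathbb{N}$.
% Then $n < \tfrac{1}{\varepsilon}$ for all $n$, so $\mathbb{N}$ is
% bounded above. Completeness would give $s = \sup \mathbb{N}$; but
% $n + 1 \in \mathbb{N}$ for each $n$, so $n + 1 \le s$, i.e.
% $n \le s - 1$ for all $n$, making $s - 1$ a smaller upper bound ---
% a contradiction. Hence a (Dedekind) complete ordered field is
% Archimedean and contains no infinitesimals.
```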

and their use is limited

Now, you can change the arithmetic on the ordinals to get the surreal numbers, or look at other non-Archimedean fields, perhaps in a more general/abstract way. But these are not often useful for analysis purposes. In Combinatorial Game Theory, there are infinitesimals like "up" that don't reside in a field, but that's a pretty niche area/application.

except maybe in nonstandard analysis

Arguably the most useful example of infinitesimals would be Robinson's hyperreals for nonstandard analysis. In the scheme of things, this is relatively new in calculus (so it's unfamiliar to many teachers, and students would still have to learn standard approaches to connect with other material), and it doesn't give you any new theorems about analysis, so it's tough to introduce into a curriculum. It's also arguably harder to make fully formal than a traditional construction of the reals.

That said, some mathematicians are using nonstandard analysis in their arguments. For example, Terry Tao has a number of blog posts about it.
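For a flavor of how the hyperreal approach works in practice, here is the textbook definition of the derivative via the standard-part function (a standard nonstandard-analysis example, added by the editor, not part of the original answer):

```latex
% In Robinson's framework, for any nonzero infinitesimal $\varepsilon$,
% the derivative is the standard part (st) of a difference quotient:
%   f'(x) = \operatorname{st}\!\left(\frac{f(x+\varepsilon)-f(x)}{\varepsilon}\right).
% Worked example with $f(x) = x^2$:
%   \frac{(x+\varepsilon)^2 - x^2}{\varepsilon}
%     = \frac{2x\varepsilon + \varepsilon^2}{\varepsilon}
%     = 2x + \varepsilon,
% and $\operatorname{st}(2x + \varepsilon) = 2x$, recovering the usual
% derivative without an explicit limit.
```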

Mark S.
  • 23,925
  • Much appreciated! This is the kind of explanation I was hoping to get (but didn't expect to). I'll take the time to go through your recommendations - they seem quite interesting. – Stephen Sep 11 '20 at 14:43
  • 1
    @Stephen One thing I didn't think to mention in my answer is smooth infinitesimal analysis which uses a different concept of "infinitesimal" than "positive but less than 1/n for any positive integer n". But it is based on intuitionistic logic and has theorems that can be phrased in ways that look wrong to people used to more conventional analysis, so it's also pretty niche. – Mark S. Sep 19 '20 at 13:04