In modern treatments of calculus, limits are used to define the derivative and develop differential calculus. However, when I searched for the history of limits I came across the Wikipedia article on the limit of a function, which says:

...the modern idea of the limit of a function goes back to Bolzano who, in 1817...

If the modern idea of limits only dates to 1817, what kind of limits, if any, did Newton and Leibniz use?

1 Answer

Newton and Leibniz initially expressed the derivative $f'(x)$ as the ratio $\frac{df}{dx}$, where the increments $df$ and $dx$ are infinitesimally small numbers. Analogously, the integral was defined as the sum of ordinates taken over infinitesimal intervals of the abscissa. Leibniz fully embraced the use of infinitesimals, while Newton increasingly avoided them in his later work, basing his calculations on ratios of changing quantities, his so-called fluxions.
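
For a concrete illustration (a standard example in the Leibnizian style, paraphrased rather than quoted from any original text), to differentiate $y = x^2$ one gives $x$ an infinitesimal increment $dx$ and computes the corresponding increment of $y$:

$$dy = (x+dx)^2 - x^2 = 2x\,dx + (dx)^2, \qquad \frac{dy}{dx} = 2x + dx \approx 2x.$$

The leftover infinitesimal $dx$ is simply discarded at the end, and making that last step rigorous is precisely what the modern limit concept was later introduced to do.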

Also, have a look at the Wikipedia article on calculus and the Wikipedia article on the history of calculus.

In short, calculus was introduced in terms of "infinitely small" numbers and their ratios, not in terms of modern limits. In order to understand infinitesimals better, have a look at this post: https://math.stackexchange.com/a/21209/998803.
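
For contrast, the definition commonly taught today avoids infinitesimals entirely and uses a limit of ordinary difference quotients,

$$f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h},$$

in which $h$ ranges over ordinary real numbers: it is never "infinitely small", only taken arbitrarily small.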

  • I see. I have heard of the calculus of infinitesimals before; are there benefits to this method? How did the method Newton and Leibniz used to define differentiation differ from the one commonly taught now, which uses limits? – Connor Jul 28 '22 at 12:53
  • @Connor It is possible, but really hard, to implement infinitesimals in a formal way in standard set theory, since they cannot exist in the usual setting of the real numbers. This is because the real numbers satisfy an important property, called the Archimedean Property: given any positive real number $ϵ>0$, no matter how small, and any positive real number $M>0$, no matter how big, there exists a natural number $n$ such that $nϵ>M$. But an "infinitesimal" $ξ$ is supposed to be so small that no matter how many times you add it to itself, it never gets to $1$, contradicting the Archimedean Property (see the short sketch after these comments). – Andreas Tsevas Jul 28 '22 at 13:04
  • @Connor I would say that the main benefit of infinitesimals is that they may be more intuitive for a beginner, because then $df/dx$ is just a fraction that can be manipulated accordingly and there is no need to learn what a limit is. In most other regards, there are good reasons that the description as a limit "won" in formal mathematics. (Also, limits are a good stepping stone to understanding convergence in topological spaces and other more advanced concepts. Infinitesimals and the notation $df/dx$ just stick around for historical reasons.) – Andreas Tsevas Jul 28 '22 at 13:08
  • @Connor: One thing to keep in mind: the modern concept of the real numbers itself was not even formalized in the 1600s, let alone the concept of an infinitesimal. Very roughly speaking, our modern concept of real numbers (including a rigorous theory of limits) did not arise until the 1800s, as a result of attempts to fix inconsistencies that had crept into calculus. And only in the 1900s did the (deeper, more difficult) modern rigorous concept of infinitesimals emerge. – Lee Mosher Jul 28 '22 at 19:59
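
To spell out the contradiction mentioned in the comment on the Archimedean Property (a short sketch in the same notation): if a real number $\xi > 0$ were infinitesimal, then by assumption

$$n\xi < 1 \quad \text{for every } n \in \mathbb{N},$$

whereas the Archimedean Property, applied with $\epsilon = \xi$ and $M = 1$, produces some $n \in \mathbb{N}$ with $n\xi > 1$. Hence no positive real number can be infinitesimal, which is why a rigorous theory of infinitesimals has to work in a number system larger than $\mathbb{R}$.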