I'm trying to find the easiest way to prove that $\lim_{h\to 0} \frac{e^h-1}{h}=1$. I found the following derivation, which seems the most straightforward: $$\lim_{h\to 0} \frac{e^h-1}{h} \overbrace{=}^* \lim_{h\to 0} \frac{((1+h)^\frac{1}{h})^h-1}{h}=\lim_{h\to 0}\frac{1+h-1}{h}=1$$ The step I'm curious about is the first one. I realize that $e=\lim_{h\to 0} (1+h)^\frac{1}{h}$, but what is it that fully justifies substituting this limit inside the outer limit? If I remember correctly sometimes making such substitutions inside limits can lead to false results?

Naively I would say that all I can say for sure is that $$\lim_{h\to 0} \frac{e^h-1}{h}=\lim_{h\to 0} \frac{(\lim_{k\to 0} (1+k)^\frac{1}{k})^h-1}{h}$$

Am I missing something trivial?

[Added: I define $e = \lim_{n\to\infty} (1+\frac{1}{n})^n$ and then $e^x$ is defined per the usual way of exponentiation of real numbers. Therefore the above limit is a stepping stone in proving that $(e^x)'= e^x$ and so we can't use this derivative (or its consequences, such as the Taylor series of $e^x$) in order to prove this limit, so that we avoid a cyclic argument.]
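As a quick numeric sanity check (not a proof, and not part of the argument above), one can approximate $e$ directly from this limit definition and watch the quotient approach $1$; a minimal Python sketch:

```python
# Approximate e from the definition e = lim_{n->inf} (1 + 1/n)^n,
# then evaluate (e^h - 1)/h for shrinking h.  Purely a sanity check.
e_approx = (1 + 1/10**6) ** 10**6   # within ~1e-6 of e

for h in [0.1, 0.01, 0.001]:
    print(h, (e_approx**h - 1) / h)   # the quotient approaches 1
```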

Snaw
  • This is the definition for $(\frac{d}{dx} e^x)|_{x = 0}$. – Mark Saving Nov 28 '21 at 02:36
  • @MarkSaving Right, but in order to prove that $(e^x)'= e^x$ in the first place one has to first know how to calculate this limit in order to avoid a cyclic argument. – Snaw Nov 28 '21 at 02:39
  • What definition of $e^x$ are you using @Snaw ? Just the limit definition? – QC_QAOA Nov 28 '21 at 02:40
  • @QC_QAOA Yes, I define $e = \lim_{n\to\infty} (1+\frac{1}{n})^n$ and then $e^x$ is defined by the usual definition of exponentiation of real numbers. – Snaw Nov 28 '21 at 02:43
  • @Snaw That would have been important context to add to your question. The exact proof will depend greatly on your definition of $e^x$. – Mark Saving Nov 28 '21 at 02:44
  • @MarkSaving I agree, I edited the question to add this information. – Snaw Nov 28 '21 at 02:48
  • You can’t get from the obvious $h,k$ double limit to just $h=k$ easily, as far as I can see. $$\lim_{h\to 0}\lim_{k\to0} f(h,k)\tag1$$ is not in general equal to $$\lim_{h\to0} f(h,h)\tag2.$$ Although if $(1)$ exists, then $(2)$ exists and is equal to $(1),$ it is generally harder to prove $(1)$ exists. – Thomas Andrews Nov 28 '21 at 02:54
  • Can you be more explicit as to what you mean by "usual way of exponentiation of real numbers"? This is a highly non-trivial definition. Very often one defines it using a series. If you define it the "low-tech" way as $e^x:=\sup\{e^q : q\in\Bbb{Q},\, q<x\}$, then proving things is going to be very difficult (which is why one often takes a slight detour into power series, proves things rapidly there, and then uses uniqueness results to finish up) – peek-a-boo Nov 28 '21 at 02:59
  • @peek-a-boo Yes, after proving that every bounded monotonic sequence is convergent we defined $e^x$ essentially as the $\sup$ definition you wrote. Power series are only covered in the following semester – Snaw Nov 28 '21 at 03:07
  • I can use the following proof: take the log of the definition of $e$ to get $\lim_{h\to 0}\frac{\log(1+h)}{h}=1$ and then by substituting $\log(1+h)=t$ we get the required limit. But I feel these two steps together can seem pretty unmotivated. This is why I'm looking to see if there is any simple way to salvage the simple argument above – Snaw Nov 28 '21 at 03:11
  • @peek-a-boo To clarify what I mean by "essentially the $\sup$ definition you wrote": we take an increasing sequence of rational numbers $q_n \to x$; then $e^{q_n}$ is increasing and bounded from above, so we define its limit to be $e^x$, from which we prove all the usual properties of exponentiation. – Snaw Nov 28 '21 at 03:17
  • I am confused by the question. Why isn't L'Hopital's Rule used instead? – user2661923 Nov 28 '21 at 03:18
  • @user2661923 It would rely on knowing the derivative of $e^x$, which would lead to a cyclic argument. – Snaw Nov 28 '21 at 03:19
  • @Snaw Nice rebuttal. – user2661923 Nov 28 '21 at 03:20
  • @user2661923 L'Hopital's rule isn't being used because OP wants to avoid circularity (and second, which is my opinion: L'Hopital's rule for such elementary limits is completely overkill and useless. You can't get the right answer unless you already assume what you're trying to prove. Same story with $\sin x/x$. Both are just asking about derivatives at the origin). To OP: I highly doubt this "proof" can be salvaged elementarily. Sure, the results are correct, because both limits are indeed equal to $1$, but this particular approach is highly dubious. I'd love to be proven wrong though. – peek-a-boo Nov 28 '21 at 03:21
  • Out of general interest: this is the "Solution 4. Perhaps the most pesky case" in this old answer. – Jakob Streipel Nov 28 '21 at 03:27
  • You cannot escape circularity. Circularity appears in every language. Mathematics is such a language, though it's formal it is still circular. What prets suggested depends on definition of an integral of 1/t, which then again depends on knowing the derivative of log, which depends on the derivative of exponential. – MathematicalPhysicist Nov 28 '21 at 04:13
  • @ThomasAndrews Actually it would still not be enough, $\lim_{h\to 0}\lim_{k\to 0} \frac{hk}{h^2+k^2}=0$ (and also with the order reversed), but along $h=k$ we get $\frac{1}{2}$ – Snaw Nov 28 '21 at 05:12
  • @MathematicalPhysicist But not in Mathematics. Definitions are non-ambiguous (which can't be done in any other language) and then proofs are carefully constructed without circularity. For instance see in the comment above how to prove this limit that I am looking for without circularity. In the same way it is possible to do things using the definition of the integral that you are referring to in a non-circular way. But I was looking for a more intuitive way of proving this limit, which unfortunately I guess cannot be done. – Snaw Nov 28 '21 at 05:22
  • @QC_QAOA Thanks. The answers there (given my definitions) suggest the proof I wrote in a comment above, to take the log of the definition of $e$ and then to make a substitution. I was hoping to avoid these two steps as they seem unmotivated.

    (Strictly speaking if my question is "is this step correct?" then the answers I was given here are clear: it's indeed incorrect, and it's unlikely that it's possible to salvage the "proof" by an elementary argument. The wider interpretation of the question, "is there a more motivated proof than the standard log/substitution one", remains unanswered.)

    – Snaw Nov 28 '21 at 14:42
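For reference, the two-step log/substitution proof mentioned in the comments can be written out as follows (a sketch; it assumes continuity of $\log$, which can be derived from the $\sup$-definition of exponentiation above):

```latex
% Step 1: take logarithms in e = \lim_{h\to 0} (1+h)^{1/h}
% (continuity of \log lets the limit pass inside):
1 = \log e = \lim_{h\to 0} \log\left((1+h)^{1/h}\right)
           = \lim_{h\to 0} \frac{\log(1+h)}{h}.
% Step 2: substitute t = e^h - 1, i.e. h = \log(1+t), noting t \to 0
% as h \to 0:
\lim_{h\to 0} \frac{e^h - 1}{h} = \lim_{t\to 0} \frac{t}{\log(1+t)} = 1.
```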

1 Answer


This is indeed very problematic. For example, $\lim_{h\rightarrow 0}\frac{0}{h}\not=\lim_{h\rightarrow 0}\frac{h}{h}$ even though $\lim_{h\rightarrow 0} h = 0$, so replacing the numerator $h$ by its limit changes the answer. Roughly speaking, what matters is how fast the function converges to its limit, not just what the limit is. In particular, your question boils down to whether $((1+h)^{1/h})^h-1$ converges to $0$ as quickly as $e^h-1$ does. While this is indeed the case, it needs justification.
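A quick numeric illustration of the rate-of-convergence point (again a sanity check, not a proof): both numerators vanish like $h$, which is why both quotients tend to $1$, but the naive substitution by itself establishes neither.

```python
import math

# Compare how fast the two numerators vanish as h -> 0.
# Both behave like h, so dividing by h gives quotients near 1.
for h in [0.1, 0.01, 0.001]:
    a = math.e**h - 1                 # e^h - 1
    b = ((1 + h)**(1 / h))**h - 1     # ((1+h)^{1/h})^h - 1
    print(h, a / h, b / h)            # both columns approach 1
```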

Just a user
  • Could you expand regarding speed of convergence? (I realize this approach is unlikely going to help me to find the elementary proof I was looking for, but I'm just curious.) – Snaw Nov 28 '21 at 03:27
  • E.g. $h$ converges to $0$ much, much slower than $h^2$ converges to $0$, while $\sin(h)$ converges to $0$ as fast as $h$ does, since $\frac{\sin(h)}{h}\rightarrow 1$. This is the idea of "equivalent infinitesimals". In your example, it is essentially because $(1+h)^{1/h}$ converges to $e$ fast enough that the replacement doesn't hurt, but this is not at all easy to justify. – Just a user Nov 28 '21 at 03:34
  • I see. Thank you. – Snaw Nov 28 '21 at 03:36
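A minimal numeric sketch of the "equivalent infinitesimals" idea from the comment above (illustration only):

```python
import math

# "Equivalent infinitesimals": compare the rates at which quantities vanish.
h = 1e-4
print(h**2 / h)         # h^2 vanishes faster than h: the quotient tends to 0
print(math.sin(h) / h)  # sin(h) vanishes like h: the quotient tends to 1
```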