
I would like to know how to prove that $$\frac{d}{dx}e^x=e^x$$ One argument I've seen is that this is the definition of the number $e$: for some number, call it $a$, the derivative of $a^x$ is $a^x$, and we have simply named this number $e$. But a problem I have with this approach is that we are assuming there exists a number $a$ satisfying this property; how do we know it exists in the first place?

I've also seen an approach whose reasoning I think is circular: it uses the Maclaurin expansion of $e^x$ to prove from first principles that $\frac{d}{dx}e^x=e^x$. But the derivation of that series makes use of the very fact that $\frac{d}{dx}e^x=e^x$.

Any other proof would be much appreciated; an explanation of the two proofs above would also be very useful.

Thank you for your help.

  • The answer to a question like this depends on how exactly you define $e^x$ (or $a^x$ more generally). – Wojowu Dec 24 '20 at 17:01
  • This is a classic case of a concept having multiple equivalent definitions: you pick one, and prove all of the rest. These equivalent definitions include: $e^x$ is the function (under certain nice conditions) that satisfies $\frac d{dx}e^x=e^x$; $e^x$ is the Maclaurin series; $e^x$ is the inverse of $\log$; etc. – Rushabh Mehta Dec 24 '20 at 17:01
  • You first need to decide how you want to define $e^x$. How you prove that its derivative is $e^x$ depends on how you define $e^x$. Without deciding how to define it, it is impossible to provide a proof. You can start with $\log(x)$ defined in terms of an integral and define the exponential as its inverse; or you can start with the power series definition; or you can start with a functional equation. But you first must decide how you want to start. – Arturo Magidin Dec 24 '20 at 17:02
  • You can use the series of $e^x$. – callculus42 Dec 24 '20 at 17:12
  • By the way, it is not necessarily circular to prove that the exponential function is its own derivative using its Taylor series. If you use the Taylor series as the definition of $e^x$, then there is no need to derive it using the derivative of $e^x$. This approach might seem unsatisfying for you though, since you might not see the motivation for why we would do this. However, it is an important series that becomes even more important in complex analysis, where it is typically the definition of $\exp(z)$. – Joe Dec 24 '20 at 17:16
  • Related: https://math.stackexchange.com/questions/1558734/what-are-different-approaches-to-introduce-the-elementary-functions/ – Ethan Bolker Dec 24 '20 at 17:35
  • Actually you can deduce the series expansion for $e^x$ without using derivatives... – Allawonder Dec 24 '20 at 19:38
  • @Allawonder interesting, how? – A-Level Student Dec 24 '20 at 22:30
  • See this answer for another approach: https://math.stackexchange.com/a/541330/72031 – Paramanand Singh Dec 25 '20 at 02:29
  • As to the comment by user @Allawonder: you just have to note that both $A=(1+(x/n))^n$ and $B=(1-(x/n))^{-n}$ tend to $e^x$, and for $x>0$ the value $C=\sum_{k=0}^{n}x^k/k!$ lies between $A, B$ and hence also tends to $e^x$. For negative values you can use the functional equation $f(x+y)=f(x)f(y)$, which is satisfied both by $e^x$ and the series. – Paramanand Singh Dec 25 '20 at 02:31
  • @A-LevelStudent Consider the function $a^x,$ with $a$ being some constant. Assuming continuity when $x=0,$ it follows that $a^x=1+kx$ for $x\to 0.$ For some positive integer $n,$ we then have that $$a^{nx}=(1+kx)^n=\sum_{i=0}^{n}{\binom n i}(kx)^i.$$ Now let $nx=z,$ so that the equation becomes $$a^z=\sum \frac{1}{i!}\prod\left(1-\frac jn\right)(kz)^i,0< j<i.$$ If you now let $n\to\infty,$ then $z$ becomes any quantity, and the product in each term goes to $1,$ so that the equation becomes $$a^z=\sum_{i=0}^{\infty} \frac{1}{i!} (kz)^i.$$ Thus by setting $z=k=1,$ we obtain the number $e,$ so... – Allawonder Dec 25 '20 at 16:38
  • ...that for $k=1$ we obtain $$e^z=\sum \frac{1}{i!} z^i,$$ which is the desired series expansion. To see the relationship between $a$ and $k$ in more familiar terms, set $z=1$ in the expansion for $a^z$ to obtain that $$a=\sum \frac{1}{i!}k^i=e^k,$$ which shows that $k=\log a.$ – Allawonder Dec 25 '20 at 16:39

9 Answers


Define $e$ as $$e=\lim_{n\to\infty}\left(1+\frac1n\right)^n.$$ Thus $$e^x=\lim_{n\to\infty}\left(1+\frac1n\right)^{xn}.$$ Set $m=nx$ and note that $n\to\infty$ implies that $m\to\infty$, assuming $x>0$ (for $x<0$ one can use $e^{-x}=1/e^x$, and $x=0$ is trivial). Then $$e^x=\lim_{m\to\infty}\left(1+\frac1{m/x}\right)^m=\lim_{m\to\infty}\left(1+\frac{x}{m}\right)^m.$$ Then $$\frac{d}{dx}e^x=\lim_{m\to\infty}\frac{d}{dx}\left(1+\frac{x}{m}\right)^m=\lim_{m\to\infty}\left(1+\frac{x}{m}\right)^{m-1}.$$ This is $$\frac{d}{dx}e^x=\lim_{m\to\infty}\frac{\left(1+\frac{x}{m}\right)^m}{1+x/m}=\frac{\lim_{m\to\infty}\left(1+\frac{x}{m}\right)^m}{\lim_{m\to\infty}(1+x/m)}=\frac{e^x}{1}=e^x.$$


A verification that $\frac{d}{dx}\lim=\lim\frac{d}{dx}$.

Let $C^1=C^1(\Bbb R)$ be the space of continuously differentiable functions $\Bbb R\to\Bbb R$. Similarly, let $C^0=C^0(\Bbb R)$ be the space of continuous functions $\Bbb R\to\Bbb R$. Then the differential operator map $D:C^1\to C^0$ given by $\phi(x)\mapsto \phi'(x)$ is a linear transformation.

To show this, note that for all $f,g\in C^1$ and $\alpha\in\Bbb R$, $$\begin{align} D(f+g)&=D(f)+D(g)\\ D(\alpha f)&=\alpha D(f). \end{align}$$ These are easy to check.

With respect to the $C^1$-norm $\|f\|_{C^1}=\|f\|_\infty+\|f'\|_\infty$ (working over a compact interval), the linear transformation $D$ is bounded, and hence continuous. That is, every sequence of functions converging in $C^1$ is mapped by $D$ to a convergent sequence of functions in $C^0$: $$D\left(\lim_{n\to\infty}f_n\right)=\lim_{n\to\infty}D(f_n),$$ for a given convergent sequence of functions $(f_n)\subset C^1$. The functions $f_n(x)=(1+x/n)^n$ converge to $e^x$ in this sense on every compact interval (both $f_n$ and $f_n'$ converge uniformly there), so the above applies.
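As a numerical sanity check on this answer (a sketch of my own, not part of the proof), one can tabulate $f_n(x)=(1+x/n)^n$ and its exact derivative $(1+x/n)^{n-1}$ and watch both approach $e^x$:

```python
import math

def f(n, x):
    """n-th term of the sequence (1 + x/n)^n, which converges to e^x."""
    return (1 + x / n) ** n

def f_prime(n, x):
    """Exact derivative of x -> f(n, x): n*(1/n)*(1+x/n)^(n-1) = (1+x/n)^(n-1)."""
    return (1 + x / n) ** (n - 1)

x = 1.5
for n in (10, 1000, 100_000):
    # both columns close in on math.exp(x) as n grows
    print(n, f(n, x), f_prime(n, x), math.exp(x))
```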

clathratus

One nice approach is to define the function $\ln x$ (for $x\gt0$) as the integral

$$\ln x=\int_1^x{dt\over t}$$

and then show that this function has the property of a logarithm, namely $\ln(ab)=\ln a+\ln b$, after which one can define the function $\exp(x)$ to be the inverse of $\ln x$ and show it has the properties of an exponential, namely $\exp(x+y)=\exp(x)\cdot\exp(y)$. The number $e$ is defined as the unique value of $x$ for which $\ln x=1$. By the fundamental theorem of calculus, $\frac{d}{dx}\ln x={1\over x}$, so differentiating $\ln(\exp(x))=x$ with the chain rule gives $\exp'(x)\cdot{1\over\exp(x)}=1$, i.e. $\exp'=\exp$.
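This construction is easy to experiment with numerically. The sketch below (my illustration, with a midpoint-rule helper `ln` that is an assumption of the sketch, not anything from the answer) approximates the defining integral, checks the logarithm property, and recovers $e$ by bisection:

```python
import math

def ln(x, steps=20_000):
    """Midpoint-rule approximation of ln(x) = integral of dt/t from 1 to x."""
    h = (x - 1) / steps
    return sum(h / (1 + (i + 0.5) * h) for i in range(steps))

# logarithm property: ln(ab) = ln(a) + ln(b)
a, b = 2.0, 3.5
print(ln(a * b), ln(a) + ln(b))

# e is the unique x with ln(x) = 1; locate it by bisection
lo, hi = 2.0, 3.0
for _ in range(50):
    mid = (lo + hi) / 2
    if ln(mid) < 1:
        lo = mid
    else:
        hi = mid
print(lo)   # ≈ 2.71828...
```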

Barry Cipra
  • Are you using the fact that only logarithms have the property that $f(ab)=f(a)+f(b)$? – A-Level Student Dec 24 '20 at 17:05
  • @A-LevelStudent: Logarithms don’t have that property for all $a,b$ (just positive ones); and they are not the only ones that do (the zero function does); and what is being used is that the function has that property, not that it is the only one that does. – Arturo Magidin Dec 24 '20 at 17:07
  • We hardly need to show that the integral has the properties of the logarithm or that its inverse has the properties of an exponential to define $e$ as the $x$ for which $\int_1^x (dt/t) = 1.$ – md2perpe Dec 24 '20 at 17:25
  • @md2perpe, true, you can define the number $e$ without any of that. But you need those things in order to show that the derivative of $e^x$ is $e^x$. – Barry Cipra Dec 24 '20 at 17:28
  • I like this approach because it highlights the geometric interpretation of $e$. – Joe Dec 24 '20 at 17:39
  • @BarryCipra. Define $\ln$ as above and set $\exp=\ln^{-1}$ so that $\ln(\exp(x)) = x.$ Taking derivative and using the chain rule gives $\exp'(x)\cdot 1/\exp(x)=1,$ i.e. $\exp'=\exp.$ However, to show that $\exp$ really gives a power, precisely $a^x = \exp(x\ln(a)),$ for $a>0,$ the properties of powers are needed. – md2perpe Dec 24 '20 at 18:26

You can approach this from the limit definition of the derivative

$$ f'(x) = \lim_{h \to 0} \dfrac{f(x + h) - f(x)}{h}$$

With $f(x) = e^x$ this becomes

$$f'(x) = e^x \cdot \lim_{h \to 0} \dfrac{e^h - 1}{h}$$

and you can use your method of choice to show that

$$\lim_{h \to 0} \dfrac{e^h - 1}{h} = 1$$

To do this we'll use the limit definition of $e$: $$\displaystyle e = \lim_{n \to \infty} (1 + 1/n)^n$$

or equivalently

$$e = \lim_{x \to 0} (1 + x)^{1/x}$$

Introduce a change of variables, $t = e^h - 1$. Rewrite this as $\ln(t + 1) = h$.

$$\begin{align} \lim_{h \to 0} \dfrac{e^h - 1}{h} &= \lim_{t \to 0} \dfrac{t}{\ln(t+1)} \\ &=\lim_{t \to 0} \dfrac{1}{\frac{1}{t}\ln(t+1)} \\ &= \lim_{t \to 0} \dfrac{1}{\ln(t+1)^{1/t}} \\ &= \dfrac{1}{\ln(e)} \\ &= 1 \end{align}$$
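Both limits in this argument are easy to probe numerically; here is a small Python sketch (illustrative only; `math.expm1` computes $e^h-1$ without the cancellation error a naive subtraction would suffer at tiny $h$):

```python
import math

# the limit (e^h - 1)/h at shrinking h; it approaches 1
for h in (0.1, 1e-3, 1e-6):
    print(h, math.expm1(h) / h)

# the equivalent limit (1 + t)^(1/t) -> e used in the change of variables
for t in (0.1, 1e-3, 1e-6):
    print(t, (1 + t) ** (1 / t))
```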

BobaFret

One way to do this without making any assumptions is by starting with the definition of the natural logarithm, and working from there. This approach is outlined by Barry Cipra (+1). I posted an answer to How can we come up with the definition of natural logarithm?, which I will reproduce here. Although it seems unnatural to start with the logarithm and use it to define exponents, this approach is easy to make rigorous and can also be made fairly intuitive. As I mention in the post, the reason we might choose to do this is because it is fairly difficult to define something like $2^\pi$. It's clear what $2^x$ would mean for positive integers, even for rational numbers, but not for irrational numbers. Starting with the logarithm, we can avoid this issue.


The study of the exponential function and the logarithm is motivated by the desire to find a way of modelling a particular kind of growth, where the rate of change of a quantity is proportional to the value of that quantity at any given time. Consider how the population of a colony of bacteria might change over time—perhaps it doubles every hour. Or how your wealth compounds when you put it into a bank. In both cases, we are dealing with exponential growth. Later, when Calculus makes rigorous the notion of 'instantaneous change', we may define an exponential function as a solution to the differential equation $$ \frac{dy}{dx}=ky \, . $$ Solving this equation, we find that $$ kx+C = \int \frac{1}{y} \, dy \, . \tag{*}\label{*} $$ It seems like we have hit a roadblock, since we cannot integrate $1/y$ using the rule $$ \int y^n \, dy = \frac{y^{n+1}}{n+1}+C \, , $$ since that would lead to division by zero. However, the fundamental theorem of calculus tells us that the set of antiderivatives of $1/y$ must be $$ \int_{a}^{y} \frac{1}{t} \, dt \, . $$ This is a set of functions that differ by a constant as we vary the value of $a$. However, since there is already a constant on the LHS of $\eqref{*}$, we can assume that $a=1$ without loss of generality. This yields $$ kx+C = \int_{1}^{y} \frac{1}{t} \, dt \, . $$ Let $f(y)=\int_{1}^{y} \frac{1}{t} \, dt$. This function represents the area of the region bounded by the hyperbola $1/t$, the horizontal axis, and the vertical lines $t=1$ and $t=y$:

[Figure: geometric interpretation of $f(y)$ as the area under $1/t$ between $t=1$ and $t=y$]

As the value of $y$ gets larger, the area of this region also gets larger. We can then verify that $f$ is strictly increasing, meaning that $f^{-1}$ exists. Hence, $$ kx+C = f(y) \implies y = f^{-1}(kx+C) \, . $$ Since growth of this kind is called exponential growth, it only seems right to bestow upon the function $f^{-1}$ the new name $\exp$. The statement reads more nicely as $$ y=\exp(kx+C) \, . $$ Of all of the types of exponential growth, the most pleasing is that in which the rate of change of a quantity is equal to the value of that quantity (not just proportional to it). Since the set of solutions to $dy/dx = ky$ is $y=\exp(kx+C)$, the set of solutions to $dy/dx = y$ is $$ y=\exp(x+C) \, . $$ And of these solutions, you will no doubt agree that the simplest and most aesthetic is $$ y=\exp(x) \, , $$ where $\exp(x)$ is defined as the inverse of $f(x)=\int_{1}^{x} \frac{1}{t} \, dt$. For obvious reasons, this function is called the natural exponential function. There seem to be clear parallels between $\exp$ and functions of the form $g(x)=a^x$. If we return to our examples of exponential modelling—the colony of bacteria that doubles in size every hour—it certainly seems that $2^x$ is an exponential function. After all, the growth of a bacterial colony is proportional to its population at any given point in time. This suggests that $$ 2^x = \exp(kx+C) $$ for some values of $k$ and $C$. (So far, we haven't actually defined what $2^x$ means when $x$ equals $\sqrt{2}$, say, but we'll leave that to the side for the moment, and use an 'intuitive' definition of irrational exponents, where $2^{\sqrt{2}} \approx 2^{1.41421}$.) Suspecting that $a^x$ and $\exp(x)$ are somehow linked, we try to recall the essential properties of $a^x$, hoping that $\exp(x)$ shares these properties. The most important of these properties is that $$ a^x \cdot a^y = a^{x+y} \, . $$ And it turns out that $\exp(x)$ also has this property! 
In other words, $$ \exp(p) \cdot \exp(q) = \exp(p+q) \, . $$ However, proving this takes a little work. Since $\exp(x)$ is defined as the inverse of $f(x)=\int_{1}^{x}\frac{1}{t} \, dt$, it seems sensible to use this integral in our proof. Showing that $\exp(p) \cdot \exp(q) = \exp(p+q)$ is actually equivalent to showing that $f(r) + f(s) = f(rs)$, since \begin{align} &f(r) + f(s) = f(rs) \\ \iff &f^{-1}(f(r)+f(s)) = rs \\ \iff &f^{-1}(f(r)+f(s)) = f^{-1}(f(r)) \cdot f^{-1}(f(s)) \\ \iff &\exp(p+q) = \exp(p) \cdot \exp(q) \end{align} (And this should make intuitive sense, too. Notice how $f$ changes multiplication into addition, and so it is only natural that $f^{-1}$ changes addition into multiplication.) There is also a slick way to prove that $f(r)+f(s)=f(rs)$: $$ \int_{1}^{rs}\frac{1}{t} \, dt = \int_{1}^{r}\frac{1}{t} \, dt + \int_{r}^{rs}\frac{1}{t} \, dt $$ Let $z=t/r$, meaning that $dz=\frac{1}{r}dt=\frac{z}{t}dt$. Then we have \begin{align} \int_{1}^{rs}\frac{1}{t} \, dt &= \int_{1}^{r}\frac{1}{t} \, dt + \int_{1}^{s}\frac{1}{z} \, dz \\ f(rs) &= f(r) + f(s) \, . \end{align} Now that we have established that $\exp(p) \cdot \exp(q)=\exp(p+q)$, it seems reasonable to conjecture that $\exp(x)$ is actually equal to $a^x$ for some base $a$. To find that base, we plug in $x=1$, yielding $\exp(1)$, which we will abbreviate as $e$. This means that $$ e^x = \exp(x) \, . $$ This notation, apart from being very convenient, is also entirely valid. To reiterate, $\exp(p) \cdot \exp(q)=\exp(p+q)$, and so $$ e^p \cdot e^q = e^{p+q} $$ as expected. Geometrically, $e$ represents the number $a$ for which $\int_{1}^{a}\frac{1}{t} \, dt = 1$. This property follows directly if one recalls the definition of $\exp$. And since $\log_a(x)$ is defined as the inverse of $a^x$, it is evident that $\log_e(x)$ is the inverse of $e^x$. 
But we know that the inverse of $\exp(x)$ is $f(x)=\int_{1}^{x}\frac{1}{t} \, dt$, and putting these statements together, we find that $$ \log_{e}(x) = \int_{1}^{x}\frac{1}{t} \, dt \, . $$ In fact, this logarithm is so important, so natural, that we often dispense with the base $e$ entirely and write $$ \log(x) = \int_{1}^{x}\frac{1}{t} \, dt \, . $$ Earlier, we mentioned the difficulty of defining $a^x$ when $x$ is an irrational number. The logarithm solves this problem. For rational $x$, we know that $$ a^x = e^{x\log(a)} \, . $$ But the RHS makes sense for all $x$, and so we may define $$ a^x = e^{x\log(a)} \, , $$ for $a>0$. And defining exponents in this way means that $$ a^x \cdot a^y = a^{x+y} \, , $$ still holds even for irrational exponents! The proof of this fact follows smoothly if we unravel the definition of $a^x$: $$ a^x = e^{x\log(a)} = \exp(x\log(a)) \, , $$ where $\exp(x)=\log^{-1}(x)$ and $\log(x) = \int_{1}^{x}\frac{1}{t} \, dt$. Armed with this knowledge, we return to the differential equation we were trying to solve earlier, and finally obtain the solution we were expecting. \begin{align} \frac{dy}{dx} &= ky \\ \int k \, dx &= \int\frac{1}{y} \, dy \\ kx + C &= \log(y) \\ y&=e^{kx+C} = e^{kx} \cdot e^{C} = Ae^{kx} \text{ where $A=e^C$} \end{align} Since $a^x$ simply represents $e$ being raised to another base, functions such as $2^x$ are also included in the solution set. In particular, $$ \frac{d}{dx}(a^x) = \frac{d}{dx}(e^{x\log(a)}) = e^{x\log(a)} \cdot \log(a) = a^x \cdot \log(a) \, , $$ and so $k=\log(a)$. Note that the derivative of $e^x$ is simply a special case of this where $\log(e)=1$. Hopefully, this should provide the motivation behind defining the logarithm as $\int_{1}^{x}\frac{1}{t} \, dt$.


There is another way of understanding why functions of the form $f(x)=a^x$ grow in the way they do. If we differentiate $a^x$ with respect to $x$ from first principles, we get \begin{align} \lim_{\Delta x \to 0}\frac{a^{x+\Delta x}-a^x}{\Delta x} &= a^x \cdot \lim_{\Delta x \to 0}\frac{a^{\Delta x} - 1}{\Delta x} \\ &= a^x \cdot f'(0) \, . \end{align} In other words, the growth rate is proportional to the value of $a^x$ at any given point, with the gradient at $x=0$ being the proportionality constant. The natural logarithm comes in handy here: $$ f'(0) = \lim_{\Delta x \to 0}\frac{a^{\Delta x} - 1}{\Delta x} \, . $$ Exercise: prove that this limit is equal to $\log(a)$. This is tricky if you want to avoid using L'Hôpital's rule.
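The exercise at the end can at least be checked numerically (a rough sketch, not a proof): a central-difference estimate of $f'(0)$ for $f(x)=a^x$ lands on $\log(a)$, and for $a=e$ it is $1$:

```python
import math

def slope_at_zero(a, h=1e-6):
    """Central-difference estimate of d/dx a^x at x = 0."""
    return (a ** h - a ** (-h)) / (2 * h)

for a in (2.0, math.e, 10.0):
    # the estimated slope matches log(a); for a = e it is 1
    print(a, slope_at_zero(a), math.log(a))
```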

Joe
  • (+1) nicely done! – Unknown Dec 24 '20 at 17:50
  • Thank you very much for this detailed answer, it's very well explained. I have one question though. You say that 'it seems reasonable to conjecture that $\exp(x)$ is actually equal to $a^x$ for some base $a$' but what is the proof for this conjecture? – A-Level Student Dec 24 '20 at 22:34
  • @A-LevelStudent When I say 'conjecture', I only mean it in a very informal sense. Really, this boils down to definitions: one definition of $a^x$ is $\exp(x\log(a))=e^{x\log(a)}$. We can then show that $a^n = \underbrace{a \cdot a \cdot a \cdot \ldots a}_{\text{$n$ times}}$, and so on. As a result, $2^5$ ends up being what we expect it to be. We can define $a^x$ in another way, too, which you might find more satisfying. For positive integers, $a^x=\underbrace{a \cdot a \cdot a \ldots a}_{\text{$x$ times}}$. Also, $a^0=1$. For negative integers, $a^{-x}=\frac{1}{a^x}$. For rational numbers... – Joe Dec 24 '20 at 22:46
  • $a^{p/q}=(\sqrt[q]a)^p$. (Throughout this program, we have been guided by the requirement that $a^m \cdot a^n$ 'should' be equal to $a^{m+n}$.) Finally, we could require that the graph of $y=a^x$ is continuous, so that the 'holes' are filled up. If you take this approach, then you might actually be able to prove $\exp(x)=e^x$, where you have $e^x$ in some other way. However, the way I did it took less work: I simply defined $e^x$ as $\exp(x)$. I just wanted to show you why this definition might be suitable, given what we think we know about the number $e$ and exponents in general. – Joe Dec 24 '20 at 22:50
  • @A-LevelStudent *There's a missing word in the last paragraph: I meant to say 'where you have defined $e^x$ in some other way'. I have also added another way to understand the derivative of $a^x$ using the limit definition to my answer. Have I answered all of your questions? – Joe Dec 24 '20 at 23:14
  • Thanks for your response. I'm not totally sure how we can define $a^x$ as $\exp(x\log(a))$. Doesn't it just have a very simple definition, that is completely separate from $e$? – A-Level Student Dec 25 '20 at 10:16
  • @A-LevelStudent No, the definition is not that simple. In the sentence beginning 'We can define $a^x$ in another way too', I mentioned that we have to come up with a separate definition for positive integers, zero, negative integers, rational numbers, and irrational numbers. This approach is unwieldy, even if it is intuitively obvious. It is simpler to have one concrete definition of $a^x$ when $a>0$: $a^x := \exp(x\log(a))$. From this one definition, all of the facts about exponentiation that we learnt in school follow smoothly... – Joe Dec 25 '20 at 10:44
  • @A-LevelStudent For example, it can be shown that $a^n=\underbrace{a \cdot a \cdot a \ldots a}_{\text{$n$ times}}$ if $n\in\mathbb{N}$. But notice that this is a theorem, not a definition. This is what makes the exponential function so hard to get your head around: it should fundamentally change what you think it means to raise a number to another number. The approach I'm using means that exponentiation boils down to an integral. Other approaches involve a power series. You can even define $a^x$ by making the graph continuous, as I mentioned earlier. But none of these approaches are easy. – Joe Dec 25 '20 at 10:46
  • I kind of get it now, thanks. How would we show by use of the exponential function that $a^n=a\cdot a\cdot\cdots a$ $n$ times? By substituting $n=1$ into the definition? – A-Level Student Dec 25 '20 at 11:17
  • @A-LevelStudent I'll give you a hint. Please try to solve it for yourself; if you are completely stuck, I might give you another hint. In any proof involving $a^n$ you must use the fact that $a^n=\exp(n\log(a))$. We have already shown that $\exp(p+q)=\exp(p) \cdot \exp(q)$. Use this fact to find an expression involving $a^n$, and then simplify that expression. – Joe Dec 25 '20 at 11:34
  • That's what I meant, I already had the proof :) Thanks for your brilliant explanations! – A-Level Student Dec 25 '20 at 11:37
  • I feel very bad for not accepting your answer; it is very informative and has been very helpful, thank you. I accepted a different answer as I found it simpler and didn't require a different way of defining things, but please don't take that the wrong way; your answer has been incredibly helpful, and has given me a completely new perspective on the exponential function, which I really appreciate! – A-Level Student Dec 25 '20 at 11:44
  • @A-LevelStudent That's completely fine. Having read that answer, I upvoted it too. This approach is also nice. But notice that it is more complicated than it looks. We define $e^x$ as $\lim_{n \to \infty}\left(1+\frac{1}{n}\right)^n$. Having shown that $e^x$ is strictly increasing, we define $\log(x)$ as the inverse, and then prove that $e^x$ is its own derivative. Once again, we can define $a^x$ as $e^{x\log(a)}$, and verify that it has the properties of exponents that we expect it to have. – Joe Dec 25 '20 at 11:51
  • @A-LevelStudent Sorry, I made a typo. $e^x$ should be $\lim_{n \to \infty}\left(1+\frac{x}{n}\right)^n$. $e$ is just a special case where $x=1$. – Joe Dec 25 '20 at 12:09
  • I see, thanks again. – A-Level Student Dec 25 '20 at 12:58

This is based on the definition

$$e = \lim_{n \to \infty} \left(1+\frac1n\right)^n$$

and the definition of $\ln$ as the inverse of $e^x$.

Let $y=e^x$, then

$$\begin{align*} x &= \ln y\\ \frac{dx}{dy} &= \lim_{h\to0}\frac{\ln(y+h)-\ln y}{h}\\ &=\lim_{h\to0}\ln\left(1+\frac{h}{y}\right)^\frac1h\\ &=\ln\lim_{h\to0}\left(1+\frac{h}{y}\right)^{\frac yh\cdot \frac1y}\\ &= \ln \left(e^\frac1y\right)\\ &= \frac1y\\ &= \frac1{e^x}\\ \frac{dy}{dx} &= e^x \end{align*}$$
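As a numerical illustration of the key limit in this derivation (a sketch of my own, not part of the proof): evaluate $\frac1h\ln\left(1+\frac{h}{y}\right)$ at small $h$ and compare it with $1/y$:

```python
import math

def dln(y, h=1e-6):
    """The limit from the answer: (ln(y+h) - ln y)/h = ln((1 + h/y)^(1/h))."""
    return math.log(1 + h / y) / h

for y in (0.5, 1.0, math.e):
    # dx/dy matches 1/y, hence dy/dx = y = e^x
    print(y, dln(y), 1 / y)
```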

peterwhy

Suppose we've already proved that $\frac{d}{dx} a^x = g(a) a^x$ for some function $g : \mathbb{R}_{> 0} \to \mathbb{R}$ (see this post for example). The question is then "Does there exist $a \in \mathbb{R}_{>0}$ such that $g(a) = 1$ ?"

Observe that, by the chain rule, for any $k \in \mathbb{R}$: $$ \frac{d}{dx} a^{kx} = k g(a) a^{kx} $$ On the other hand $$ \frac{d}{dx} a^{kx} = \frac{d}{dx} (a^k)^x = g(a^k) (a^k)^x = g(a^k) a^{kx} $$ Thus $kg(a) = g(a^k)$ for all $a \in \mathbb{R}_{>0}$ and $k \in \mathbb{R}$.

In particular, we may set $k = \frac{1}{g(a)}$ for some $a \in \mathbb{R}$ such that $g(a) \neq 0$ (if $g = 0$, then all exponential functions are constant, which we know is not true). Then $$ \frac{d}{dx} (a^k)^x = g(a^k) (a^k)^x = kg(a) (a^k)^x = (a^k)^x $$ Therefore, if $g(a) \neq 0$, $e \equiv a^{\frac{1}{g(a)}}$ is a number such that $$ \frac{d}{dx} e^x = e^x $$ By carrying the analysis further, we could convince ourselves that $a^{\frac{1}{g(a)}}$ is always the same number, i.e. that it does not depend on the choice of $a$.
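Numerically, both the relation $kg(a)=g(a^k)$ and the independence of $a^{\frac{1}{g(a)}}$ from $a$ are easy to observe. In this rough Python sketch, `g` is estimated by a central difference at $0$; that concrete choice is an assumption of the illustration, since the answer leaves $g$ abstract:

```python
import math

def g(a, h=1e-6):
    """Slope of a^x at x = 0 (central difference) -- plays the role of g(a)."""
    return (a ** h - a ** (-h)) / (2 * h)

# the derived functional equation k*g(a) = g(a^k)
a, k = 2.0, 3.0
print(k * g(a), g(a ** k))

# k = 1/g(a) gives a base a^k whose g-value is 1 -- the same number for every a
for a in (2.0, 5.0, 10.0):
    print(a, a ** (1 / g(a)))   # ≈ 2.71828... each time
```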

  • How do you know that $ \frac{d}{dx} a^{kx} = k g(a) a^{kx} $ without using the derivative of $e^x$? – A-Level Student Dec 24 '20 at 17:43
  • That's the chain rule. If you like, define $\exp_a (x) = a^x$. Suppose we know $\exp_a ' = g(a) \exp_a$ for some function $g$. Define $L_k(x) = kx$. Then the claim is that $(\exp_a \circ L_k)' = (\exp_a ' \circ L_k) L_k' = (( g (a) \exp_a )\circ L_k) k$. In other words, $\frac{d}{dx} a^{kx} = (g(a) a^{kx}) k = kg(a) a^{kx}$. Hopefully you see why we often abuse notation when using the chain rule. – Charles Hudgins Dec 24 '20 at 19:28
  • @A-LevelStudent forgot to tag you in the above comment. Does my comment answer your question? – Charles Hudgins Dec 24 '20 at 19:35
  • Sorry, what does that circular symbol mean? – A-Level Student Dec 25 '20 at 10:28
  • @A-LevelStudent $(f \circ g)(x) = f(g(x))$. So, symbolically, the chain rule says $(f \circ g)' = (f' \circ g) g'$. Or, in perhaps more familiar notation $\frac{d}{dx} f(g(x)) = f'(g(x)) g'(x)$. We set $\exp_a(x) = a^x$ and $L_k (x) = kx$. Then $(\exp_a \circ L_k)(x) = a^{kx}$ (check this). So $(\exp_a \circ L_k)' = (\exp_a ' \circ L_k)L_k ' = (g(a) \exp_a \circ L_k) k = k g(a) \exp_a \circ L_k$. In other words, $\frac{d}{dx} a^{kx} = k g(a) a^{kx}$. – Charles Hudgins Dec 25 '20 at 16:27

Let $y=e^x$.

Taking $\ln$ of both sides, we obtain

$\ln y=x\ln e=x$.

Differentiating w.r.t. $x$, we have

$$\frac{1}{y}\frac{dy}{dx}=1$$ $$\frac{dy}{dx}=y$$ $$\frac{d}{dx}e^x=e^x.$$

Unknown

Comment:

$y=e^x$

Due to definition of derivative we can write:

$$y'=\lim_{\Delta x\rightarrow 0}\frac{e^{x+\Delta x}-e^x}{\Delta x}=\lim_{\Delta x\rightarrow 0}e^x(\frac {e^{\Delta x}-1}{\Delta x})=e^x$$

Provided we prove that:

$\lim_{\Delta x\rightarrow 0} \frac {e^{\Delta x}-1}{\Delta x}=1$

sirous

The first time we are acquainted with exponentiation is with the idea of it as repeated multiplication. For $n \in \mathbb{N}$, we may define $$ a^n=\underbrace{a \cdot a \cdot a \cdot \ldots \cdot a}_{\text{$n$ times}} \, . $$ This helps us establish an important principle about exponentiation, $$ a^{x+y}=a^x \cdot a^y \, , $$ which will guide us in how we define $a^n$ when $n\not\in\mathbb{N}$. To begin with, $a^0$ should be equal to $1$, since $$ a^0\cdot a^n=a^n \implies a^0=\frac{a^n}{a^n}=1 \, . $$ For the same reason, $a^{-n}$ should be equal to $1/a^n$. If $n$ is a rational number $p/q$, then $$ a^n=(\sqrt[q]{a})^p \, . $$ It's worth noting at this point that this definition does not apply when the base $a$ is a negative number, since $(-5)^{1/2}$ is meaningless within the context of real numbers. It is possible to partially extend the definition, so that $a^{p/q}$ is meaningful when $q$ is an odd number, but from here on we will only define $a^n$ for $a\geq0$. We are still unsure about how to define $a^n$ for $n\in\mathbb{R}$, since $n$ could be an irrational number. Nevertheless, it seems natural that $a^{\sqrt{2}}$ should be approximately equal to $a^{1.41421}$, and this motivates the following definition for irrational exponents: $$ a^n=\lim_{m(\in\mathbb{Q})\to n} a^m \, . $$


Now that we have formally defined $a^x$ for $a\geq0$, $x\in\mathbb{R}$, we can begin to examine its properties. You will no doubt be familiar with the phrase 'exponential growth'—growth in which the rate of change of a quantity is proportional to the quantity at any given time. This means that an exponential function may be characterised as any solution to $$ y'(x) = cy(x) $$ for some constant $c$. The term 'exponential' can give us a clue as to which functions exhibit this property, namely the ones we have just studied! If $y(x)=a^x$, then \begin{align} \frac{dy}{dx} &= \lim_{\Delta x \to 0}\frac{a^{x+\Delta x}-a^x}{\Delta x} \\ &= a^x \lim_{\Delta x \to 0}\frac{a^{\Delta x}-1}{\Delta x} \end{align} Note that the final limit is simply $y'(0)$, meaning that $y'(x)$ does indeed equal $cy(x)$, with $c$ being equal to the slope of $a^x$ at the point $(0,1)$. This helps explain one of the classic definitions of the constant $e$: it is the unique number $a$ such that $$ \lim_{\Delta x \to 0}\frac{a^{\Delta x}-1}{\Delta x} = 1 \, . $$ This means that $e^x$ is its own derivative. This approach has two deficiencies, though:

  • We haven't proven that the limit $$\lim_{\Delta x \to 0}\frac{a^{\Delta x}-1}{\Delta x}$$ exists.
  • We haven't shown that there is such a number where the above limit is equal to $1$.

This can be remedied by appealing to one of the alternative definitions of $e$.
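The second deficiency can at least be made numerically plausible (an illustrative sketch of my own, not a proof of existence): the slope of $a^x$ at $0$ increases with $a$, so bisecting on $a$ for a slope of $1$ homes in on a familiar number:

```python
import math

def slope0(a, h=1e-6):
    """Central-difference estimate of lim (a^Δx - 1)/Δx, the slope of a^x at 0."""
    return (a ** h - a ** (-h)) / (2 * h)

# the slope at 0 is strictly increasing in a, so bisect for the base with slope 1
lo, hi = 2.0, 3.0
for _ in range(60):
    mid = (lo + hi) / 2
    if slope0(mid) < 1:
        lo = mid
    else:
        hi = mid
print(lo)   # ≈ 2.71828..., which is e
```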

Joe