
To my knowledge there are two possible ways to define $e^x$

$$e^x = \sum_{i=0}^{\infty}\frac{x^i}{i!}$$

$$e^x = \lim_{n \to \infty} \left(1 + \frac{x}{n}\right)^n$$

So my question is: Why does…

$$\sum_{i=0}^{\infty}\frac{x^i}{i!} = \lim_{n \to \infty} \left(1 + \frac{x}{n}\right)^n$$
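(A quick numerical check, not an argument, suggests the two expressions really do agree; the Python sketch below uses the arbitrary choices $x = 1.7$ and a 50-term truncation of the series.)

```python
# Sanity check, not a proof: compare the truncated series with (1 + x/n)^n
# for increasing n, using math.exp as a reference value.
import math

x = 1.7  # arbitrary test value

series = sum(x**i / math.factorial(i) for i in range(50))  # first 50 terms

for n in (10, 100, 10_000, 1_000_000):
    print(f"n = {n:>9}: (1 + x/n)^n = {(1 + x / n) ** n:.10f}")

print(f"series (50 terms) = {series:.10f}")
print(f"math.exp(x)       = {math.exp(x):.10f}")
```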

Anixx
Remi.b

3 Answers


HINT: Use the General Binomial Theorem on the expression on the right-hand side.
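To make the hint concrete, here is a small sympy sketch (sympy and the fixed value $n = 20$ are assumptions made only for illustration): it expands the right-hand side for one $n$ and compares the coefficient of each $x^k$ with the series coefficient $1/k!$.

```python
# Illustration of the hint (sympy assumed available, n = 20 chosen arbitrarily):
# expand (1 + x/n)^n with the binomial theorem and compare the coefficient of
# each x^k with the series coefficient 1/k!.
import sympy as sp

x = sp.symbols('x')
n = 20

expansion = sp.expand((1 + x / sp.Integer(n)) ** n)  # exact rational coefficients

for k in range(6):
    rhs_coeff = expansion.coeff(x, k)            # coefficient of x^k on the RHS
    lhs_coeff = sp.Rational(1, sp.factorial(k))  # 1/k!, coefficient on the LHS
    print(f"k = {k}: RHS coeff {float(rhs_coeff):.6f}  vs  1/k! {float(lhs_coeff):.6f}")
```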

user88595
  • Mmmh… OK, I'll do some reading. I don't know about the General Binomial Theorem and I don't even know what RHS is the abbreviation of! Thanks for your answer :) Do you have some links that would help me? – Remi.b Apr 15 '14 at 11:24
  • Sorry about that, I'll edit immediately – user88595 Apr 15 '14 at 11:25
  • Unfortunately, the expansion of the right-hand side (RHS) for a given $n$ does not give any partial sum on the LHS. But the coefficients of each of the terms on the RHS do approach the coefficients of the terms on the LHS in the limit. So this is something of a hand-waving proof. – MPW Apr 15 '14 at 11:26

I'm not giving you the full formal proof (you can look that up in just about any calculus textbook), but the basic idea is to use
$$ (a + b)^n = \sum_{k=0}^n \binom{n}{k} a^k b^{n-k}, \text{ where } \binom{n}{k} = \frac{n!}{k!(n-k)!}, $$
to get
$$ \left(1 + \frac{x}{n}\right)^n = \sum_{k=0}^n \frac{n!}{n^k\,k!\,(n-k)!}\,x^k = \sum_{k=0}^n \frac{n(n-1)\cdots(n-k+1)}{n^k} \frac{x^k}{k!}. $$
Then, if $n \gg k$ (meaning $n$ is much larger than $k$), you have
$$ \frac{n(n-1)\cdots(n-k+1)}{n^k} \approx 1. $$
Thus, if you fix $K$ and let $n \to \infty$, each term involving $x^k$ with $0 \le k \le K$ in $\left(1 + \frac{x}{n}\right)^n$ converges to $\frac{x^k}{k!}$, i.e. to the corresponding term of the series.
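As a numerical sketch of that key step (the choices $k = 5$ and the listed values of $n$ are arbitrary), the prefactor $\frac{n(n-1)\cdots(n-k+1)}{n^k}$ can be watched tending to $1$ for a fixed $k$:

```python
# Numerical sketch of the step above: for fixed k, the prefactor
# n(n-1)...(n-k+1) / n^k tends to 1 as n grows (k and the n values are arbitrary).

def prefactor(n: int, k: int) -> float:
    """Return n(n-1)...(n-k+1) / n**k, the factor multiplying x^k/k!."""
    product = 1
    for j in range(k):
        product *= n - j
    return product / n**k

k = 5
for n in (10, 100, 1_000, 100_000):
    print(f"n = {n:>7}: prefactor = {prefactor(n, k):.8f}")
```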

fgp

The power series
$$ \sum_{n\ge0}\frac{x^n}{n!} $$
defines a function on the whole real line; let's call it “exp”. Since power series can be differentiated term by term, we see that $\exp'=\exp$ and also that $\exp 0=1$.

Consider now the function
$$ f(x)=\exp(a-x)\exp x. $$
We have
$$ f'(x)=-\exp(a-x)\exp x+\exp(a-x)\exp x=0, $$
so $f$ is constant and its value is $f(0)=\exp a$. Hence
$$ \exp(a-b)\exp b=\exp a $$
for all real $a$ and $b$. Setting $a-b=x$ and $b=y$, we can write the relation as
$$ \exp(x+y)=\exp x\exp y $$
and, in particular, $\exp(-x)=(\exp x)^{-1}$. So the function $\exp$ never takes the value $0$; since it is continuous and $\exp 0=1>0$, it is always positive. Its derivative $\exp'=\exp$ is therefore positive, so $\exp$ is increasing, hence invertible. Denote by $\log$ its inverse.

By the inverse function theorem, $\log'1=1$, that is,
$$ \lim_{h\to0}\frac{\log(1+h)-\log1}{h}=1. $$
However, $\log1=0$ by definition, so, for any $x$,
$$ x=\lim_{h\to0}x\frac{\log(1+h)}{h} $$
and, with the substitution $n=x/h$, we get
$$ \lim_{n\to\infty}n\log\left(1+\frac{x}{n}\right)=x. $$
For natural $n$, it's clear that $n\log t=\log t^n$ (by induction, using $\log a+\log b=\log(ab)$, which is just $\exp(x+y)=\exp x\exp y$ read through the inverse), so we have
$$ \lim_{n\to\infty}\log\left(\left(1+\frac{x}{n}\right)^n\right)=x, $$
which, applying the continuous function $\exp$ to both sides, amounts to saying that
$$ \lim_{n\to\infty}\left(1+\frac{x}{n}\right)^n=\exp x. $$
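As a numerical sketch of the final limit (with an arbitrary test value $x = 2.3$), one can watch $n\log\left(1+\frac{x}{n}\right)$ approach $x$ as $n$ grows:

```python
# Numerical sketch of the last limit: n * log(1 + x/n) -> x as n grows
# (x = 2.3 is an arbitrary test value).
import math

x = 2.3
for n in (10, 1_000, 100_000, 10_000_000):
    print(f"n = {n:>10}: n*log(1 + x/n) = {n * math.log(1 + x / n):.10f}")
print(f"target x = {x}")
```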

egreg