$x\in\mathbb{R}$. I can prove that both sides converge, and maybe we should show that $\limsup\limits_{n\rightarrow\infty}\left|\sum\limits_{k=0}^{n}\frac{x^k}{k!}-\left(1+\frac{x}{n}\right)^n \right|<\varepsilon$ for every $\varepsilon>0$, but handling the right-hand side is a little tricky for me.
- Show that they both solve the differential equation $f'(x) = f(x)$ with initial condition $f(0) = 1$ (the solution is unique; this isn't hard to prove). The LHS solves it by finding its Taylor series and the RHS solves it using Euler's method. – Qiaochu Yuan Nov 18 '18 at 22:15
- Expand the right side (binomial expansion) and match up term by term (powers of $x$) to the left side. – herb steinberg Nov 18 '18 at 22:24
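To flesh out the binomial suggestion above, here is a minimal sketch of the term-by-term limit (passing the limit through the growing sum still needs justification, e.g. via Tannery's theorem):
$$\binom{n}{k}\frac{x^k}{n^k}=\frac{x^k}{k!}\prod_{j=0}^{k-1}\left(1-\frac{j}{n}\right)\xrightarrow[n\to\infty]{}\frac{x^k}{k!}\quad\text{for each fixed }k,$$
so every term of $\left(1+\frac{x}{n}\right)^n=\sum\limits_{k=0}^{n}\binom{n}{k}\frac{x^k}{n^k}$ tends to the corresponding term of the series on the left.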
1 Answer
Rather than proving this directly, I'd proceed as follows:
(1) Define $\exp(x)$ to be the unique solution to $y' = y$ with $y(0) = 1$. Let $\log$ denote its inverse (which exists because $\exp$ is increasing).
(2) Use uniqueness of $\exp$ to show that $\exp(x + y) = \exp(x) \exp(y)$ and thus $\log(xy) = \log(x) + \log(y)$.
(3) Use the defining differential equation of $\exp$ to show that $\exp(x) = \sum x^n/n!$.
(4) Use the definition of $\log$ to show that $\log'(1) = 1$, then compute the logarithm of the right-hand side and evaluate its limit with l'Hôpital's rule (sketched below).
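For step (4), a minimal sketch, assuming $x$ is fixed (so $1+\frac{x}{n}>0$ for all large $n$) and substituting $t=\frac{1}{n}$:
$$\lim_{n\to\infty}n\log\left(1+\frac{x}{n}\right)=\lim_{t\to 0^{+}}\frac{\log(1+xt)}{t}\overset{\text{l'H}}{=}\lim_{t\to 0^{+}}\frac{x}{1+xt}=x,$$
where the last step uses $\log'(1)=1$. By continuity of $\exp$, it follows that $\left(1+\frac{x}{n}\right)^n=\exp\left(n\log\left(1+\frac{x}{n}\right)\right)\to\exp(x)$.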

anomaly