I understand that if we assume an analytic solution, we can look at the Taylor series and arrive at the unique solution $y = \exp(x)$. However, how do we know that there are no other, non-analytic solutions? (Ideally with as little analysis machinery as possible.)
-
If we can assume $\int \frac{1}{x}\,dx = \ln x + c$, it’s a straightforward task. – Vishu May 02 '20 at 11:55
-
See this answer https://math.stackexchange.com/a/1292586/72031 – Paramanand Singh May 02 '20 at 12:29
-
One can develop a full theory of the exponential function based on the definition that $y=\exp(x)$ is the unique solution to $y'=y,\ y(0)=1$. This works for complex variables too, although somewhat differently than explained in the linked answer. – Paramanand Singh May 02 '20 at 12:40
4 Answers
Suppose $g'(x)=g(x)$ for all $x\in\mathbb R.$ $$ \underbrace{\frac d {dx}\,\frac{g(x)}{e^x} = \frac{e^x g'(x) - e^x g(x)}{e^{2x}}}_\text{quotient rule} = 0 \text{ for all } x\in\mathbb R. $$ Therefore $x\mapsto g(x)/e^x$ is constant on $\mathbb R.$
So $g(x) = \text{constant}\cdot e^x$ for $x\in\mathbb R.$
(The mean value theorem is tacitly used here, in that it is used in the proof that if the derivative of a function is $0$ everywhere in an interval, then the function is constant on that interval.)
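(Not part of the proof, but if you want to see the quotient-rule step checked by a machine, here is a minimal SymPy sketch, assuming SymPy is available; the function name $g$ is just the one from the argument above.)

```python
# Symbolic sanity check of the quotient-rule step: impose g'(x) = g(x)
# and confirm that d/dx [ g(x)/e^x ] simplifies to 0.
import sympy as sp

x = sp.symbols('x')
g = sp.Function('g')

expr = sp.diff(g(x) / sp.exp(x), x)             # derivative before using the ODE
expr = expr.subs(sp.Derivative(g(x), x), g(x))  # substitute g'(x) = g(x)
print(sp.simplify(expr))                        # prints 0
```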
You can also make an argument that does not require you to know anything about the logarithm. Suppose that $f$ and $g$ are two solutions to $y'=y$ with $f(x_0)=g(x_0)\neq 0$.
Then, by continuity, there exists $\delta>0$ such that $(x_0-\delta,x_0+\delta)\cap g^{-1}(\{0\})=\emptyset$. For $x\in (x_0-\delta,x_0+\delta),$ we have $$ \left(\frac{f}{g}\right)'(x)=\frac{f'(x)g(x)-f(x)g'(x)}{g^2(x)}=\frac{f(x)g(x)-f(x)g(x)}{g^2(x)}=0, $$ so $f/g$ is constant on $(x_0-\delta,x_0+\delta)$; since $(f/g)(x_0)=1$, this gives $f=g$ on $(x_0-\delta,x_0+\delta)$. In fact, applying continuity, $f=g$ on $[x_0-\delta,x_0+\delta]$, and iterating this, we get that $f=g$ on the largest interval $I$ containing $x_0$ such that $g(y)\neq 0$ for all $y\in I$. Picking $g(x)=\exp(x)$, we get that $I=\mathbb{R}$, and so $f(x)=\exp(x)$.
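(Again purely as an illustration, not a proof: the following Python sketch integrates $y'=y$ with a classical RK4 step from the arbitrary starting value $y(0)=2.5$, a value chosen only for illustration, and prints $y(x)/e^x$, which stays essentially constant, exactly as the ratio argument predicts.)

```python
# Numerical illustration: any solution of y' = y keeps y(x)/e^x constant.
import math

def solve_y_prime_equals_y(y0, x_end, steps):
    """Integrate y' = y from y(0) = y0 with classical RK4; return (x, y) pairs."""
    h = x_end / steps
    x, y = 0.0, y0
    points = [(x, y)]
    for _ in range(steps):
        k1 = y               # slope at the start of the step (f(x, y) = y)
        k2 = y + h * k1 / 2  # slope at the midpoint, first estimate
        k3 = y + h * k2 / 2  # slope at the midpoint, second estimate
        k4 = y + h * k3      # slope at the end of the step
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        x += h
        points.append((x, y))
    return points

for x, y in solve_y_prime_equals_y(y0=2.5, x_end=3.0, steps=300)[::100]:
    print(f"x = {x:.2f}   y(x)/e^x = {y / math.exp(x):.8f}")  # stays ~2.5
```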

$\frac{dy}{dx} = y \Rightarrow \frac{dy}{y} = dx$ (assuming $y \neq 0$) $\Rightarrow \int \frac{dy}{y} = \int dx \Rightarrow \ln|y| = x + c \Rightarrow y = C \cdot e^{x}$.
Now, because $y(0) = 1$, we get $C = 1$.
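(As a quick sanity check, and assuming a reasonably recent SymPy with the `ics` keyword, a computer algebra system reproduces the same result; this is an illustration, not part of the answer.)

```python
# Solve the initial value problem y' = y, y(0) = 1 symbolically.
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')

sol = sp.dsolve(sp.Eq(y(x).diff(x), y(x)), y(x), ics={y(0): 1})
print(sol)  # Eq(y(x), exp(x))
```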

Here is a proof using only the limit definitions of the derivative and of the exponential function, respectively $$ y'(x)=\lim_{h\rightarrow0}\left(\frac{y(x+h)-y(x)}{h}\right),~~(1) $$ and $$ e^x =\lim_{n\rightarrow \infty} \left(1+\frac{x}{n}\right)^{n}.~~(2)$$
From Eq.$~(2)$ we see that $e^x$ can also be written as $e^x =\lim_{n\rightarrow \infty} \left(1+\frac{1}{n}\right)^{nx}$.
Then, using Eq.$~(1)$ we convert the ODE $y'(x)=y(x)$ with initial condition $y(0)=1$ into the recurrence $$y(x+h)-(1 + h)\, y(x)=0,$$ which can be solved (assuming a power law $y\sim r^x$, so that $r=(1+h)^{1/h}$) to yield, after putting back the limit, $$y(x)=\lim_{h\rightarrow0}(1+h)^{x/h}=\exp(x),$$ where in the last step we used Eq.$~(2)$ and its variant.
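(If it helps, here is a tiny numerical check, added only as an illustration with the arbitrary choice $x=2$, showing $(1+h)^{x/h}$ approaching $e^x$ as $h\to 0$.)

```python
# Numerical check that (1 + h)^(x/h) approaches e^x as h -> 0.
import math

x = 2.0
for h in (0.1, 0.01, 0.001, 1e-6):
    approx = (1 + h) ** (x / h)
    print(f"h = {h:g}   (1+h)^(x/h) = {approx:.8f}   e^x = {math.exp(x):.8f}")
```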
