In Spivak's *Calculus*, several ways of defining the exponential function are presented. One exercise shows that the exponential function can be defined as the unique function satisfying \begin{align*} f'&=f\\ f(0)&=1 \end{align*} The exercise proves that there is at most one function satisfying these conditions, and it also establishes existence *given* that there exists a nonzero solution to $$f'=f.$$

Similarly, the textbook shows that the sine function can be defined as the unique function satisfying \begin{align*} f''+f&=0\\ f(0)&=0\\ f'(0)&=1 \end{align*} Again, the textbook shows that there is at most one solution, and establishes existence given a nonzero solution to $$f''+f=0.$$

My question is: without explicitly finding a solution to the above differential equations, is it possible to show that a nonzero solution must exist for each equation?
-
I think Spivak has given sufficient hints in the exercises to show that the solutions to these equations exist. – Paramanand Singh Jan 01 '18 at 09:56
-
For the exponential function see this answer: https://math.stackexchange.com/a/1292586/72031 For circular functions see this blog post: http://paramanands.blogspot.com/2016/03/theories-of-circular-functions-part-3.html In both these case a solution is explicitly created using integrals. – Paramanand Singh Jan 01 '18 at 11:19
3 Answers
For first-order equations see here. Basically you convert the initial-value problem to an integral equation, recast that as a fixed-point problem and show that a certain operator has a fixed point because it is a contraction.
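As an illustrative sketch of that fixed-point idea for the OP's equation $y'=y$, $y(0)=1$ (my own illustration, not part of the answer): the Picard map is $T(\varphi)(x)=1+\int_0^x \varphi(t)\,dt$, and iterating it starting from the constant function $1$ produces exactly the Taylor partial sums of $e^x$. Representing polynomials as coefficient lists keeps the integration exact:

```python
from fractions import Fraction

def picard_step(coeffs):
    """One application of the Picard map T(phi)(x) = 1 + ∫_0^x phi(t) dt
    for the IVP y' = y, y(0) = 1.  coeffs[k] is the coefficient of x^k."""
    # Integrate term by term: x^k  ->  x^(k+1) / (k+1).
    integ = [Fraction(0)] + [c / (k + 1) for k, c in enumerate(coeffs)]
    integ[0] += 1          # add the initial condition y(0) = 1
    return integ

phi = [Fraction(1)]        # phi_0: the constant function 1
for _ in range(4):
    phi = picard_step(phi)

print(phi)  # [1, 1, 1/2, 1/6, 1/24] -- the degree-4 Taylor partial sum of e^x
```

The contraction argument then shows these iterates converge (uniformly on a small interval) to a genuine solution, without ever naming that solution in advance.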
There are various tricks that reduce second-order equations to systems of first-order equations. In general, for the equation $$y''=f(x,y,y'),$$ set $Y=(y,y')$. Then $y''=f(x,y,y')$ is the same as $$Y'=(Y_2,f(x,Y_1,Y_2)),$$ a vector-valued first-order equation. Most proofs of existence for first-order equations work just as well for vector-valued equations.
For $y''+y=0$ specifically, note that this is the same as $$z'+iz=0,\, y'-iy=z.$$So if we can solve first-order equations then we can solve $y''+y=0$ by first solving $z'+iz=0$ and then plugging the solution into $y'-iy=z$ and solving that.
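To see the system reduction in action numerically (my own sketch, not part of the answer): write $y''+y=0$ as the first-order system $Y'=(Y_2,-Y_1)$ with $Y(0)=(0,1)$ and integrate with forward Euler; the computed solution approximates $\sin$.

```python
import math

def solve_sine(t_end=math.pi / 2, n_steps=100_000):
    """Integrate y'' + y = 0 as the first-order system
    (y, v)' = (v, -y) with y(0) = 0, v(0) = 1, via forward Euler."""
    y, v = 0.0, 1.0
    h = t_end / n_steps
    for _ in range(n_steps):
        y, v = y + h * v, v - h * y   # one explicit Euler step on the system
    return y

print(solve_sine())  # close to sin(pi/2) = 1
```

This is only a numerical illustration, of course; the existence proofs referenced above apply the same reduction but run the fixed-point argument on the vector-valued integral equation.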
-
Hmm, looking at the Wikipedia article, it seems like the solution to the IVP is limited to the interval $[t_0-a,t_0+a]$ where $a<1/L$, $L$ being the Lipschitz constant. In the case of $f(t,y)=y$ the Lipschitz constant is 1, so we can define the exponential function on the interval $[-1,1]$, but how can we extend this solution to the whole real line? – Hrhm Jan 05 '18 at 19:53
-
@Hrhm For linear equations there are better theorems. And for $y'=y$ it's clear that if there exists a solution locally then there is a global solution: if $y(t)$ is a solution, so is $y(t-c)$; hence we can extend solutions to longer intervals. – David C. Ullrich Jan 05 '18 at 20:26
I would write $$\frac{df}{dx}=f$$ and, for $f\ne 0$, separate variables to obtain $$\frac{df}{f}=dx.$$ Integrating this gives $$\ln(|f(x)|)=x+C,$$ so $f(x)=Ae^x$ for a nonzero constant $A$, and the condition $f(0)=1$ forces $A=1$.
A general answer is given by the Picard–Lindelöf theorem (https://en.wikipedia.org/wiki/Picard%E2%80%93Lindel%C3%B6f_theorem), which proves local existence of a solution to the initial-value problem for an ODE with a uniformly Lipschitz right-hand side (this includes your example, since the map $y \mapsto y$ is trivially Lipschitz). A solution to the initial-value problem cannot be identically zero, since then it would not satisfy the initial condition.