
I am looking for a proof that the following $n$th order ODE $$ y^{(n)}(t)+p_{n-1}(t)y^{(n-1)}(t)+\cdots+p_1(t)y'(t)+p_0(t)y(t)=g(t) $$ for $t\in (a,b)$, with $p_i$ and $g$ continuous on $(a,b)$ and $$ y(t_0)=y_0,\;y'(t_0)=y_0',\;\dots,\;y^{(n-1)}(t_0)=y_0^{(n-1)} $$ for some $t_0\in (a,b)$, has a unique solution on the entire interval $(a,b)$. This is as stated in Boyce and DiPrima; however, a proof is not offered. Furthermore, I dug up the referenced text and couldn't find a proof there either (the edition was different, so it is possible I missed it). Hirsch, Smale, and Devaney just assert a result for coefficients that do not depend explicitly on $t$ and then move on to fully nonlinear equations and dynamics. I am hoping for a more nitty-gritty proof. For convenience, I would be fine with taking $n=2$ if the argument extrapolates nicely.

My idea to proceed: we can certainly reduce this to a system of first-order ODEs (and I think it suffices to examine the homogeneous version, by variation of parameters), and then try to prove existence and uniqueness in the first-order case.
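Concretely, the standard reduction looks like this: setting $x_1=y,\;x_2=y',\;\dots,\;x_n=y^{(n-1)}$ turns the scalar equation into the first-order system $$ x'(t)=A(t)x(t)+\begin{pmatrix}0\\\vdots\\0\\g(t)\end{pmatrix},\qquad A(t)=\begin{pmatrix}0&1&&\\ &\ddots&\ddots&\\ &&0&1\\ -p_0(t)&-p_1(t)&\cdots&-p_{n-1}(t)\end{pmatrix}, $$ with initial condition $x(t_0)=\big(y_0,y_0',\dots,y_0^{(n-1)}\big)^T$, so it is enough to prove existence and uniqueness for a first-order linear system with continuous coefficient matrix.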

However, what troubles me is that I don't see how to use Picard's theorem to get existence and uniqueness on the whole interval on which all the $p_i$ are continuous. Wouldn't we need to restrict to a (sufficiently small) compact subinterval to make sure the integral operator we define is a contraction, so that the Banach fixed-point theorem argument goes through?
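To make the concern precise (a rough sketch, with $A(t)$ the companion matrix above and $(TX)(t)=\int_{t_0}^t A(s)X(s)\,\mathrm ds$): the crude sup-norm estimate on a subinterval $[t_0,t_0+h]$ is only $$ \|TX_1-TX_2\|_\infty\leq h\,\Big(\sup_{[t_0,t_0+h]}\|A(s)\|\Big)\,\|X_1-X_2\|_\infty, $$ which makes $T$ a contraction only when $h\sup\|A\|<1$, i.e. on a sufficiently small compact subinterval. Covering the whole interval this way requires either patching together such subintervals or a sharper estimate on the iterates.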

I confess I haven't worked out all the details myself, so I apologize if I missed something obvious. I am also not that familiar with the Wronskian, and maybe that's where the solution will come from.


1 Answer


I am finally getting back around to this, but it's not too hard a proof with a little functional analysis.

We examine the order $2$ case, since the order $n$ case follows similarly. We seek a solution of $$ y''(t)+p(t)y'(t)+q(t)y(t)=g(t),\\ y(t_0)=y_0,\;y'(t_0)=y_1,\qquad t_0\in (a,b), $$ on the interval $(a,b)$, where $p,q,g$ are all continuous (actually $p,q,g\in L^1_{loc}(a,b)$ suffices if we are OK with a solution satisfying the equation almost everywhere). We prove there is a unique solution $y$, which is $C^2$ (in the weakened setting, $y'$ is merely absolutely continuous and the equation holds a.e.).

Through the usual trade of order for dimension, we may examine the system $$ \begin{pmatrix}y(t)\\y'(t) \end{pmatrix}'= \begin{pmatrix} 0&1\\-q(t)&-p(t)\end{pmatrix} \begin{pmatrix}y(t)\\y'(t) \end{pmatrix}+\begin{pmatrix} 0\\g(t)\end{pmatrix}, $$ and integrating up to some $t\in (t_0,b)$ yields the equivalent integral equation $$ \begin{pmatrix}y(t)\\y'(t) \end{pmatrix}= \int_{t_0}^t\begin{pmatrix} 0&1\\-q(s)&-p(s)\end{pmatrix} \begin{pmatrix}y(s)\\y'(s) \end{pmatrix}\mathrm ds+\begin{pmatrix} y_0\\y_1+\int_{t_0}^tg(s)\mathrm ds\end{pmatrix}\\ \implies Y(t)=(TY)(t)+G(t), $$ where $$ Y(t)=\begin{pmatrix}y(t)\\ y'(t) \end{pmatrix},\qquad P(t)=\begin{pmatrix} 0&1\\-q(t)&-p(t)\end{pmatrix},\\ (TY)(t)=\int_{t_0}^tP(s)Y(s)\mathrm ds,\qquad G(t)= \begin{pmatrix} y_0\\y_1+\int_{t_0}^tg(s)\mathrm ds\end{pmatrix}. $$ So, solving for $Y(t)$ amounts to finding $(I-T)^{-1}$. The inverse is given by the Neumann series $$ \sum_{n=0}^\infty T^n. $$ There are details to verify about the convergence of the above, but it follows from an induction argument establishing the bound $$ |(T^nG)(t)|\leq \frac{\left(\int_{t_0}^t\|P(s)\|\mathrm ds \right)^n}{n!}\|G\|_\infty $$ for any $G$ continuous on a closed interval $[t_0,d]$ with $d<b$, where $t\in[t_0,d]$ and $\|\cdot\|_\infty$ denotes the sup norm on $[t_0,d]$.
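For completeness, here is a sketch of the induction step behind that bound: assuming it holds for $n$, $$ |(T^{n+1}G)(t)|\leq\int_{t_0}^t\|P(s)\|\,|(T^nG)(s)|\,\mathrm ds \leq \frac{\|G\|_\infty}{n!}\int_{t_0}^t\|P(s)\|\left(\int_{t_0}^s\|P(r)\|\,\mathrm dr\right)^n\mathrm ds =\frac{\left(\int_{t_0}^t\|P(s)\|\,\mathrm ds\right)^{n+1}}{(n+1)!}\,\|G\|_\infty, $$ since the last integrand is exactly the $s$-derivative of $\frac{1}{n+1}\left(\int_{t_0}^s\|P(r)\|\,\mathrm dr\right)^{n+1}$. Comparing with the exponential series, $\sum_n T^nG$ then converges absolutely and uniformly on $[t_0,d]$, with $\big\|\sum_n T^nG\big\|_\infty\leq e^{\int_{t_0}^d\|P(s)\|\mathrm ds}\,\|G\|_\infty$.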

Thus, $$Y(t)=\sum_{n=0}^\infty (T^nG)(t)$$ is the required solution to the integral equation, and it is continuous by the uniform convergence our bound provides. Differentiating the integral equation (its right-hand side is $C^1$ since $P$, $Y$, and $g$ are continuous) recovers the first-order system, and its first component $y$ is then a $C^2$ solution of the original initial value problem. Since $d<b$ was arbitrary, this works on all of $[t_0,b)$, and a symmetric argument for the left part of the interval gives the solution to the differential equation on the whole of $(a,b)$.
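As for uniqueness, a sketch using the same bound: if $Y_1$ and $Y_2$ both solve the integral equation on $[t_0,d]$, then $D:=Y_1-Y_2$ satisfies $D=TD$, hence $D=T^nD$ for every $n$, and so $$ \|D\|_\infty\leq\frac{\left(\int_{t_0}^d\|P(s)\|\,\mathrm ds\right)^n}{n!}\,\|D\|_\infty\xrightarrow{\;n\to\infty\;}0, $$ forcing $D\equiv 0$ on $[t_0,d]$. Since any solution of the initial value problem yields a solution of the integral equation, and $d<b$ (and likewise the left endpoint) was arbitrary, the solution is unique on all of $(a,b)$.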

A good reference for this is Teschl's Mathematical Methods in Quantum Mechanics, Chapter 9, although his proof is specifically for Sturm-Liouville problems.
