
A second-order, linear, homogeneous ODE can be expressed as:

$$a(x)y''(x) + b(x)y'(x) + c(x)y(x) = 0$$

I have only a basic knowledge of differential equations. However, all the texts state that, in order to find a power-series solution about a specific point $x = x_0$, the functions

$\displaystyle \frac{b(x)}{a(x)}$ and $\displaystyle \frac{c(x)}{a(x)}$

must be analytic at $x_0$.

For example, from this document:

A linear, analytic equation is one for which the coefficient functions are analytic and therefore possess convergent power-series expansions [...]. The simple conclusion will be that the solutions also possess convergent power-series expansions.

Why, if the coefficient functions are analytic, is there a guarantee that the solution is analytic too, and vice versa?

They are only the coefficients of the unknown function $y(x)$ in that equation, not the function $y(x)$ itself, so how can they affect the function so heavily? If, for example, $a(x_0) = 0$, how can this also determine the form of $y(x)$ at $x_0$? I can't find the link between these two facts.

BowPark
  • That's only a computational requirement, which allows you to construct an analytic solution with the Frobenius method. It does not tell you anything about the existence of solutions in general. Maybe it affects the existence of analytic solutions; this I don't know, I must admit. – Giuseppe Negro Sep 26 '18 at 09:51
  • @GiuseppeNegro Sorry, I can't understand. The Frobenius method is used when already $b(x)/a(x)$ and/or $c(x)/a(x)$ are not analytic. The problem in the question seems to be "before" the Frobenius method. – BowPark Sep 26 '18 at 11:20
  • If $a(x_0)=0$ the equation does not have order $2$ in $x_0$; the domain of the order 2 ODE has to exclude $x_0$ and any other roots of $a$. As there is no ODE in $x_0$, there is no solution there and thus no solution expansion around $x_0$. You can still apply methods for singular points, but that is outside the scope of the given statement. – Lutz Lehmann Sep 26 '18 at 11:55
  • @LutzL I'm sure you are right. But I can't follow your steps. You state: «if $a(x_0) = 0$ the equation does not have order $2$ in $x_0$», and that's right. But then: «there is no ODE in $x_0$». There is no order $2$ ODE in $x_0$, but there is still an order $1$ ODE. – BowPark Oct 01 '18 at 13:05
  • The domain of an ODE is always an open set in the time-state space. Thus on the set $\{(x,y):a(x)=0\}$ you cannot define an ODE, as that set of lines is not open. – Lutz Lehmann Oct 01 '18 at 13:10
  • @LutzL I have only a basic knowledge of ODEs. What is the time-state space: the pairs $(x,y)$ (with $x$ as "time" and $y$ as "state") that satisfy the equation? Then: what do you mean by «the set of lines is not open»? Do you mean that, if there is a solution for $x \in (x_1, x_0) \cup (x_0, x_2)$, there cannot exist a solution valid only at the point $x_0$, because it is just a point and not an open interval? – BowPark Oct 01 '18 at 17:43

2 Answers


It's easier to see if we take it down to first order. Let's look at $$ a(x)y'(x) + b(x)y(x) = 0$$

To solve this equation, we can simply separate the variables and integrate, so $$ \int\frac{y'(x)}{y(x)}\text dx = c - \int \frac{b(x)}{a(x)}\text dx \\ \Rightarrow y(x) = C\exp\left( \int f(x)\text dx \right) = C\exp(F(x)) $$ where $f(x) = -\frac{b(x)}{a(x)}$ and $F'(x) = f(x)$. The question here is: when is the solution $y(x)$ an analytic function at some given point $x_0$? Now, a function that is analytic at $x_0$ must in particular have derivatives of all orders there, so let's find the derivatives of $y$. The expression for the $n$-th derivative of $\exp(F(x))$ is

$$\frac{\text d^n \exp(F(x))}{\text dx^n}\biggr|_{x=x_0} = \exp(F(x_0)) \sum_{k=0}^n \frac{1}{k!} \sum_{j=0}^k (-1)^j \binom{k}{j}F(x_0)^j \frac{\text d^n F(x)^{k-j}}{\text dx^n}\biggr|_{x=x_0} $$
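As a sanity check, this identity can be verified symbolically for small $n$; here is a minimal sympy sketch (an illustrative snippet, verifying the formula exactly as stated above):

```python
import sympy as sp

x = sp.symbols('x')
F = sp.Function('F')

# Check the n-th derivative formula for exp(F(x)), for a small n.
n = 2
lhs = sp.exp(F(x)).diff(x, n)
rhs = sp.exp(F(x)) * sum(
    sum((-1)**j * sp.binomial(k, j) * F(x)**j * (F(x)**(k - j)).diff(x, n)
        for j in range(k + 1)) / sp.factorial(k)
    for k in range(n + 1))
print(sp.simplify(lhs - rhs))  # 0, i.e. the two sides agree for n = 2
```

Evaluating both sides at $x = x_0$ gives the displayed formula.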

The only time this expression fails to exist for some $n$ is if the $p$-th derivative of $F$ doesn't exist for some $0\leq p\leq n$. In other words, $\exp(F(x))$ has derivatives of all orders iff $F(x)$ does (strictly speaking, this shows only that $y$ is $C^\infty$; analyticity also requires the Taylor series to converge to the function, which holds here because antiderivatives, products, and compositions of analytic functions are again analytic). Since $F$ is an antiderivative of $f$, $F$ is analytic iff $f$ is analytic, and recall that $f = -\frac{b(x)}{a(x)}$. Thus,

$$ y(x) \text{ is analytic } \Leftrightarrow \frac{b(x)}{a(x)} \text{ is analytic. } $$
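To see this correspondence concretely, here is a minimal sympy sketch; the example ODE $(1-x)y'-y=0$ is chosen for illustration, so that $b(x)/a(x)$ is analytic at $0$ but singular at $x=1$:

```python
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')

# Example: (1 - x) y' - y = 0, so b(x)/a(x) = -1/(1 - x): analytic at 0, singular at 1.
sol = sp.dsolve(sp.Eq((1 - x)*y(x).diff(x) - y(x), 0), y(x))
print(sol)  # y(x) = C1/(1 - x), up to the sign convention of the constant

# The solution's power series about 0 has radius 1, set by the coefficient singularity:
print(sp.series(1/(1 - x), x, 0, 6))  # 1 + x + x**2 + x**3 + x**4 + x**5 + O(x**6)
```

The singularity of $b(x)/a(x)$ at $x=1$ reappears as the singularity of the solution, exactly as the equivalence predicts.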

The same idea is true for higher-order equations; the only catch is that "integrating" a higher-order equation isn't as simple as actually integrating a function, as it was in the first-order case. However, making use of Picard iteration, we can show the exact same thing for higher-order equations (albeit with a lot more work).
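For a taste of the Picard route, here is a minimal sketch on a first-order scalar example (chosen purely for illustration, not the general higher-order argument): each iterate reproduces more Taylor terms of the analytic solution.

```python
import sympy as sp

x, s = sp.symbols('x s')

# Picard iteration for y' = 2x*y, y(0) = 1; the exact solution is exp(x**2).
yn = sp.Integer(1)  # y_0(x) = y(0) = 1
for _ in range(4):
    # y_{n+1}(x) = y(0) + integral from 0 to x of 2s * y_n(s) ds
    yn = 1 + sp.integrate(2*s*yn.subs(x, s), (s, 0, x))

print(sp.expand(yn))                     # 1 + x**2 + x**4/2 + x**6/6 + x**8/24
print(sp.series(sp.exp(x**2), x, 0, 9))  # matching Taylor polynomial of exp(x**2)
```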

Alex Jones
  • I like this exposition. You might add an explicit remark that you are not really proving that $y$ is analytic, you are just proving that $y$ is $C^\infty$; but the idea is clear. – Giuseppe Negro Sep 28 '18 at 09:03
  • It was somewhat tacitly implied: thank you for your answer, too. It is as valuable as the other one: if I could choose both of them, I would. – BowPark Oct 05 '18 at 17:41
  • Did you find the expansion of $\displaystyle \frac{\mathrm{d}^n \exp(F(x))}{\mathrm{d}x^n}$ on WolframAlpha or in another source? I was trying to verify it with a simple $n = 2$, but the formula is a bit complicated even in this case. Another question specifically asks for it, but the answer is even more complicated and maybe not suitable. – BowPark Oct 05 '18 at 17:44

Given a second-order ODE

$$ a(x)y''(x)+b(x)y'(x)+c(x)y(x)=0 \tag{1} \label{eq1} $$

we can introduce a new variable $z(x)=y'(x)$ and reduce the equation to a first-order system (the same idea works for higher-order equations):

$$ \begin{align} y'(x)-z(x)&=0 \\ a(x)z'(x)+b(x)z(x)+c(x)y(x)&=0 \\ \end{align}\tag{2} \label{eq2} $$

We can write the system in matrix-vector form:

$$ \mathbf{A}(x)\cdot\vec{Y}'(x)+\mathbf{B}(x)\cdot\vec{Y}(x)=0 \tag{3} \label{eq3} $$

where

$$ \mathbf{A}(x)=\left(\begin{matrix} 1 & 0 \\ 0 & a(x) \\ \end{matrix}\right) \tag{4} \label{eq4} $$

$$ \mathbf{B}(x)=\left(\begin{matrix} 0 & -1 \\ c(x) & b(x) \\ \end{matrix}\right)\tag{5} \label{eq5} $$

$$ \vec{Y}(x)=\left(\begin{matrix} y(x) \\ z(x) \\ \end{matrix}\right) \tag{6} \label{eq6} $$
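As a quick symbolic check (an illustrative sympy sketch), multiplying out the matrix form $\eqref{eq3}$ with these definitions recovers system $\eqref{eq2}$:

```python
import sympy as sp

x = sp.symbols('x')
a, b, c = [sp.Function(name)(x) for name in 'abc']
y, z = sp.Function('y')(x), sp.Function('z')(x)

A = sp.Matrix([[1, 0], [0, a]])   # eq. (4)
B = sp.Matrix([[0, -1], [c, b]])  # eq. (5)
Y = sp.Matrix([y, z])             # eq. (6)

# A*Y' + B*Y gives the two rows of system (2): y' - z and a*z' + b*z + c*y.
print(sp.expand(A*Y.diff(x) + B*Y))
```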

If $a(x)\neq 0$ in the integration domain, $\mathbf{A}$ is invertible, with $\mathbf{A}^{-1}$ given by

$$ \mathbf{A}^{-1}(x)=\left(\begin{matrix} 1 & 0 \\ 0 & \frac{1}{a(x)} \\ \end{matrix}\right) \tag{7}\label{eq7} $$

Using $\eqref{eq7}$ we can rewrite $\eqref{eq3}$ in the following way: $$ \vec{Y}'(x)=-\mathbf{A}^{-1}(x)\cdot\mathbf{B}(x)\cdot\vec{Y}(x)=-\mathbf{C}(x)\cdot\vec{Y}(x) \tag{8} \label{eq8} $$ where

$$ \mathbf{C}(x)=\left(\begin{matrix} 0 & -1 \\ \frac{c(x)}{a(x)} & \frac{b(x)}{a(x)} \\ \end{matrix}\right). \tag{9}\label{eq9} $$ Notice that $\mathbf{C}$ contains the ratios of the coefficients. If $\mathbf{C}(x_1)\cdot\mathbf{C}(x_2)=\mathbf{C}(x_2)\cdot\mathbf{C}(x_1)$ for all $x_1,x_2$ in the domain of $\mathbf{C}$, the solution to $\eqref{eq8}$ is similar to the first-order scalar case:

$$ \vec{Y}=\mathrm{e}^{-\int \mathbf{C}(x)\,\mathrm{d}x}\cdot\vec{K} \tag{10} \label{eq10} $$ where we use the matrix exponential and $\vec{K}$ is a constant vector. In particular, the commutativity condition holds when $\frac{c(x)}{a(x)}=k_1$ and $\frac{b(x)}{a(x)}=k_2$, where $k_1,k_2$ are constants. In the general case, the solution is given by the Magnus expansion (see ref. 1 and ref. 2).
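For a concrete case where the commutativity condition holds trivially, here is a minimal numerical sketch using scipy's matrix exponential; the test equation $y''+3y'+2y=0$ is chosen for illustration:

```python
import numpy as np
from scipy.linalg import expm

# y'' + 3y' + 2y = 0, i.e. a = 1, b = 3, c = 2, so C is constant and eq. (10) applies.
C = np.array([[0.0, -1.0],
              [2.0,  3.0]])  # C = [[0, -1], [c/a, b/a]], eq. (9)
Y0 = np.array([1.0, 0.0])    # initial data: y(0) = 1, y'(0) = 0

xval = 1.0
Y = expm(-C*xval) @ Y0       # eq. (10) with constant C: Y(x) = e^{-Cx} Y0

# Exact solution with these initial data: y(x) = 2e^{-x} - e^{-2x}
print(Y[0], 2*np.exp(-xval) - np.exp(-2*xval))  # the two values agree
```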

Perhaps it is easier to analyze the effect of the coefficients on a first-order equation: $$ y'=-\alpha(x)y $$

the solution in this case is

$$ y=\kappa\mathrm{e}^{-\int \alpha(x)\mathrm{d}x} $$

Therefore, the solution is the composition of the exponential function with $-\int \alpha(x)\,\mathrm{d}x$. Given a function that is $C^{n}$ at $x_0$, its integral is $C^{n+1}$ there. For example, consider the following step-like function: $$ \alpha(x)=\left\{\begin{matrix} 0 & x<0 \\ 1 & x\geq 0 \\ \end{matrix}\right. $$

It is discontinuous at $x = 0$; its integral in $(-\infty,x]$ is a $C^0$ function (i.e. a continuous function):

$$ \int_{-\infty}^x\alpha(s)\,\mathrm{d}s=\left\{\begin{matrix} 0 & x<0 \\ x & x\geq 0 \\ \end{matrix}\right. $$
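The smoothing effect of integration can be checked directly; a minimal sympy sketch:

```python
import sympy as sp

x = sp.symbols('x')
alpha = sp.Piecewise((0, x < 0), (1, True))  # the step function above: discontinuous at 0

A = sp.integrate(alpha, x)  # antiderivative: Piecewise((0, x < 0), (x, True))
print(A)
# The integral is continuous (C^0) at 0: both one-sided limits equal A(0) = 0.
print(sp.limit(A, x, 0, '-'), sp.limit(A, x, 0, '+'))
```

Regarding the questions: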

  1. Why, if the coefficient functions are analytic, is there a guarantee that the solution is analytic too, and vice versa? Given the integration of the matrix $\mathbf{C}$, you need the ratios of the coefficients to be analytic in order to get a solution which is analytic as well.

  2. They are only the coefficients of the unknown function $y(x)$ in that equation, not the function $y(x)$ itself, so how can they affect the function so heavily? Given the solution $\eqref{eq10}$, we see that the coefficients fully determine the solution. Therefore, they strongly affect the resulting solution (they are part of the solution).

  3. If, for example, $a(x_0)=0$, how can this also determine the form of $y(x)$ at $x_0$? In that case there is a singularity at $x_0$, and in general the solution won't be analytic there (at best it may be $C^n$ for some finite $n$); see the sketch below.
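A minimal sympy sketch of point 3 (the equation $xy''+y'=0$ is chosen for illustration, with $a(x)=x$ vanishing at $x_0=0$):

```python
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')

# a(x) = x vanishes at x0 = 0: consider x*y'' + y' = 0.
sol = sp.dsolve(sp.Eq(x*y(x).diff(x, 2) + y(x).diff(x), 0), y(x))
print(sol)  # y(x) = C1 + C2*log(x): the log term is not analytic at 0
```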

PabloG.
  • Thank you for the very detailed and very useful answer. A doubt: if the vector $\vec Y'(x)$ is $\left(\begin{matrix} y'(x) \\ z'(x) \end{matrix}\right)$, shouldn't the $\mathbf{A}(x)$ matrix be $\left(\begin{matrix} 1 & 0 \\ 0 & a(x) \end{matrix}\right)$, and so the inverse $\mathbf{A}^{-1}(x)$ be $\left(\begin{matrix} 1 & 0 \\ 0 & \displaystyle \frac{1}{a(x)} \end{matrix}\right)$? – BowPark Oct 01 '18 at 13:01
  • Thanks @BowPark, now it should be right. – PabloG. Oct 01 '18 at 23:29
  • Yes, it is, also as regards $\mathbf{C}(x)$. I tried to fix some typos and to add some information about the example $\alpha$. Thank you so much! – BowPark Oct 02 '18 at 08:22
  • The solution can only be written in terms of the matrix exponential when $\mathbf{C}(x)$ commutes with $\int \mathbf{C}(x) \text dx$. Thus, you need to show that this is the case in order to use that solution. – Alex Jones Oct 03 '18 at 21:03
  • Thanks @AlexanderJ93. I added some extra notes and references regarding your comment. – PabloG. Oct 03 '18 at 22:39