
I've started learning ODEs on my own, and here is something that I don't understand. I've noticed that the book I am following (and all the other books that I have) is hand-wavy when it comes to specifying the interval of the solution and doesn't really worry too much about dividing by $0$. I will provide an example: let's solve the ODE $$t^2x'=x^2+tx+t^2,$$ where $x=x(t)$.
We divide by $t^2$ and the equation becomes $x'=\left(\frac{x}{t}\right)^2+\frac{x}{t}+1$. We make the variable change $y=\frac{x}{t}$ and after some computations we get that $\arctan y=\ln t+C$ for some constant $C\in \mathbb{R}$. Now, the book says that this implies that $y=\tan(\ln t+C)$, so $x=t\tan(\ln t+C), C\in \mathbb{R}$. I have two questions here:

  1. Why can we divide by $t^2$ at the beginning? I mean, yes, I agree that this solves our equation, but aren't we kind of missing some solutions? Here is the first philosophical problem that I have with ODEs: is the focus on somehow obtaining a solution (even though we make some assumptions along the way) that is defined on some interval $I\subset \mathbb{R}$, however small, rather than on trying to find all the differentiable functions that satisfy our identity (as the focus was in, say, functional equations that appear in high school math contests)?
  2. Why, after $\arctan y=\ln t+C$, may we write that $y=\tan(\ln t+C)$ for any real constant $C$? I mean, the $\tan$ function is not defined everywhere, and we most certainly can choose some $C$ such that for some $t$ we have $\ln t+C=\frac{\pi}{2}$, for instance. Is the philosophy here the same as the one I presented in 1, i.e. assuming that the interval on which our solution is defined is chosen appropriately so that everything makes sense?
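(For reference, here is the routine computation the book skips, reconstructed from the substitution; I assume $t>0$ throughout.) Since $x=ty$, we have $x'=y+ty'$, so the equation $x'=y^2+y+1$ becomes
$$y+t\frac{dy}{dt}=y^2+y+1 \quad\Longleftrightarrow\quad t\frac{dy}{dt}=y^2+1,$$
which separates as
$$\int\frac{dy}{1+y^2}=\int\frac{dt}{t} \quad\Longrightarrow\quad \arctan y=\ln t+C.$$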
TheZone
    I think when you take a class on ODEs or read a good textbook on the subject, you will find that dividing by $t$ when $t=0$ is treated with caution and rigor, i.e. we don't do that. The term for an ODE whose leading coefficient is zero at the initial value is a singular ODE (so called because the differential equation may not hold at the initial value, since the derivative of the solution need not exist there). Your second part is not particularly connected to the topic of ODEs. Inverse trigonometric functions are often studied in high school courses. – hardmath Jul 18 '21 at 16:12
  • @hardmath thank you! So the answer to my first question is that the book I am following is not all that rigorous and probably assumes that $t\ne 0$ implicitly whenever they do such things. In the second part what I meant is the following: let's say that we take $t=1$ and $C=\frac{\pi}{2}$. Then we would have $\arctan y(1)=\frac{\pi}{2}$, which is not possible. Hence, is the book again assuming that the interval on which the function $y$ is defined is chosen such that we don't run into situations like the one from above? – TheZone Jul 18 '21 at 16:18
  • You should probably identify which book doesn't seem to treat the singular differential equation with clarity or (perhaps) rigor, so readers can perhaps offer a knowledgeable judgement. The differential equation should be specified to hold on a domain, which might exclude $t=0$. A typical case involves second-order differential equations where the leading coefficient vanishes at the initial value (and some interesting things happen there). – hardmath Jul 18 '21 at 16:22
  • Here, I explicitly created a separate case when dividing by $0,$ but hand-waving the matter away turns out not to make a difference to the final solution. This is frequently the case, and probably one of the reasons why the hand-waving is common practice? – ryang Jul 18 '21 at 16:28
  • @hardmath I am reading a book made from some lecture notes in my native language (which in hindsight may have been a bad choice; I should have gone with a more reputable book), so most of the people here wouldn't be able to read it. I understand, however, that if the problem were properly stated I would have a domain for everything and there would be no ambiguities. – TheZone Jul 18 '21 at 16:32
  • You (or the book) make the assumption that $+C$ can be anything, when that is frequently not the case. The OP in this question made the same mistake, which I pointed out and explained. – Ninad Munshi Jul 18 '21 at 17:13
  • Also, to answer your question: the maximal interval of existence of an ODE is unknown beforehand, and all of the techniques simply try their best to come up with a solution. Finding all possible solutions on all intervals is often impossible. An ODE has singular points wherever the coefficient of the leading-order derivative is $0$. – Ninad Munshi Jul 18 '21 at 17:17
  • @NinadMunshi thank you very much, you've answered my questions completely! – TheZone Jul 18 '21 at 18:16

1 Answer


Whenever you manipulate any equation of any kind, you ought to say first, "Suppose there is a solution," because of course some problems have no solution, and you won't know until you have investigated. In your example, it could be: "Suppose there is a differentiable function, defined on some interval not containing zero, that satisfies the equation." This is often not stated in introductory texts.

After all, the real test of any proposed solution formula is not what you might have written on scratch paper to obtain it, but to plug it into the ODE and check that it really makes sense and is a solution having whatever initial or boundary values you were looking for.
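For instance (a sketch of such a check, not part of the original answer), one can plug the proposed formula back into the equation symbolically with sympy:

```python
# A sketch (not from the original answer): symbolically plug the proposed
# solution x(t) = t*tan(ln t + C) back into t^2 x' = x^2 + t x + t^2.
import sympy as sp

t, C = sp.symbols('t C', positive=True)  # t > 0, so ln t is defined
x = t * sp.tan(sp.log(t) + C)

# Left side minus right side should vanish wherever tan is defined.
residual = sp.simplify(t**2 * sp.diff(x, t) - (x**2 + t*x + t**2))
print(residual)  # 0
```

The check works because $x'=\tan u+\sec^2 u$ with $u=\ln t+C$, and $\sec^2 u=1+\tan^2 u$, so both sides agree on any interval where $\tan u$ is defined.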

Also, once you study the Fundamental Existence Theorem for ODEs, you find out that solutions of initial value problems are only guaranteed to exist on possibly short intervals around the initial condition. Your example illustrates this. In fact, the Fundamental Theorem is about equations $x' = f(x,t)$, so it only applies to your example after you exclude $t=0$ and divide.
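To make the "possibly short interval" concrete, here is a numerical illustration of my own (with the arbitrary choice $C=0$): the formula $x=t\tan(\ln t+C)$ only defines a solution while $\ln t+C$ stays inside $\left(-\frac{\pi}{2},\frac{\pi}{2}\right)$, i.e. on $\left(e^{-\pi/2-C},\,e^{\pi/2-C}\right)$.

```python
# Illustration (my own choice of C, not from the answer): with C = 0, the
# solution x(t) = t*tan(ln t) through (1, 0) lives only on (e^{-pi/2}, e^{pi/2}).
import math

C = 0.0
t_min = math.exp(-math.pi / 2 - C)  # left endpoint, about 0.208
t_max = math.exp(math.pi / 2 - C)   # right endpoint, about 4.810

def x(t):
    return t * math.tan(math.log(t) + C)

print(x(1.0))            # the initial value x(1) = 0
print(x(0.999 * t_max))  # already huge: the solution blows up as t -> t_max
```

So even though the differential equation itself looks harmless for all $t>0$, the solution through $(1,0)$ cannot be continued past $t=e^{\pi/2}\approx 4.81$.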

I think this covers both your questions.

I'll add a further note about what you might call philosophy. ODEs are an old subject going back to Newton and others, who solved sometimes very difficult equations, most of which had serious applications to science, and did so before "number" and "function" were even defined in the modern sense. Some of the early methods and terminology have been retained for 350 years in texts, so they can sound pretty odd if you compare them to your abstract algebra course, which was invented after number and function were well understood.

Bob Terrell