4

Some literature uses $dx$, in the context of differential equations, in a confusing way without defining what it really stands for:

$Mdx + Ndy = 0$

Does it mean one of the following or something else entirely?

$\int Mdx + \int Ndy = 0$

$Md/dx + Nd/dy = 0$

$Mx' + Ny' = 0$

Git Gud
  • 31,356
Lundis
  • 143
  • 4
    This notation choice is unfortunate, but to rigorously understand it, you would need to study differential forms. For computational purposes, your third choice is closest to what it means. – voldemort Jan 18 '15 at 15:47
  • 2
    For peace of mind, you can probably interpret $M\mathrm dx + N\mathrm dy = \bf 0$ as $M+Ny'=\bf 0$. – Git Gud Jan 18 '15 at 15:58
  • Am I mistaken in saying that this can be interpreted as $M+N\frac{dy}{dx}= 0$? – Tim Raczkowski Jan 18 '15 at 16:01
  • @TimRaczkowski Depends on context, in introductions to differential equations that is the only proper way to interpret it. Also I already said that in my comment. – Git Gud Jan 18 '15 at 16:02
  • Yes. I started typing before you posted. – Tim Raczkowski Jan 18 '15 at 16:04

1 Answer

2

This may not be the answer you were looking for, but a similar thing puzzled me some years ago: namely, why the technique of separation of variables can proceed through a meaningless intermediate equation that seems to be an abuse of notation: $$ \begin{align} \frac{dy}{dx}&=g(x)h(y)\\ &\iff\\ \frac{1}{h(y)}\ dy&=g(x)\ dx \end{align} $$ But later on I found that, from a historical perspective, this practice of treating $dx$ and $dy$ as algebraic entities obeying the same arithmetic rules known for numbers was initiated by Leibniz, who invented the notation $dx$ along with the notation $\int$. In the days of Leibniz, $dx$ and $dy$ were conceptualized as infinitely small changes in $x$ and $y$ respectively, such that $(x,y)$ and $(x+dx,y+dy)$ were two infinitely close points on the curve described by $x$ and $y$.
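As a concrete sanity check of the separation-of-variables recipe, here is a small sketch using sympy on the illustrative (hypothetical) example $g(x)=x$, $h(y)=y$, where formally separating gives $\frac{1}{y}\,dy = x\,dx$ and integrating yields $y = C e^{x^2/2}$:

```python
import sympy as sp

x = sp.Symbol("x")
y = sp.Function("y")

# Illustrative example: dy/dx = x*y, i.e. g(x) = x and h(y) = y.
# Formally "separating" gives (1/y) dy = x dx; integrating both sides
# gives ln|y| = x**2/2 + C, i.e. y = C*exp(x**2/2).
ode = sp.Eq(y(x).diff(x), x * y(x))
sol = sp.dsolve(ode, y(x))
print(sol)  # Eq(y(x), C1*exp(x**2/2))
```

The point is that the formally "meaningless" manipulation lands on the same answer a rigorous solver produces.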

The line through $(x,y)$ and $(x+dx,y+dy)$ would then be tangent to the curve at $(x,y)$, whereas the line segment from $(x,y)$ to $(x+dx,y+dy)$ was regarded as an infinitely small element of the curve. This had the paradoxical consequence that an infinitely small part of a tangent was actually a part of the curve. Another difficulty was to make this concept of the infinitely small rigorous. Just as an example, if we consider the curve given by the rule $ay=x^2$ we would have $$ a(y+dy)=(x+dx)^2 $$ and subtracting $ay$ from the LHS and $x^2$ from the RHS (which is allowed since $ay=x^2$) this would lead to $$ a\ dy=2x\ dx+dx^2 $$ But here the concept of infinitely small really kicks in, since $dx^2$ is not only infinitely small, but infinitely much smaller than $2x\ dx$, and can thus be disregarded as negligible even at this infinitely small scale. We are left with $a\ dy=2x\ dx$, which gives us the correct derivative $\frac{dy}{dx}=\frac{2x}{a}$ for the function $y=\frac{x^2}{a}$.
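The Leibniz-style computation above can be mimicked symbolically: treat $dx$ as an ordinary symbol, expand the exact increment, and observe that the discarded $dx^2$ term vanishes in the limit, recovering $2x/a$. A minimal sketch with sympy (the names `a`, `x`, `dx` are just for illustration):

```python
import sympy as sp

# Leibniz-style computation for a*y = x**2, with dx as an ordinary symbol.
a, x, dx = sp.symbols("a x dx", positive=True)

y = x**2 / a
dy = ((x + dx)**2 - x**2) / a        # exact increment: (2*x*dx + dx**2)/a
ratio = sp.expand(dy / dx)           # 2*x/a + dx/a

# Letting dx -> 0 discards the dx**2 term, leaving the derivative 2x/a:
print(sp.limit(ratio, dx, 0))        # 2*x/a
print(sp.diff(y, x))                 # 2*x/a  (agrees)
```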


Since these manipulations with infinitely small quantities never produced an error when treated with caution, and since they bear such an appealing similarity to manipulations with actual quantities, they have since been shown to work correctly as a means of working with differential equations more fluently. Thus I would regard $$ M\ dx+N\ dy=0 $$ as an intermediate state of expressing either $$ M\frac{dx}{dy}+N\frac{dy}{dy}=M\frac{dx}{dy}+N=\frac{0}{dy}=0 $$ or $$ M\frac{dx}{dx}+N\frac{dy}{dx}=M+N\frac{dy}{dx}=\frac{0}{dx}=0 $$ or maybe $$ M\frac{dx}{dt}+N\frac{dy}{dt}=\frac{0}{dt}=0 $$
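To make the second interpretation concrete, here is a small sketch with sympy on a hypothetical instance of $M\,dx + N\,dy = 0$, taking $M = y$ and $N = x$, so that $y\,dx + x\,dy = 0$ is read as $y + x\,y' = 0$:

```python
import sympy as sp

x = sp.Symbol("x")
y = sp.Function("y")

# Hypothetical instance of M dx + N dy = 0 with M = y and N = x:
#   y dx + x dy = 0  is read as  y + x*y' = 0.
ode = sp.Eq(y(x) + x * y(x).diff(x), 0)
print(sp.dsolve(ode, y(x)))  # Eq(y(x), C1/x), i.e. x*y = constant
```

This matches what "dividing by $dx$" suggests, and the solution $xy = C$ is exactly what one would get from recognizing $y\,dx + x\,dy = d(xy)$.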

NOTE: I do not have knowledge about the usage of this notation in differential forms, so what I wrote may not be consistent with the usage of $dx$ and $dy$ in that branch of mathematics. There they may well have been formalized to actually mean something in themselves rather than merely to anticipate meaningful expressions! Indeed, I understand they have been given meaning and been proven to work correctly in what is known as non-standard analysis.

String
  • 18,395
  • 1
    I heard a story, though it is probably apocryphal, that Cauchy once "proved" that a pointwise limit of continuous functions on a closed interval is continuous, using an infinitesimal argument. The fact that the counterexample is so simple ($f_n(x) = x^n$ on $[0,1]$) makes it likely that it really is apocryphal, but it is still a good story for encouraging caution in analysis. – Ian Jan 18 '15 at 17:11
  • It does make sense in the context if you divide everything by dx! – Lundis Jan 20 '15 at 14:24