
The textbook defines differentials like this.

Let $y=f(x)$ be a differentiable function of $x$. The differential of $x$ (denoted by $dx$) is any nonzero real number. The differential of $y$ (denoted by $dy$) is equal to $f'(x)dx$.

It goes on to say that the derivative rules can be written in differential form using Leibniz notation. For example, it says the chain rule in differential form is

$$\frac{dy}{dx} = \frac{dy}{du} \frac{du}{dx}$$

The book says the formula appears to be true because the $du$'s would divide out, and that although this reasoning is incorrect, it helps you remember the chain rule.

Why is the reasoning incorrect? Given those definitions of differentials, what's stopping you from manipulating them algebraically?
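For what it's worth, a quick numerical check (the particular functions and sample point are my own choice, not from the book) confirms that the formula itself gives the right derivative, whatever one thinks of the cancellation argument:

```python
# Numerical check of dy/dx = (dy/du)(du/dx) for y = sin(u), u = x^2.
# The functions and the test point are chosen purely for illustration.
import math

def deriv(f, t, h=1e-6):
    """Central-difference approximation of f'(t)."""
    return (f(t + h) - f(t - h)) / (2 * h)

u = lambda x: x**2
y_of_u = lambda w: math.sin(w)
y_of_x = lambda x: math.sin(x**2)   # the composite y(u(x))

x0 = 1.3
chain = deriv(y_of_u, u(x0)) * deriv(u, x0)   # (dy/du) at u(x0), times (du/dx) at x0
direct = deriv(y_of_x, x0)                    # d/dx sin(x^2) at x0
exact = 2 * x0 * math.cos(x0**2)              # analytic 2x cos(x^2)

print(abs(chain - direct) < 1e-6)   # True
print(abs(chain - exact) < 1e-6)    # True
```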

Kyle Delaney
  • Why do you think $dx$ is a real number? – mvw Dec 05 '15 at 07:10
  • Because that's what the textbook defined it as. Please read my question. – Kyle Delaney Dec 07 '15 at 02:32
  • What textbook is this? What is it a textbook for? (Differential Geometry will get a different answer than Calculus -- I've assumed Calculus because of the "calculus" tag.) How are they defined? In what (likely specific) context are you manipulating differentials? – Eric Towers Dec 07 '15 at 02:44
  • You can also assume calculus because the title of my question says "calculus textbook." Yes, it is calculus. I'm sorry but I don't know the actual name of the book because I have it in PDF format. It's Calculus 1.

    My question begins by telling you how differentials are defined. $dx$ is any nonzero real number, and $dy$ is $dx$ multiplied by the derivative of the function in question.

    I'm not manipulating differentials in any context because, like my question states, I'm apparently not allowed to manipulate them. My question is asking why that is.

    – Kyle Delaney Dec 08 '15 at 18:01

1 Answer


First, $\mathrm{d}x$ is not defined. The limit of the difference quotient is what is defined: $\frac{\mathrm{d}y}{\mathrm{d}x} = \lim_{h \rightarrow 0} \frac{y(x+h) - y(x)}{(x+h)-x}$. Pretending that the parts of the expression $\frac{\mathrm{d}y}{\mathrm{d}x}$ are individually defined makes as much sense as treating the "$\mathrm{i}$" in "$\lim$" as if it were a separable thing. Pretending that you can cancel parts of these expressions makes as much sense as the "derivation": $\frac{\sin x}{n} = \frac{\mathrm{si}\not\mathrm{n} x}{\not n} = \mathrm{six} = 6$.

Note that in the definition, the "$y$" appearing in the numerator is treated as a function of the "$x$" appearing in the denominator and the "$x$" is treated as an independent variable. So in $\frac{\mathrm{d}y}{\mathrm{d}u}$, $u$ is an independent variable, but in $\frac{\mathrm{d}u}{\mathrm{d}x}$, $u$ depends on $x$. These are not the same $u$, in spite of the fact that we write the same thing for them. So even if we were to somehow change all the definitions so that the sequences of symbols $\mathrm{d}y$, $\mathrm{d}x$, and $\mathrm{d}u$ had independent existence, one would still need a big theorem to justify cancelling the $\mathrm{d}u$s. The other way to correct this is to realize that $y(u)$ is really $y(u(x))$ and apply the chain rule, yielding $y'(u(x)) u'(x)$, which should look familiar. (It's the same expression as $\frac{\mathrm{d}y}{\mathrm{d}u} \frac{\mathrm{d}u}{\mathrm{d}x}$, but in other notation.) Here it's more clear that it's crazy to "cancel the $u$s".
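To see the theorem at work on a concrete pair of functions (an example of my own, not from the book): take $y = u^3$ and $u = x^2$. Then

$$\frac{\mathrm{d}y}{\mathrm{d}u} = 3u^2, \qquad \frac{\mathrm{d}u}{\mathrm{d}x} = 2x, \qquad \frac{\mathrm{d}y}{\mathrm{d}u}\,\frac{\mathrm{d}u}{\mathrm{d}x} = 3(x^2)^2 \cdot 2x = 6x^5,$$

which agrees with differentiating the composite $y = (x^2)^3 = x^6$ directly. At no point is any $\mathrm{d}u$ treated as a standalone number; each quotient is a derivative in its own right, and it is the chain rule, not cancellation, that makes the product come out right.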

Eric Towers
  • You say $dx$ is not defined, but my textbook defines it. Are you saying the textbook is wrong? Is it possible that you're just not familiar with this usage? There's a whole section about using both $dx$ and $dy$ independently of each other, and I'm trying to fully understand it. – Kyle Delaney Dec 07 '15 at 02:37
  • @KyleDelaney : You don't give any keywords or context for understanding in what way your textbook has defined these differentials. It is typical to pretend that these objects exist when performing implicit differentiation. But even then, this should come with the warning that one is writing "bogus stuff" until valid expressions are formed. For example, the inference $x = y \implies \mathrm{d}x = \mathrm{d}y$ is incomplete until you finish with $\frac{\mathrm{d}y}{\mathrm{d}x} = 1$, converting to the properly defined symbols. – Eric Towers Dec 07 '15 at 02:42