3

I've been wondering about the use of $dy=f'(x)\,dx$ in my textbook.

There's no justification of how it is proved; the book just states that it is true.

Since $dy/dx$ can't be treated as a fraction, I'm guessing there's more to it than just multiplying both sides by $dx$.

Are there any proofs of this equation?

Also, with some research, I found this “proof”. Can it be done this way?

$dy=f'(x)\,dx$ “proof”

(Please keep this at a high school level.)

Javi
Jwnle
  • It's merely a symbolic notation, used to simplify some expressions. If you will, just take $dy=f'(x)\,dx$ as the definition of the symbols $dy, dx$. Note that these (at least for now) are not real mathematical objects (in the sense of being rigorously defined), and just serve to make some stuff a bit tidier. – Jannik Pitt Aug 08 '18 at 13:12
  • Which textbook? – Chappers Aug 08 '18 at 13:17
  • It's not an English textbook, so you wouldn't know it even if I told you... sorry... – Jwnle Aug 08 '18 at 13:35
  • One of many heuristic ways of understanding this result, specifically in the context of integration, is described in my answer https://math.stackexchange.com/questions/402303/understanding-the-differential-dx-when-doing-u-substitution/402346#402346 - working through this may help you to understand why it works for changes of variable in integrals. – not all wrong Aug 08 '18 at 13:46
  • There is a sense in which you can't even say something is true until your notation is defined. What does $dy$ mean? Unfortunately, it takes a lot of details to define what we mean by $dy$, and then after that, it is basically a direct result of the chain rule or something like it. But it's almost all in the definition, which is dry. – Thomas Andrews Aug 08 '18 at 14:07
  • Also, presumably the assumption here is that $y=f(x)$ where $f(x)$ is differentiable. – Thomas Andrews Aug 08 '18 at 14:10
  • I understand and sympathize. This is a part of the calculus curriculum that bothers me. It may be unavoidable. – Randall Aug 08 '18 at 14:59

3 Answers

5

The proof you linked is no better than the proof using fraction notation.

There are two points of view on these matters.

One is the point of view taught in a first calculus course: calculation of derivatives and integrals using these symbolic methods leads to correct answers (which is not hard to prove even in a first calculus course), so learn how to do it.
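For a standard illustration of the kind of symbolic method meant here: to evaluate $\int 2x\cos(x^2)\,dx$, one sets $u=x^2$, writes $du=2x\,dx$, and computes

$$\int 2x\cos(x^2)\,dx = \int \cos u\,du = \sin u + C = \sin(x^2)+C,$$

which can be checked directly by differentiating the result.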

Another point of view is what is taught in a more advanced course, namely differential geometry: the symbols $dx$ and $dy$ and $f'(x) \, dx$ indeed do have independent mathematical meaning, they are examples of differential forms, and if $y=f(x)$ then $dy = f'(x) dx$ can be proved as a consequence of theorems about differential forms.

Lee Mosher
  • 1
    How would I approach the proof in the point of view taught in a first calculus course? Are there any useful resources I can look up if you know any? :) – Jwnle Aug 08 '18 at 13:33
  • 1
    To clarify, the point of view I'm explaining is that there is no proof of $dy = f'(x)\,dx$ in a first calculus course. Instead, this equation represents a step in a symbolic calculation, which calculus students should learn to master. – Lee Mosher Aug 08 '18 at 13:40
  • One reason that calculus students can learn these calculations is by using their intuition for fractions. So, for example, their intuition for fractions allows them to go easily from $\frac{dy}{dx} = f'(x)$ to $dy = f'(x)\,dx$. – Lee Mosher Aug 08 '18 at 13:43
1

The key to the question is: What do we mean by $dx$ and $dy$, specifically?

In the end, the definition is the thing, and a real definition is more complicated than one would like.

More generally, if $y=f(x)$ with $f$ differentiable, and $g$ is any "nice" function then:

$$\int_{f(a)}^{f(b)} g(y)\,dy = \int_{a}^{b} g(f(x))f'(x)\,dx\tag{1}$$

when the right side is defined.

This is basically saying we can do substitution to compute $\int_{a}^{b} g(f(x))f'(x)\,dx.$
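For example, with $f(x)=x^2$ on $[0,2]$ and $g(y)=y$, both sides of (1) work out to the same number:

$$\int_{f(0)}^{f(2)} g(y)\,dy = \int_{0}^{4} y\,dy = 8, \qquad \int_{0}^{2} g(f(x))\,f'(x)\,dx = \int_{0}^{2} x^2\cdot 2x\,dx = 8.$$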

When $g(y)=1$ for all $y$ this means:

$$\int_{f(a)}^{f(b)} dy = \int_{a}^{b} f'(x)\,dx$$

Unfortunately, this doesn't quite look like $dy=f'(x)\,dx$ because the ranges are different.
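Both sides do, however, evaluate to the same number, $f(b)-f(a)$, by the fundamental theorem of calculus:

$$\int_{f(a)}^{f(b)} dy = f(b)-f(a) = \int_a^b f'(x)\,dx.$$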

In intro calculus, you can probably prove (1). When you define the more general Riemann–Stieltjes integral, there is a notion of $df$ for integrals, and you'll get:

$$\int_a^b h(x)\,df = \int_a^b h(x)f'(x)\,dx$$ when $f$ is continuously differentiable.
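For instance, with $f(x)=x^2$ and $h(x)=x$ on $[0,1]$, this reads

$$\int_0^1 x\,d(x^2) = \int_0^1 x\cdot 2x\,dx = \int_0^1 2x^2\,dx = \frac{2}{3}.$$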

There are more advanced notions, like differential forms, but those are specifically defined just to have this and other properties that you'd expect if you wanted to treat $dx$ and $dy$ as algebraic "things."

In the differential-form view, we have, for any interval $[a,b]$, a curve in 2D space, $C_{a,b}$, the set of points $(x,f(x))$ for $x\in[a,b].$ Then $dy$ is a form on that curve, and $f'(x)\,dx$ is another form, and both forms evaluate to the same thing on this curve.

This makes clear that this has to do with a curve in 2-dimensional space, and that is why the ranges in (1) seem to shift - we are actually looking at the boundary points on the curves, $(a,f(a))$ and $(b,f(b)).$
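As a small sketch of this (taking $f(x)=x^2$ as an example): parameterize $C_{a,b}$ by $x\mapsto(x,x^2)$. Pulled back along this parameterization,

$$dy = d(x^2) = 2x\,dx = f'(x)\,dx,$$

so the two forms agree on the curve, and integrating either over $C_{a,b}$ gives $f(b)-f(a)=b^2-a^2.$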

But that view is way above intro calculus.

Thomas Andrews
  • But isn't the integral over [f(a), f(b)] with respect to y the same as the integral over [a, b] with respect to x, therefore making dy=f'(x)dx true? – Jwnle Aug 08 '18 at 17:09
  • We haven't given a definition of what $dy$ and $f'(x)dx$ are, so I'm not sure if you can. If $u=v$ then anything you do to $u$ is equivalent to what you do to $v.$ My point is that it looks bad, but you can clarify the "why" by envisioning it as instead an operation on the curve $(x,f(x)).$ – Thomas Andrews Aug 08 '18 at 20:40
0

Well, the derivative is given by: $$\lim_{dx \to 0} \frac{f(x+dx)-f(x)}{dx}=\lim_{dx\to 0} \frac{dy}{dx}$$ By definition the derivative is the rate of change of $y$ with respect to $x$; that's why the right-hand side holds. As you can see, $\frac{dy}{dx}$ is not just a notation but is mathematically how the derivative is defined. Since $f'(x)=\frac{dy}{dx}$ with $dx\to 0$, the equation $dy=f'(x)\,dx$ holds.
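Written with increments, the intuition here is that for a small change $\Delta x$,

$$\Delta y = f(x+\Delta x)-f(x) = f'(x)\,\Delta x + o(\Delta x) \quad \text{as } \Delta x\to 0,$$

so the change in $y$ is approximately $f'(x)\,\Delta x$ when $\Delta x$ is small.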

  • Except $dx$ isn't really a number, it is a notation, and $\frac{dy}{dx}$ is not really a fraction, it is also notation. (That is usually why we write the definition of $\frac{dy}{dx}$ as $\lim_{\Delta x\to 0}$.) While this is the intuition for why $dy=f'(x)\,dx$, it isn't a proof. It's not even clear what $dy=f'(x)\,dx$ means until we get good definitions of this. – Thomas Andrews Aug 08 '18 at 14:50
  • @ThomasAndrews what I mean by $dx$ is an infinitesimal change to $x$. Maybe I should have written $\Delta x$ in my answer instead of $dx$ in order to avoid the confusion. – Anastassis Kapetanakis Aug 08 '18 at 15:00
  • But that isn't what $dx$ means in classical analysis (including standard calculus.) There is no such thing as an "infinitesimal change." If $dx$ is an infinitesimal, you wouldn't need $\lim_{dx\to 0}.$ – Thomas Andrews Aug 08 '18 at 20:36
  • @ThomasAndrews By $\lim_{dx\to 0}$ I am trying to emphasize the fact that $dx$ is infinitesimal. This limit is what makes $dx$ infinitesimal. I am just trying to point out that the notation of the derivative is not just a notation but something that has meaning in mathematics. It's a fraction that follows the standard rules. But as I can understand, what you are saying is that I use symbols that by definition mean something special. Well that may be a problem of mine: I am trying to give a more intuitive explanation of what's going on with the derivative and its notation without knowing the sp – Anastassis Kapetanakis Aug 08 '18 at 21:30
  • It does not have that meaning in standard calculus, because there is no such thing as an infinitesimal in the standard real line. The OP is looking for a proof, not an intuition. – Thomas Andrews Aug 08 '18 at 22:20
  • @ThomasAndrews Ok I understand! – Anastassis Kapetanakis Aug 09 '18 at 05:55