
Am I correct to state that taking the derivative of $f(x+dx) - f(x)$ with respect to $x$ does not equal $df/dx$? Or in other words:

$$\frac{d}{dx}\big(f(x+dx) - f(x)\big) \neq \frac{df(x)}{dx}$$

I've reasoned this out by example. If we take $f(x) = x^2$, then:

$$\frac{d}{dx}\big(f(x+dx) - f(x)\big) = \frac{d}{dx}\left((x+dx)^2\right) - \frac{d}{dx}\left(x^2\right) = 2(x+dx) - 2x = 2\,dx.$$

The only thing I'm uncertain of is that in the limit as $dx\to0$, the tangents at $x$ and $x+dx$ should be approximately the same (given that the function is smooth for all $x$).
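
For what it's worth, the way I would try to phrase this intuition rigorously is via the linear approximation: for a fixed small $h$ in place of $dx$,

$$f(x+h) - f(x) \approx f'(x)\,h,$$

so differentiating the left-hand side with respect to $x$ should give approximately $f''(x)\,h$, which for $f(x) = x^2$ is $2h$, matching the $2\,dx$ I computed above.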

gebruiker
  • 6,154
Kimusubi
  • 209
  • In standard analysis you can't treat $dx$ as if it were a real number, so $f(x+dx)$ is a meaningless term. – Asaf Karagila May 16 '13 at 19:25
  • Read some basics of differentiation and you will see this is a meaningless and needless question; and, as @AsafKaragila said, $dx$ is not a real number. – iostream007 May 16 '13 at 19:29
  • @Kimusubi have a look at Arturo Magidin's brilliant answer to this question. – jkn May 16 '13 at 19:30
  • @jkn I am shocked at how many upvotes the answer got. Really an awesome one. – iostream007 May 16 '13 at 19:32
  • @iostream007 I'm not surprised; I have spent $\pm$ half a decade around engineering (and a bit of math) departments and I have never heard anyone give such a clear and informative discussion of $\frac{dy}{dx}$. – jkn May 16 '13 at 19:36
  • I'm shocked by the response to the answer; it obviously deserves it. I stopped practicing math around 7 years ago, but I still remember what I was taught and the concepts and history I knew. – iostream007 May 16 '13 at 19:40

1 Answer


Well, I assume you're working with standard analysis. In that context $dx$ is meaningless unless you define it as a particular kind of object called a differential form, and I won't talk about those objects now to avoid more confusion. You just have to know that treating $dx$ as an "infinitesimal amount added to $x$" is not allowed in standard analysis, largely because it brings many troubles with it. The framework that does allow this kind of reasoning is non-standard analysis, which many people really like. For example, it's easy to see that if you try to define $dx$ as $\Delta x \to 0$ you'll fail drastically, because by the usual definition of limit, the limit of $\Delta x$ as $\Delta x$ goes to zero is just zero.
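
To make this concrete with the $f(x) = x^2$ example from your question (a sketch of the standard computation, with a real number $h$ playing the role your informal $dx$ was meant to play):

$$\frac{df}{dx} = \lim_{h\to 0}\frac{f(x+h)-f(x)}{h} = \lim_{h\to 0}\frac{2xh + h^2}{h} = \lim_{h\to 0}\,(2x + h) = 2x,$$

while the bare difference, without the division by $h$, simply goes to zero:

$$\lim_{h\to 0}\big(f(x+h)-f(x)\big) = \lim_{h\to 0}\,(2xh + h^2) = 0,$$

so there is no leftover "$2\,dx$" to differentiate in the first place.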

I understand your preoccupation, because many people insist on teaching calculus using those objects, and that approach confuses people very much. My suggestion is that you pick up a book like Apostol's Calculus Vol. 1 or Spivak's Calculus (personally I prefer Spivak's book). They will teach you how to do calculus without resorting to infinitesimals, and after you see it you'll understand why people moved from that way of thinking to the modern one: the modern approach is just clearer and simpler.

Good luck with your studies!

Gold
  • 26,547