
I know that $f'(a)>0$ implies that $f$ is increasing on some open interval in the case where $f$ is of class $C^1$.

pf) Choose $\epsilon := f'(a)/2$. Since $f$ is of class $C^1$, $f'$ is continuous. By the definition of continuity, $\exists \delta >0, \forall x : |x-a|<\delta \Rightarrow |f'(x) - f'(a)| < \epsilon$. The last inequality, together with the triangle inequality, yields $f'(x)>0$ on the open interval $(a-\delta,a+\delta)$, so $f$ is increasing there by the mean value theorem, and the proof is done.
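
To spell out the last step: for $|x-a|<\delta$,

$$f'(x) \;>\; f'(a)-\epsilon \;=\; f'(a)-\frac{f'(a)}{2} \;=\; \frac{f'(a)}{2} \;>\; 0,$$

and a function with positive derivative on an open interval is increasing there.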

The statement and proof above are restricted to class $C^1$, but I think the restriction is unnecessary: even without the $C^1$ hypothesis, $f'(a)>0$ should mean that $f$ is increasing in some open interval.

How to prove it?

cokecokecoke

3 Answers


At first, when learning a mathematical concept, there is likely to be a bit of a disconnect between your intuition and the definition. Pursuing questions like this helps a lot. Here you have answers that provide a wiggly function for a counterexample and a closer look at the proof for a continuous derivative.

Here is another viewpoint. When you think of $f'(a)>0$ you commonly imagine all the difference quotients $$\frac{f(y)-f(x)}{y-x} $$ as positive for intervals $[x,y]$ close to $a$. But the counterexample here should remind you that $f'(a)>0$ only pays attention to the intervals $[x,y]$ that straddle $a$, i.e., intervals with $x \leq a \leq y$, not to all intervals close to $a$.

The Italian mathematician Giuseppe Peano had a different response to this situation. He felt that the ordinary derivative didn't convey the right intuitive idea. Why shouldn't a positive "derivative" imply increasing close to the point? He introduced what he called a strict derivative (nowadays called a strong derivative or unstraddled derivative by most people). Define $f^\sharp(a)$ to be the limit $$ f^\sharp(a) = \lim_{x,y\to a, x\not=y} \frac{f(y)-f(x)}{y-x} $$ where now all intervals $[x,y]$ are being considered. For this stronger derivative the answer to your question is affirmative: if $f^\sharp(a)>0$ then, indeed, $f$ is increasing in some neighborhood of the point $a$. So you are right, but you were depending on the wrong derivative to supply your conclusion.
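
To sketch why (using nothing beyond the definition above): taking $\epsilon = f^\sharp(a)/2$ in the limit gives a $\delta>0$ such that for all distinct $x,y \in (a-\delta,a+\delta)$,

$$\frac{f(y)-f(x)}{y-x} \;>\; f^\sharp(a) - \frac{f^\sharp(a)}{2} \;=\; \frac{f^\sharp(a)}{2} \;>\; 0,$$

so $x<y$ forces $f(x)<f(y)$, i.e., $f$ is strictly increasing on $(a-\delta,a+\delta)$.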

The original reference is

Giuseppe Peano, Sur la définition de la dérivée, Mathesis Recueil Mathématique (2) 2 (1892), 12–14.

Peano hoped that the strict derivative would be useful in teaching calculus, but that hasn't worked out. You can, however, find many papers that study this derivative. It is a good introduction to the idea that there are numerous useful ways of defining a derivative other than the traditional calculus one.


A classic counterexample is $f(x)=x+x^2\sin(1/x^2)$ for $x\not = 0$, with $f(0)=0$. It is easy to see that $f$ is $C^1$ on $\mathbb{R}-\{0\}$, with derivative $\displaystyle f^{\prime}(x)=1+2x\sin(1/x^2)-\frac{2}{x}\cos(1/x^2)$, and that $f^{\prime}(0)=1$. If $f$ were increasing in an interval $(-a,a)$ with $a>0$, then $f^{\prime}$ would have to be $\geq 0$ on that interval, and we can easily see that this is not true (take $\displaystyle x_n=1/\sqrt{n\pi}$ for $n$ large and even).
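
To spell out the suggested check: with $x_n = 1/\sqrt{n\pi}$ we have $1/x_n^2 = n\pi$, so $\sin(1/x_n^2)=0$ and $\cos(1/x_n^2)=(-1)^n$, and for $n$ even

$$f^{\prime}(x_n) \;=\; 1 - \frac{2}{x_n} \;=\; 1 - 2\sqrt{n\pi} \;<\; 0,$$

which rules out $f^{\prime}\geq 0$ on any interval $(-a,a)$.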

Kelenner

The problem with dropping continuity of $f'$ can be found in the proof: if you drop it, you no longer have what is required to show that $f'(x)>0$ in a neighborhood of $a$.

If $f$ were increasing in a neighborhood of $a$, we could not have $f'(x)<0$ anywhere in that neighborhood (nor an interval on which $f'(x)=0$). So to refute the claim we need to come up with an example where $f'(x)<0$ arbitrarily close to $a$. For example, $f(x)=x+x^2\sin(1/x^2)$ for $x\neq 0$, with $f(0)=0$, is one; the exercise is to show that $f'(0) = 1$ and that $f'$ takes negative values arbitrarily close to $0$.
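
For the first part, the difference quotient at $0$ gives

$$\frac{f(h)-f(0)}{h} \;=\; \frac{h+h^2\sin(1/h^2)}{h} \;=\; 1 + h\sin(1/h^2),$$

and since $|h\sin(1/h^2)| \le |h| \to 0$ as $h\to 0$, indeed $f'(0)=1$. For the second part, the points $x_n = 1/\sqrt{n\pi}$ with $n$ even from the answer above work.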

skyking