
Refer to this question (Rick's answer): Here

The question: let $f:[a,b]\longrightarrow \mathbb R$ be continuous on $[a,b]$ and differentiable on $(a,b)$, with one-sided derivatives at the endpoints satisfying $f'(a)<0<f'(b)$. Show that there is a $c$ such that $f'(c)=0$.

In his answer, Rick says that by continuity there is a $\delta>0$ such that $f$ is decreasing on $[a,a+\delta]$. According to Marc McClure the argument is wrong, but I can neither find a counterexample nor prove that the argument is correct. Can someone give me a counterexample or a proof?

idm
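As context for the dispute (this note is not part of the original post): Rick's step is valid if one additionally assumes that $f'$ is continuous at $a$. A minimal sketch, using only that continuity and the mean value theorem:

$$\begin{aligned}
&\text{If } f' \text{ is continuous at } a \text{ and } f'(a)<0, \text{ choose } \delta>0 \text{ with } f'(x)<\tfrac12 f'(a)<0 \text{ for all } x\in[a,a+\delta].\\
&\text{Then for } a\le x_1<x_2\le a+\delta \text{ the mean value theorem gives a } \xi\in(x_1,x_2) \text{ with}\\
&\qquad f(x_2)-f(x_1)=f'(\xi)\,(x_2-x_1)<0,\\
&\text{so } f \text{ is strictly decreasing on } [a,a+\delta].
\end{aligned}$$

The question, however, only assumes that $f$ is differentiable, so the step needs $f'$ to be continuous at $a$; any counterexample must therefore have a derivative that is discontinuous at $a$, which is exactly what the comments below produce.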
  • Think wiggles. That is, take some decreasing function, say $x\mapsto-x$, and perturb it by an oscillating function, small enough to leave the derivative unchanged but whose derivative takes nonvanishing values destroying the monotonicity. For example, $$f(x)=-x+2x^2\sin(1/x).$$ – Did Nov 25 '15 at 09:38 (this example is checked in the sketch after the comments)
  • Any Weierstrass function will do. – Ittay Weiss Nov 25 '15 at 09:40
  • *At $a=0$, of course. – Did Nov 25 '15 at 09:44
  • @did: nice counterexample :-) thanks. – idm Nov 25 '15 at 09:49
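To make Did's suggestion fully explicit, here is a short verification, with the convention $f(0)=0$ and a domain such as $[0,1]$, so that $0$ plays the role of $a$:

$$\begin{aligned}
f'(0)&=\lim_{h\to 0^{+}}\frac{f(h)-f(0)}{h}=\lim_{h\to 0^{+}}\bigl(-1+2h\sin(1/h)\bigr)=-1<0,\\
f'(x)&=-1+4x\sin(1/x)-2\cos(1/x)\qquad (x\neq 0),\\
f'\!\left(\tfrac{1}{(2n+1)\pi}\right)&=-1+0-2\cos\bigl((2n+1)\pi\bigr)=1>0\qquad (n\in\mathbb N).
\end{aligned}$$

The points $\tfrac{1}{(2n+1)\pi}$ accumulate at $0$, so every interval $[0,\delta]$ contains points where $f'>0$; hence $f$ is not decreasing on any $[0,\delta]$, even though $f'(0)=-1<0$. In particular $f'$ is not continuous at $0$, which is exactly the gap in Rick's step.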

0 Answers