4

Let $f:[0,1] \to \mathbb R$ be a smooth function, and suppose that $f(x) > f(0)$ for every $0< x \le 1$.

Is it true that $f' \ge 0$ in some neighbourhood of $0$?

$f'(0) \ge 0$, and by the mean value theorem $$ f'(c(x))=\frac{f(x)-f(0)}{x-0}> 0,$$

where $0<c(x)<x$. In particular, by letting $x \to 0$, we can construct a sequence $x_n \to 0$ satisfying $f'(x_n) >0$. I am not sure how to proceed from here.

Is there some pathological counter-example?

Asaf Shachar
  • 25,111
  • I guess a mixture of the two answers is what you want. You need something like $e^{-1/x^2}$ at the front to make sure its still a smooth function, and then you multiply by like $2+\sin1/x$ or maybe $e^{-1/x^2} + \sin^2 (1/x)$ or variants – Calvin Khor May 15 '20 at 05:54
  • (for people from the future: the "two answers" I meant were the first two wrong answers, Kavi's nonsmooth example and Ted's smooth example that took the value 0 infinitely often as you approached the origin) – Calvin Khor May 15 '20 at 08:41
  • @AsafShachar : Could you please specify what exactly you mean by "smooth"? Judging from your try I thought you meant only having a continuous derivative and not a higher degree of smoothness. – trancelocation May 15 '20 at 08:59
  • @trancelocation Indeed, by smooth I mean "infinitely differentiable", that is, having derivatives of all orders. I think this is the standard terminology. – Asaf Shachar May 15 '20 at 09:20
  • Thanks for the clarification. Will delete my post and maybe adjust later. – trancelocation May 15 '20 at 09:22

2 Answers

7

While you can probably come up with some explicit combination of exponential and trigonometric functions that is a counterexample, I find it much more enlightening to instead just cobble one together with bump functions.

Start with a smooth function $\varphi:[0,1]\to\mathbb{R}$ which is identically $0$ in neighborhoods of $0$ and $1$, nonnegative on $[0,1/2]$, nonpositive on $[1/2,1]$ (and negative somewhere) and has positive integral. (So, it jumps up to positive values somewhere in the middle of $[0,1/2]$, jumps down to negative values somewhere in $[1/2,1]$, and the positive values have a larger integral than the negative values.)

Now pick a shrinking sequence of disjoint intervals $[a_n,b_n]$ approaching $0$ and consider a function $g:[0,1]\to\mathbb{R}$ which is $0$ except on the intervals $[a_n,b_n]$, and on each $[a_n,b_n]$ is given by $g(x)=c_n\varphi(\frac{x-a_n}{b_n-a_n})$ for some $c_n>0$. If we pick the coefficients $c_n$ to shrink fast enough, then all the derivatives $g^{(k)}(x)$ will approach $0$ as $x\to 0$ and so $g$ will be smooth even at $0$.

Finally, define $f(x)=\int_0^xg(t)\,dt$. Then $f$ is smooth because $g$ is. Also, $f(x)>0=f(0)$ for all $x>0$, by our choice of $\varphi$, since the integral of $\varphi$ is positive and moreover the integral of $\varphi$ over $[0,s]$ is still nonnegative for any $s\in [0,1]$ (so if $x$ is in the middle of one of the intervals $[a_n,b_n]$, the integral of $g$ over the first part of that interval will not be negative). But $f'=g$ is negative on points of every interval $[a_n,b_n]$, and these points get arbitrarily close to $0$.
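For concreteness, here is a numerical sketch of this construction in Python. The particular bump $\psi(u)=e^{-1/(u(1-u))}$, the intervals $[a_n,b_n]=[2^{-n-1},\,3\cdot 2^{-n-2}]$, and the coefficients $c_n=2^{-n^2}$ are one convenient choice, not forced by the argument; the sketch checks the two key facts used above, namely that the partial integrals of $\varphi$ stay nonnegative (with positive total), and that $g$ is negative at a point of every interval $[a_n,b_n]$:

```python
import math

def psi(u):
    """Standard smooth bump: positive on (0, 1), identically 0 outside."""
    return math.exp(-1.0 / (u * (1.0 - u))) if 0.0 < u < 1.0 else 0.0

def phi(u):
    """Bump up on (0, 1/2), smaller bump down on (1/2, 1); net integral > 0."""
    return psi(2.0 * u) - 0.5 * psi(2.0 * u - 1.0)

def g(x):
    """f': copies of phi squeezed into disjoint intervals [a_n, b_n] -> 0,
    with rapidly shrinking amplitudes c_n = 2**(-n*n)."""
    for n in range(1, 60):
        a, b = 2.0 ** (-n - 1), 3.0 * 2.0 ** (-n - 2)  # disjoint, approach 0
        if a <= x <= b:
            return 2.0 ** (-n * n) * phi((x - a) / (b - a))
    return 0.0

# Partial integrals of phi are nonnegative and the full integral is positive,
# so f(x) = int_0^x g(t) dt > 0 for every x > 0.
N = 4000
vals = [phi(k / N) for k in range(N + 1)]
total, mins = 0.0, 0.0
for k in range(N):
    total += (vals[k] + vals[k + 1]) / (2 * N)  # trapezoid rule, step 1/N
    mins = min(mins, total)
print(total > 0 and mins > -1e-15)  # True

# Yet f' = g is negative at a point of every interval [a_n, b_n]:
neg_points = [2.0 ** (-n - 1) + 0.75 * 2.0 ** (-n - 2) for n in range(1, 9)]
print(all(g(x) < 0 for x in neg_points))  # True
```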

Eric Wofsey
  • 330,363
  • +1 suspected this would work – Calvin Khor May 15 '20 at 06:00
  • Thank you, this is a very interesting answer. I have two questions: (1) You just need to choose $c_n$ such that $\frac{c_n}{b_n-a_n} \to 0$, right? Because every derivative of $\varphi$ is bounded. (2) Does the smoothness of $g$ at zero follow from the fact that if the limit of a derivative exists, then the function is differentiable at the limit point, so the existence of $\lim_{x \to 0} g^{(k)}(x)=0$ implies that $g^{(k+1)}(0)$ exists and is equal to $0$? – Asaf Shachar May 15 '20 at 14:19
  • @AsafShachar (1) You need $\frac{c_n}{(b_n-a_n)^k} \to 0$ for all $k$, since you get $k$ factors of $(b_n-a_n)^{-1}$ in the $k$th derivative. (2) That's correct. – Eric Wofsey May 15 '20 at 14:22
  • Thank you. This is a cool answer. I think that the moral of the story is this: "A function is not necessarily weakly increasing in any neighbourhood beyond a strict minimum". Integrating this observation leads to the phenomenon of a strict minimum without convexity: https://math.stackexchange.com/questions/3677358/is-a-smooth-function-convex-near-a-strict-minimum. – Asaf Shachar May 16 '20 at 08:18
0

For completeness, let me note that one can also create an explicit example:

$$f(x)=\begin{cases}\left(\sin^2\left(\frac 1x\right)+e^{-\frac{1}{x^2}}\right)e^{-\frac{1}{|x|}}\ &x\neq 0\\0\ &x=0\end{cases}$$

Its derivative (for $x>0$) is

$$f'(x)=\left[e^{-\frac{1}{x^2}}\left(1+\frac{2}{x}\right)+\sin\left(\frac1x\right)\left(\sin\left(\frac{1}{x}\right)-2\cos\left(\frac1x\right)\right)\right]\frac{e^{-\frac{1}{x}}}{x^2}$$

As $x\to 0^+$, the first term inside the square brackets, $e^{-1/x^2}\left(1+\frac{2}{x}\right)$, tends to $0$, while the second oscillates infinitely often between $\frac{1-\sqrt5}{2}\approx -0.618$ and $\frac{1+\sqrt5}{2}\approx 1.618$ (indeed $\sin\theta\left(\sin\theta-2\cos\theta\right)=\frac12-\frac{\sqrt5}{2}\sin\left(2\theta+\arctan\tfrac12\right)$). Since the prefactor $\frac{e^{-1/x}}{x^2}$ is positive, $f'$ takes negative values at points arbitrarily close to $0$.
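This closed form invites a quick numerical sanity check in Python. The sample points $x_n = 1/(\theta_0 + 2\pi n)$, where $\theta_0$ minimises $\sin\theta(\sin\theta-2\cos\theta)$, are one choice of a sequence tending to $0$ on which the bracket is near its minimum:

```python
import math

def f(x):
    """Candidate counterexample: f(0) = 0 and f(x) > 0 for x != 0."""
    if x == 0.0:
        return 0.0
    return (math.sin(1.0 / x) ** 2 + math.exp(-1.0 / x**2)) * math.exp(-1.0 / abs(x))

def fprime(x):
    """The derivative for x > 0, as computed above."""
    t = 1.0 / x
    bracket = (math.exp(-t**2) * (1.0 + 2.0 * t)
               + math.sin(t) * (math.sin(t) - 2.0 * math.cos(t)))
    return bracket * math.exp(-t) / x**2

# theta0 minimises sin(t)(sin(t) - 2cos(t)); the minimum value is (1-sqrt(5))/2.
theta0 = 0.5 * (math.pi / 2 - math.atan(0.5))
xs = [1.0 / (theta0 + 2 * math.pi * n) for n in range(1, 12)]
print(all(fprime(x) < 0 for x in xs))  # True: negative slope on a sequence -> 0
print(all(f(x) > 0 for x in [k / 20 for k in range(1, 21)]))  # True
```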