
Claim to prove:

Suppose $f'(x) \geq 0$ for any $x$. Further, suppose that if $f'(x)=0$, then $x$ is an isolated zero of $f'$. Given these two conditions, $f$ must be a strictly increasing function.

We will make use of the following lemma:

If $f: \mathbb R \to \mathbb R$ is locally strictly increasing at every $x \in \mathbb R$ (i.e. every $x$ has a neighborhood on which $f$ is strictly increasing), then $f$ is a strictly increasing function. $\quad (\dagger)$


Let $a$ be any element where $f'(a) \gt 0$, i.e. $\displaystyle \lim_{x \to a}\frac{f(x)-f(a)}{x-a} \gt 0$. This implies that there is some punctured neighborhood $(a-\delta_a,a+\delta_a)\setminus \{a\}$ around $a$ such that for any $x$ in this punctured neighborhood: $\frac{f(x)-f(a)}{x-a} \gt 0 \quad (*)$. Suppose that $x$ is to the left of $a$. Therefore, $x-a \lt 0$. In order to satisfy $(*)$, we need $f(x)-f(a) \lt 0$, i.e. $f(x) \lt f(a)$. Similarly, if $x$ is to the right of $a$, then $f(x) \gt f(a)$. Therefore, on $(a-\delta_a,a+\delta_a)$, $f$ is strictly increasing.

Next, suppose $a$ is any element where $f'(a)=0$. Because $a$ is an isolated zero of $f'$, there is some punctured neighborhood $(a-\delta_a,a+\delta_a)\setminus \{a\}$ around $a$ such that $f'(x) \neq 0$ for any $x$ in this punctured neighborhood. If $f'(x)\neq 0$, then, by assumption, $f'(x)\gt 0$. Therefore, on the intervals $(a-\delta_a,a)$ and $(a,a+\delta_a)$, $f$ is necessarily strictly increasing. Continuity at $a$ allows us to stitch these two intervals together: $f$ must be strictly increasing on $(a-\delta_a,a+\delta_a)$.
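To spell out the stitching step (just a sketch of that last sentence, using only continuity at $a$ and the strict monotonicity claimed on each half-interval): for $x \lt x' \lt a$ and $a \lt y' \lt y$ inside the neighborhood,

$$f(x) \lt f(x') \leq \lim_{t \to a^-} f(t) = f(a) = \lim_{t \to a^+} f(t) \leq f(y') \lt f(y),$$

so $f(x) \lt f(a) \lt f(y)$ whenever $x \lt a \lt y$; combined with strict increase on each half-interval, this gives strict increase on all of $(a-\delta_a,a+\delta_a)$.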

The above two paragraphs have demonstrated that for any $x \in \mathbb R$ there is a neighborhood around $x$ where $f$ is strictly increasing. Applying $(\dagger)$, we find that $f$ is a strictly increasing function. $\quad \square$


Any alternative approaches would be appreciated!

S.C.
  • 4,984

2 Answers


Here's an alternative. Since $f'(x)\ge 0$, we know that $f$ is (weakly) increasing. Aiming for a contradiction, suppose $f$ is not strictly increasing. This means there are numbers $a<b$ such that $f(a)\ge f(b)$. But $f(a)\le f(b)$ since $f$ is increasing. Hence, $f(a)=f(b)$ and since $f$ is monotonic, $f(x)=f(a)=f(b)$ for all $x\in[a,b]$. So we found a non-trivial interval where $f$ is constant. Its derivative there is $0$. Contradiction.
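To make the final step explicit: for any $x$ in the open interval $(a,b)$, $f$ is constant on a neighborhood of $x$, so

$$f'(x)=\lim_{h \to 0}\frac{f(x+h)-f(x)}{h}=\lim_{h \to 0}\frac{0}{h}=0.$$

Hence every point of $(a,b)$ is a zero of $f'$, and none of these zeros is isolated, contradicting the hypothesis.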

bjorn93
  • 6,787
  • I assume by "its derivative there is $0$" you mean that, for any $x \in [a,b]: f'(x)=0$...thus contradicting the assumption that $f'$ only has isolated zeros. Nice. Cheers~ – S.C. Dec 10 '21 at 02:45

In the original proof, I claimed the following:

Let $a$ be any element where $f'(a) \gt 0$, i.e. $\displaystyle \lim_{x \to a}\frac{f(x)-f(a)}{x-a} \gt 0$. This implies that there is some punctured neighborhood $(a-\delta_a,a+\delta_a)\setminus \{a\}$ around $a$ such that for any $x$ in this punctured neighborhood: $\frac{f(x)-f(a)}{x-a} \gt 0 \quad (*)$. Suppose that $x$ is to the left of $a$. Therefore, $x-a \lt 0$. In order to satisfy $(*)$, we need $f(x)-f(a) \lt 0$, i.e. $f(x) \lt f(a)$. Similarly, if $x$ is to the right of $a$, then $f(x) \gt f(a)$. Therefore, on $(a-\delta_a,a+\delta_a)$, $f$ is strictly increasing.

This is faulty reasoning. The above paragraph only shows that for some $\delta \gt 0: x \in (a-\delta,a) \rightarrow f(x) \lt f(a) \text{ and } x \in (a, a+\delta) \rightarrow f(x) \gt f(a) \quad \color{red}{(\dagger)}$

But the definition of strictly increasing on $(a-\delta,a+\delta)$ reads as:

For any $x_1 \lt x_2 \in (a-\delta,a+\delta): f(x_1) \lt f(x_2)$.

Suppose we let $x_1 \lt x_2$ be any two elements of $(a-\delta,a)$. Our claim $\color{red}{(\dagger)}$ does not give us any insight into the relationship between $f(x_1)$ and $f(x_2)$. For example, it could very well be the case that $f(x_1) \geq f(x_2)$, which prevents us from claiming that $f$ is strictly increasing over the interval.
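As a concrete illustration of this gap (an example added here for emphasis; it necessarily violates our $f'(x) \geq 0$ hypothesis, which is exactly what we use below to close the gap), consider

$$g(x) = x + 2x^2\sin\left(\tfrac{1}{x}\right) \ \ (x \neq 0), \qquad g(0)=0.$$

Then $g'(0)=1 \gt 0$, so $\color{red}{(\dagger)}$ holds at $a=0$ for some $\delta$, yet $g'\!\left(\pm\tfrac{1}{2\pi n}\right) = 1 + \tfrac{2}{\pi n}\sin(2\pi n) - 2\cos(2\pi n) = -1 \lt 0$ for every positive integer $n$, so $g$ is not monotone on any neighborhood of $0$: one can find $x_1 \lt x_2$ in $(-\delta,0)$ (and likewise in $(0,\delta)$) with $g(x_1) \gt g(x_2)$.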

However, given our two starting assumptions: Suppose $f'(x) \geq 0$ for any $x$. Further, suppose that if $f'(x)=0$, then $x$ is an isolated zero of $f'$...

we can show that $f(x_1) \geq f(x_2)$ is impossible.


By assumption, we know that $f$ is continuous and differentiable on $[a-\delta,a+\delta]$ with $f'(a) \gt 0$. Suppose, for contradiction, that there are $x_1 \lt x_2$ in $(a-\delta,a)$ with $f(x_1) \geq f(x_2)$. By the mean value theorem, if $f(x_1) \gt f(x_2)$, then there is an $x^* \in (x_1,x_2)$ such that $f'(x^*) \lt 0$. This would violate our assumption that $f'(x) \geq 0$ for all $x$. So suppose $f(x_1) = f(x_2)$. Because every $x \in (x_1,x_2)$ must satisfy $f'(x) \not\lt 0$, $f$ is weakly increasing on $[x_1,x_2]$, so $f(x_1) \leq f(x) \leq f(x_2) = f(x_1)$ for every $x$ in that interval; the only way for $f(x_1)=f(x_2)$ is for $f$ to be constant on the entire interval $[x_1,x_2]$. Then $f'=0$ on $(x_1,x_2)$, which violates our assumption of isolated zeros.
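Written out, the mean value theorem step is the observation that for some $x^* \in (x_1,x_2)$,

$$f'(x^*) = \frac{f(x_2)-f(x_1)}{x_2-x_1},$$

and if $f(x_1) \gt f(x_2)$ the right-hand side has a negative numerator over a positive denominator, forcing $f'(x^*) \lt 0$.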

Therefore, for any $x_1 \lt x_2 \in (a-\delta,a): f(x_1) \lt f(x_2)$. A similar argument works for $x_1 \lt x_2 \in (a,a+\delta)$. Combining this result with $\color{red}{(\dagger)}$, we can conclude that there is a neighborhood around $a$ where $f$ is strictly increasing.

Importantly, we did not require $f'$ to be continuous at $a$ in order to demonstrate our claim. Consider, for example, the function $f(x)=\alpha\cdot x +x^2\sin(\frac{1}{x})$ (with $f(0)=0$), where $\alpha \gt 1$. This function's derivative is not continuous at $0$, and yet we can always find a small enough neighborhood of $0$ on which the derivative is $\geq 0$.
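For completeness, a quick computation (taking $f(0)=0$ as above): for $x \neq 0$,

$$f'(x) = \alpha + 2x\sin\!\left(\tfrac{1}{x}\right) - \cos\!\left(\tfrac{1}{x}\right), \qquad f'(0) = \lim_{h \to 0}\frac{\alpha h + h^2\sin(1/h)}{h} = \alpha.$$

The oscillating $\cos\left(\frac{1}{x}\right)$ term keeps $f'$ from having a limit at $0$, so $f'$ is not continuous there, while for $|x| \lt \frac{\alpha-1}{2}$ we get $f'(x) \geq \alpha - 2|x| - 1 \gt 0$.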

S.C.
  • 4,984