9

If we differentiate the function $f(x)=x+\sin x$, we get $$f'(x)=1+\cos(x).$$ Hence $f'(x)$ varies from $0$ to $2$.

So I think it is a one-to-one function, because it is never decreasing and it is never constant on any interval.

But how do I prove that $f(x)$ is never strictly $0$ in an interval?

Yash Swaraj

5 Answers

1

Assume $f(a)=f(b)$ with $a<b$. By the Mean Value Theorem, $f(b)-f(a)=(b-a)f'(c)$ for some $c\in(a,b)$. With $f'\ge 0$, this only tells us that $f(b)\ge f(a)$. But we know that $f'$ does not vanish on whole intervals; in particular, for sufficiently small $h>0$ we have $f'(x)\ne 0$ for all $x\in(a,a+h)$ (even in the case that $f'(a)=0$). The same argument as above then gives $f(b)-f(a+h)\ge 0$ (as we may pick $h\le b-a$) and $f(a+h)-f(a)=hf'(c)$ with $c\in(a,a+h)$, but this time we know $f'(c)>0$, and thus $$f(b)\ge f(a+h)>f(a).$$
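Not part of the original argument, just a quick numerical sanity check of the conclusion, assuming $f(x)=x+\sin x$ and taking $a=\pi$, a point where $f'(a)=0$:

```python
# A minimal sketch, assuming f(x) = x + sin(x); a = pi is a zero of
# f'(x) = 1 + cos(x), yet f still increases strictly across it.
import math

def f(x):
    return x + math.sin(x)

a = math.pi      # f'(a) = 1 + cos(pi) = 0
h = 1e-3         # small h with 0 < h <= b - a
b = a + 1.0      # any b > a

print(f(a), f(a + h), f(b))
assert f(a) < f(a + h) <= f(b)   # f(b) >= f(a + h) > f(a)
```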

1

A direct way:

Let

$$x+\sin x=y+\sin y$$ with $x\ne y$.

Then $$x-y+\sin x-\sin y=0,$$

and by the sum-to-product formula,

$$x-y+2\cos\frac{x+y}2\sin\frac{x-y}2=0.$$

Dividing by $x-y$ (which is nonzero),

$$\cos\frac{x+y}2\,\frac{\sin\dfrac{x-y}2}{\dfrac{x-y}2}=\cos\frac{x+y}2\operatorname{sinc}\frac{x-y}2=-1.$$

The second factor is the cardinal sine, whose absolute value is less than $1$ for a nonzero argument, while $\left|\cos\frac{x+y}2\right|\le 1$, so the left-hand side is strictly greater than $-1$: reaching $-1$ is impossible, and the function is one-to-one.
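This is not part of the answer, but a quick numerical sanity check of that key fact (assuming NumPy is available): sampling many pairs $x\ne y$, the product $\cos\frac{x+y}2\operatorname{sinc}\frac{x-y}2$ stays strictly above $-1$.

```python
# A minimal sketch; note this is the *unnormalized* sinc(t) = sin(t)/t
# (numpy.sinc is the normalized sin(pi t)/(pi t), so we define our own).
import numpy as np

def sinc(t):
    t = np.asarray(t, dtype=float)
    out = np.ones_like(t)
    nz = t != 0
    out[nz] = np.sin(t[nz]) / t[nz]
    return out

rng = np.random.default_rng(0)
x = rng.uniform(-50, 50, 100_000)
y = rng.uniform(-50, 50, 100_000)
vals = np.cos((x + y) / 2) * sinc((x - y) / 2)
print(vals.min())        # gets close to -1 near x = y, but never reaches it
assert vals.min() > -1
```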

  • This is unreadable without parentheses... – Najib Idrissi May 11 '17 at 07:19
  • @NajibIdrissi: I thought so too in the past, but I have changed my mind. I have now switched to a different style, where the argument of the function is the atomic expression that follows. This is actually very common. If the argument isn't atomic, then I add parentheses. –  May 11 '17 at 07:22
1

I'm assuming you meant:

How do I prove that there is no interval of positive length on which $f'(x) = 0$ identically?

As you say, $f'(x) = 1 + \cos(x)$. The simplest solution is to find all the places where $f'(x) = 0$ and show that they form a set of isolated points.

$$ 1 + \cos(x) = 0 \iff \cos(x) = -1 \iff x \in \{(2z+1)\pi \mid z \in \mathbb{Z} \} $$

Clearly no subset of this solution set will form an interval (if you want to be really thorough, you can prove it by showing that any two distinct elements are at least $2\pi$ apart from one another).
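Not part of the answer, but a symbolic cross-check of this zero set (assuming SymPy is available):

```python
# A minimal sketch: solve 1 + cos(x) = 0 over the reals with sympy.
from sympy import symbols, cos, solveset, S

x = symbols('x', real=True)
zeros = solveset(1 + cos(x), x, domain=S.Reals)
print(zeros)   # ImageSet(Lambda(_n, 2*_n*pi + pi), Integers): the odd multiples of pi
# Consecutive zeros differ by 2*pi, so the zero set contains no interval.
```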

Philip C
0

$f(x)$ is continuous, $f'(x)\geq 0$, and $f'$ is never identically zero on any interval, so it's one-to-one! Yes, you are right!

Writing it formally: suppose it is not one-to-one, so there are $x\neq y$ with $f(x)=f(y)$.

W.l.o.g. assume $x<y$.

Since $f'$ vanishes only at isolated points, there is a subinterval of $(x,y)$ on which $f'$ is strictly greater than zero, so $f$ strictly increases on that subinterval; we also know $f$ never decreases, hence $f(y)>f(x)$ because $x<y$, which is a contradiction.
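Not part of the proof, but a quick numerical illustration of the strict increase (assuming NumPy is available):

```python
# A minimal sketch: f(x) = x + sin(x) is strictly increasing on a dense grid,
# even though the grid brackets several zeros of f'(x) = 1 + cos(x).
import numpy as np

xs = np.linspace(-20, 20, 100_001)
fs = xs + np.sin(xs)
assert np.all(np.diff(fs) > 0)   # every consecutive difference is strictly positive
print("strictly increasing on the sampled grid")
```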

Arpan1729
0

Yes, you're right: since $f'(x)\geq 0$ is continuous and does not stay at $0$ on any interval, the function is strictly increasing and hence injective.

We can see this by writing $f(b)-f(a)=\int_a^b f'(x)\,\mathrm dx$ (where $b>a$). If $f'(x)>0$ on $[a,b]$, then (because the derivative is continuous) it attains some minimum value $\delta>0$ on that interval, and you get $f(b)-f(a)\geq\delta(b-a)>0$.

If $f'(x)=0$ somewhere on that interval, just pick any $c,d$ with $a<c<d<b$ such that $f'(x)>0$ on $[c,d]$. Splitting up into three integrals you get $f(a)\leq f(c)<f(d)\leq f(b)$, so $f(a)<f(b)$.
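Not in the original answer, but a quick numerical sketch of this integral argument (assuming SciPy is available), with $f'(x)=1+\cos x$ and an interval straddling the zero of $f'$ at $\pi$:

```python
# A minimal sketch: integrate f'(x) = 1 + cos(x) over [pi - 1, pi + 1]; the
# interval contains the zero of f' at pi, yet f(b) - f(a) is strictly positive.
import math
from scipy.integrate import quad

fprime = lambda t: 1 + math.cos(t)
a, b = math.pi - 1, math.pi + 1
increase, _err = quad(fprime, a, b)   # equals f(b) - f(a) = 2 - 2*sin(1)
print(increase)                        # about 0.317
assert increase > 0
```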

  • Please explain how it is strictly increasing if it equals zero at a point. – Yash Swaraj May 11 '17 at 06:57
  • If you have $a$ and $b$ either side of a point at which it is zero, you can choose some $c,d$ with $a<c<d<b$ so that there are no zeroes in $[a,c]$ or $[d,b]$. Then you can say it is strictly increasing between $a$ and $c$, and between $d$ and $b$, and it certainly doesn't decrease in the middle, so it is strictly larger at $b$ than $a$. – Especially Lime May 11 '17 at 07:10
  • @yashswaraj: yes, it is, but only because it equals zero at isolated points. Roughly speaking, strictly increasing means that if $a > b$, then $f(a) > f(b)$. Let $c$ be a point at which $f'(c) = 0$. Then apply the MVT on $[c, c+h]$ for small $h > 0$ to see that $f(c + h) > f(c)$, implying that it is strictly increasing – infinitylord May 11 '17 at 07:20