Let $f \colon \mathbb{R} \to \mathbb{R}$ be an infinitely differentiable function, and suppose that for some $n \geq 1$,
$$f(1)=f(0)=f'(0)=f''(0)=f'''(0)=\dots=f^{(n)}(0)=0.$$
I have to prove that there exists $x \in (0,1)$ such that $f^{(n+1)}(x)=0$.
Here is what I did:
Since $f(1)=f(0)$, the Mean Value Theorem (MVT) says that there exists at least one $x_1 \in (0,1)$ such that $f'(x_1)=0$.
Since $f'(0)=f'(x_1)=0$, applying the MVT again, this time to $f'$ on $[0,x_1]$, we can say that there exists at least one $x_2 \in (0,x_1) \subset (0,1)$ such that $f^{(2)}(x_2)=0$.
Repeating this process, after $n+1$ applications in total we get an $x_{n+1} \in (0,1)$ such that $f^{(n+1)}(x_{n+1})=0$.
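To make the bookkeeping explicit, here is the same chain written out; the nested intervals are the point, since each new zero lives strictly inside the previous interval:
$$\begin{aligned}
f(0)=f(1)=0 &\implies \exists\, x_1 \in (0,1)\ \text{with}\ f'(x_1)=0,\\
f'(0)=f'(x_1)=0 &\implies \exists\, x_2 \in (0,x_1)\ \text{with}\ f''(x_2)=0,\\
&\ \ \vdots\\
f^{(n)}(0)=f^{(n)}(x_n)=0 &\implies \exists\, x_{n+1} \in (0,x_n)\ \text{with}\ f^{(n+1)}(x_{n+1})=0.
\end{aligned}$$
As a sanity check with an example of my own (not from the problem): take $n=1$ and $f(x)=x^2(x-1)$, so that $f(0)=f(1)=f'(0)=0$; then $f''(x)=6x-2$ indeed vanishes at $x=\tfrac{1}{3} \in (0,1)$.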
Here is the catch. The question mentioned above says only that $f$ is differentiable, and I assumed it to be continuous in order to apply the MVT.
I cannot think of any other way to prove this.