Given that a function $f$ has a continuous second derivative on the interval $[0,1]$, $f(0)=f(1)=0$, and $|f''(x)|\leq 1$, show that $$\left|\int_{0}^{1}f(x)\,dx\right|\leq \frac{1}{12}\,.$$
My attempt: This looks like a maximization/minimization problem. Since the largest value $f''(x)$ can take is $1$, the first case is to assume $f''(x)=1$: this is the extreme allowed concavity, so the graph should enclose the most area over $[0,1]$ while still satisfying the given conditions.
Edit: By Rolle's Theorem (since $f(0)=f(1)=0$), there exists $c\in(0,1)$ with $f'(c)=0$, so any interior extremum of $f$ occurs at such a critical point; the extrema of $f$ on $[0,1]$ could also occur at the endpoints.
Then $f'(x)=x+b$ and $f(x)=\frac{x^2}{2}+bx+c$. Since $f(0)=0$, we get $c=0$, and since $f(1)=0$, we get $b=-\frac{1}{2}$. Remark: any function with a constant second derivative has the form $ax^2+bx+c$, and in this case $a=-b$ and $c=0$. Now, $$\begin{align*}\int_{0}^{1}f(x)\,dx&=\frac{1}{2}\int_{0}^{1}(x^2-x)\,dx\\&=\frac{1}{2}\bigg[\frac{x^3}{3}-\frac{x^2}{2}\bigg]_{x=0}^{x=1}\\&=-\frac{1}{12}\end{align*}$$
Next, assume $f''(x)=-1$; repeating the process yields $$\begin{align*}\int_{0}^{1}f(x)\,dx&=\frac{1}{2}\int_{0}^{1}(-x^2+x)\,dx\\&=\frac{1}{2}\bigg[\frac{-x^3}{3}+\frac{x^2}{2}\bigg]_{x=0}^{x=1}\\&=\frac{1}{12}\end{align*}$$ Thus, at the upper and lower bounds for $f''(x)$, we have $-\frac{1}{12}\leq\int_{0}^{1}f(x)\,dx\leq \frac{1}{12}$, i.e. $\left|\int_{0}^{1}f(x)\,dx\right|\leq\frac{1}{12}$, because $f''(x)$ is continuous on $[0,1]$.
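As a quick sanity check of the arithmetic (not part of the proof), here is a small sympy sketch that recomputes the two integrals above and also tries one extra admissible function of my own choosing, $f(x)=\frac{\sin(\pi x)}{\pi^2}$, which satisfies $f(0)=f(1)=0$ and $|f''(x)|=|\sin(\pi x)|\leq 1$:

```python
# A sketch, assuming sympy is available: verify the two integrals computed above
# and test one additional admissible function (my own choice) against the bound 1/12.
import sympy as sp

x = sp.symbols('x')

candidates = {
    "f''(x) =  1": (x**2 - x) / 2,                         # case worked out above
    "f''(x) = -1": (-x**2 + x) / 2,                        # second case above
    "f''(x) = -sin(pi*x)": sp.sin(sp.pi * x) / sp.pi**2,   # extra example with |f''| <= 1
}

for label, f in candidates.items():
    # boundary conditions f(0) = f(1) = 0
    assert sp.simplify(f.subs(x, 0)) == 0 and sp.simplify(f.subs(x, 1)) == 0
    integral = sp.integrate(f, (x, 0, 1))
    within_bound = sp.Abs(integral) <= sp.Rational(1, 12)
    print(f"{label}:  integral = {integral},  |integral| <= 1/12: {within_bound}")
```

The two extremal cases give exactly $\mp\frac{1}{12}$, and the extra example comes out strictly inside the bound ($2/\pi^3\approx 0.064 < \tfrac{1}{12}$), which at least agrees with the claimed inequality.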
I was wondering if this was 'rigorous' enough to be considered a full proof and solution to the problem.