I am trying to compare the one-norm, two-norm, and infinity norm on C[0,1]. We are given the following:
For $f \in C[0,1]$:
$\|f\|_1 = \int\limits_0^1 |f(t)|\,dt, \hspace{2.5cm} \|f\|_\infty = \max_{t\in[0,1]} |f(t)|, \hspace{2.5cm} \|f\|_2 = \sqrt{\int\limits_0^1 |f(t)|^2\,dt}$
The question is: how can we prove that $\|f\|_1 \leq \|f\|_2 \leq \|f\|_\infty$ for all $f \in C[0,1]$?
It makes sense to me that $\|f\|_1 \leq \|f\|_\infty$ on $[0,1]$: the one-norm gives the area between the curve $f(t)$ and the $t$-axis, the infinity norm gives the maximum distance between $f$ and the $t$-axis, and over an interval of length 1 that area can be at most the maximum height. But how $\|f\|_2$ fits between the other two is what we really need to compare here, and it is what I have failed to wrap my head around.
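To convince myself the claimed ordering is at least plausible, I ran a quick numerical sanity check (not a proof): approximate all three norms with a midpoint Riemann sum on $[0,1]$ for a few sample functions of my choosing and compare them.

```python
# Numerical sanity check of ||f||_1 <= ||f||_2 <= ||f||_inf on [0,1]
# for a few hand-picked sample functions. An illustration only, not a proof.
import math

def norms(f, n=100_000):
    """Approximate ||f||_1, ||f||_2, ||f||_inf on [0,1] via a midpoint rule."""
    ts = [(k + 0.5) / n for k in range(n)]          # midpoints of n subintervals
    vals = [f(t) for t in ts]
    one = sum(abs(v) for v in vals) / n             # integral of |f|
    two = math.sqrt(sum(v * v for v in vals) / n)   # sqrt of integral of f^2
    inf = max(abs(v) for v in vals)                 # max of |f| on the grid
    return one, two, inf

# Sample functions: f(t) = t, a sine wave, and a sign-changing polynomial.
for f in (lambda t: t,
          lambda t: math.sin(2 * math.pi * t),
          lambda t: t * t - 0.5):
    n1, n2, ninf = norms(f)
    assert n1 <= n2 <= ninf
    print(f"{n1:.4f} <= {n2:.4f} <= {ninf:.4f}")
```

For $f(t) = t$, for instance, the exact values are $\|f\|_1 = 1/2$, $\|f\|_2 = 1/\sqrt{3}$, $\|f\|_\infty = 1$, and the printed approximations match those, consistent with the inequality I am trying to prove.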