I'm trying to formally prove this:
Let $P=\{a=x_0<x_1<...<x_n=b\}$ be a partition of $[a,b]$ that divides $[a,b]$ into $n$ equal subintervals:
$x_i=a+\frac{i}{n}(b-a)\ \ \forall i\in\{0,1,...,n\}$
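In particular, every subinterval of this partition has the same length, namely
$x_i-x_{i-1}=\frac{b-a}{n}\ \ \text{for } i\in\{1,...,n\}.$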
if:
$|U_{f,P}-L_{f,P}|<\epsilon\ \ $ (where $U_{f,P},L_{f,P}$ are the upper and lower Darboux sums of $f$ with respect to $P$),
then any partition $\tilde{P}$ of $[a,b]$ with $\lambda(\tilde{P})<\lambda(P)$ (where $\lambda(P)$ denotes the length of the longest subinterval of $P$, i.e. the mesh of $P$)
will also give:
$|U_{f,\tilde{P}}-L_{f,\tilde{P}}|<\epsilon.$
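(To fix notation: for a general partition $Q=\{a=t_0<t_1<...<t_m=b\}$ I am using the usual Darboux sums
$U_{f,Q}=\sum_{j=1}^{m}M_j(t_j-t_{j-1}),\ \ L_{f,Q}=\sum_{j=1}^{m}m_j(t_j-t_{j-1}),$
where $M_j=\sup_{[t_{j-1},t_j]}f$ and $m_j=\inf_{[t_{j-1},t_j]}f$, and $f$ is assumed bounded on $[a,b]$ so that these sums are finite.)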
This makes sense intuitively: since $\lambda(\tilde{P})<\lambda(P)=\frac{b-a}{n}$, the partition $\tilde{P}$ must have more division points than $P$, and more points should drive $|U_{f,\tilde{P}}-L_{f,\tilde{P}}|$ down. The difficulty is that $\tilde{P}$ need not be a refinement of $P$ (it may contain none of the points $x_i$), so I can't compare the two partitions directly, and I'm having trouble turning this intuition into a proof.
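For reference, the standard fact my intuition rests on is the refinement inequality: if $Q\subseteq Q'$ (i.e. $Q'$ is a refinement of $Q$), then
$L_{f,Q}\le L_{f,Q'}\le U_{f,Q'}\le U_{f,Q},$
so in particular $U_{f,Q'}-L_{f,Q'}\le U_{f,Q}-L_{f,Q}$. This settles the special case where $\tilde{P}$ happens to contain every $x_i$, but not the general case where the two partitions share no points.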
Can anyone give me any suggestions on this?