How do we prove the right Riemann sum error bound? On Wikipedia (https://en.wikipedia.org/wiki/Riemann_sum#Right_Riemann_sum) the following bound is stated, but without proof: $$\left | \int_a^bf(x)\,dx - A_{\text{right}}\right | \leq \frac{M_1(b-a)^2}{2n},$$ where $M_1$ is the maximum of $|f'|$ on $[a,b]$ and $n$ is the number of subintervals.
2 Answers
This could be an exercise when first learning definite integrals. Here is a demonstration.
For convenience, suppose $f$ is differentiable on $[a,b]$ with $|f'| \leqslant M_1$ there. Let $x_j = a + j \varDelta x$, where $\varDelta x = (b-a)/n$, for $j = 0, 1, 2,\dots, n$. Then $$ A _{\text{right}} = \sum_1^n f(x_j) \varDelta x. $$ Thus \begin{align*} \newcommand\Abs[1]{\left\vert #1 \right \vert} &\quad \Abs {-\int_a^b f(x) \,\mathrm dx + A_{\text{right}} } \\ &= \Abs {-\int_a^b f(x) \,\mathrm dx + \sum_1^n f(x_j) (x_{j} - x_{j-1})} \\ &= \Abs {-\sum_1^n \int_{x_{j-1}}^{x_j} f(x) \,\mathrm dx + \sum_1^n \int_{x_{j-1}}^{x_j}f(x_j) \,\mathrm dx } \\ &= \Abs {\sum_1^n \int_{x_{j-1}}^{x_j} (f(x_j) - f(x) )\,\mathrm dx } \\ &= \Abs { \sum_1^n \int_{x_{j-1}}^{x_j} f'(\eta_j(x)) (x_j - x)\,\mathrm dx } \quad [\text{Lagrange's MVT, applied at each } x], \end{align*} where $\eta_j(x) \in (x, x_j)$ for each $j$ and each $x \in [x_{j-1}, x_j)$.
Now using $-M_1 \leqslant f' \leqslant M_1$ and noticing that $x_j - x \geqslant 0$ on $[x_{j-1}, x_j]$, we have $$ -M_1 (x_j - x) \leqslant f'(\eta_j(x))(x_j -x) \leqslant M_1 (x_j -x), $$ hence $$ -M_1 \int_{x_{j-1}}^{x_j} (x_j - x) \,\mathrm dx \leqslant \int_{x_{j-1}}^{x_j} f'(\eta_j(x)) (x_j - x)\,\mathrm dx = \int_{x_{j-1}}^{x_j} (f(x_j) - f(x))\,\mathrm dx \leqslant M_1 \int_{x_{j-1}}^{x_j} (x_j - x) \,\mathrm dx, $$ i.e. $$ \Abs { \int_{x_{j-1}}^{x_j} (f(x_j) - f(x))\,\mathrm dx } \leqslant M_1 \int_{x_{j-1}}^{x_j} (x_j -x)\,\mathrm dx = M_1 \frac {(\varDelta x)^2}{2} = M_1 \frac {(b-a)^2}{2n^2}. $$ Therefore, by the triangle inequality, $$ \Abs {-\int_a^b f(x) \,\mathrm dx + A_{\text{right}} } \leqslant \sum_1^n \Abs { \int_{x_{j-1}}^{x_j} (f(x_j) - f(x))\,\mathrm dx } \leqslant n \cdot M_1 \frac {(b-a)^2}{2n^2} = \frac {M_1(b-a)^2}{2n}. $$
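If you want a quick numerical sanity check of this bound, here is a short Python sketch (the choice $f(x) = \sin x$ on $[0, \pi/2]$, with $M_1 = \max|\cos x| = 1$ and exact integral $1$, is just one convenient example; the function and variable names are mine):

```python
import math

def right_riemann_sum(f, a, b, n):
    """Right Riemann sum A_right = sum_{j=1}^n f(x_j) * dx with x_j = a + j*dx."""
    dx = (b - a) / n
    return sum(f(a + j * dx) for j in range(1, n + 1)) * dx

# Example: f(x) = sin(x) on [0, pi/2]; exact integral is 1 and M1 = max|cos| = 1.
a, b, M1, exact = 0.0, math.pi / 2, 1.0, 1.0
for n in (10, 100, 1000):
    error = abs(exact - right_riemann_sum(math.sin, a, b, n))
    bound = M1 * (b - a) ** 2 / (2 * n)
    print(f"n = {n:5d}   error = {error:.6f}   bound = {bound:.6f}   error <= bound: {error <= bound}")
```

Both columns shrink like $1/n$, and the actual error stays below $\frac{M_1(b-a)^2}{2n}$ for every $n$, as the estimate above predicts.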

- your answer looks good but can you tell me how $-\int_a^bf\left(x\right)dx+A_{\text{right}}$ is equal to $\sum_{j=1}^n\int_{x_{j-1}}^{x_j}\left(f\left(x_j\right)-f\left(x\right)\right)dx$? – Aug 28 '19 at 15:54
- @Grothendieck I've edited. How about now? – xbh Aug 28 '19 at 16:03
- does $\sum_{j=1}^n\int_{x_{j-1}}^{x_j}f\left(x_j\right)dx$ mean the sum of the areas of the rectangles with width $x_j-x_{j-1}$ and height $f(x_j)$? – Aug 28 '19 at 16:19
- @Grothendieck Yes, that's it. – xbh Aug 28 '19 at 16:20
- I got it. Another question: how does $M_1\int_{x_{j-1}}^{x_j}\left(x_j-x\right)dx$ equal $M_1\ \frac{\left(b-a\right)^2}{2n^2}$? – Aug 28 '19 at 16:24
- @Grothendieck Direct computation of the integral by the fundamental theorem of calculus. – xbh Aug 28 '19 at 16:26
This result is stated for a monotone function and uniformly spaced points. Assume first that $f$ is non-decreasing (the proof for $f$ non-increasing is similar).
With a uniform partition $x_j = a + j\frac{b-a}{n}$, we can apply the mean value theorem to obtain
$$f(x_j) = f(x) + f'(\xi_j(x))(x_j - x)$$
where $\xi_j(x)$ is between $x$ and $x_j$.
Integrating over $[x_{j-1}, x_j]$ and using $0 \leqslant f' \leqslant M$, where $M$ denotes the maximum of $f'$ on $[a,b]$, we get the local error bound
$$f(x_j)(x_j - x_{j-1}) - \int_{x_{j-1}}^{x_j} f(x) \, dx = \int_{x_{j-1}}^{x_j} f'(\xi_j(x))(x_j - x) \, dx \leqslant M \int_{x_{j-1}}^{x_j} (x_j - x) \, dx = M \frac{(x_j - x_{j-1})^2}{2} = M \frac{(b-a)^2}{2n^2}$$
and summing over $j$ gives the global error bound
$$\sum_{j=1}^nf(x_j)(x_j - x_{j-1}) - \int_{a}^{b} f(x) \, dx = \sum_{j=1}^nf(x_j)(x_j - x_{j-1}) - \sum_{j=1}^n\int_{x_{j-1}}^{x_j} f(x) \, dx \\ \leqslant \sum_{j=1}^n M \frac{(b-a)^2}{2n^2} = M \frac{(b-a)^2}{2n}. $$
Since $f$ is non-decreasing, every summand on the left is nonnegative, so the left-hand side equals $\left|\int_a^b f(x)\,dx - A_{\text{right}}\right|$, which gives the stated bound.
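As a quick numerical check of the monotone case, here is a small Python sketch (the choice $f(x) = e^x$ on $[0,1]$, non-decreasing with $M = \max f' = e$ and exact integral $e - 1$, is just an illustrative example; the names are mine). It shows that the signed error $A_{\text{right}} - \int_a^b f(x)\,dx$ is nonnegative and stays under $\frac{M(b-a)^2}{2n}$:

```python
import math

def right_riemann_sum(f, a, b, n):
    """Right Riemann sum with uniformly spaced points x_j = a + j*(b - a)/n."""
    dx = (b - a) / n
    return sum(f(a + j * dx) for j in range(1, n + 1)) * dx

# Non-decreasing example: f(x) = exp(x) on [0, 1]; exact integral e - 1, M = max f' = e.
a, b = 0.0, 1.0
exact = math.e - 1.0
M = math.e
for n in (10, 100, 1000):
    signed_error = right_riemann_sum(math.exp, a, b, n) - exact
    bound = M * (b - a) ** 2 / (2 * n)
    print(f"n = {n:5d}   A_right - integral = {signed_error:.6f}   bound = {bound:.6f}")
```

The signed error is positive (the right sum overestimates an increasing function) and decreases like $1/n$, consistent with the bound above.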

- Thanks for the explanation. One doubt though: in the Taylor linear approximation of $f(x_j)$, what's the domain of $x$? – jnxd Jan 21 '19 at 23:24
- I might be less confusing if I refer to the justification as the mean value theorem -- which is exact for any $x, x_j \in [a,b]$ assuming $f$ is differentiable in $[a,b]$. I'm sure you know that $f(x) - f(y) = f'(c)(x-y)$ where $c$ is between $x$ and $y$. – RRL Jan 22 '19 at 00:36