
I need to prove that if $f \in L^{1}(\mathbb{R})$ then $\int |f(x+t)-f(x)|dx \to 0$ as $t \to 0$. I thought about approximating $f$ with simple or continuous functions, but then realized I couldn't apply any of the standard convergence theorems directly because they're stated for sequences. I can prove $\int |f(x+\frac{1}{n})-f(x)|dx \to 0$ as $n \to \infty$, but I'm not sure how this would imply the desired statement. Are there generalized versions of the convergence theorems that apply to non-discrete indexing parameters?

1 Answer


To elaborate on Jonas Teuwen's comment, recall the following fact about real functions:

$f(t) \to b$ as $t \to a$ if and only if, for every sequence $t_n$ with $t_n \to a$ and $t_n \neq a$, we have $f(t_n) \to b$ as $n \to \infty$.
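For the direction you actually need here (sequential convergence implies the limit), a standard contrapositive sketch, using nothing beyond the $\varepsilon$–$\delta$ definition of the limit:

If $f(t) \not\to b$ as $t \to a$, there is an $\varepsilon > 0$ such that for every $n$ one can choose $t_n$ with
$$0 < |t_n - a| < \frac{1}{n} \quad\text{and}\quad |f(t_n) - b| \ge \varepsilon.$$
Then $t_n \to a$ with $t_n \neq a$, but $f(t_n) \not\to b$, so the sequential condition fails as well.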

So let $t_n$ be an arbitrary sequence with $t_n \to 0$. Show that $\int |f(x + t_n) - f(x)|dx \to 0$ as $n \to \infty$ (which is probably the same proof as your special case $t_n = 1/n$). Then by the above fact, you are done.
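In case the general-$t_n$ version of that step is the sticking point, here is the usual outline, assuming the standard fact that $C_c(\mathbb{R})$ (continuous functions with compact support) is dense in $L^1(\mathbb{R})$:

Given $\varepsilon > 0$, choose $g \in C_c(\mathbb{R})$ with $\|f - g\|_1 < \varepsilon/3$. Then
$$\int |f(x+t_n) - f(x)|\,dx \le \int |f(x+t_n) - g(x+t_n)|\,dx + \int |g(x+t_n) - g(x)|\,dx + \int |g(x) - f(x)|\,dx.$$
The first and third terms are each less than $\varepsilon/3$ (the first by translation invariance of Lebesgue measure). Since $g$ is uniformly continuous and, once $|t_n| \le 1$, the functions $x \mapsto |g(x+t_n) - g(x)|$ are all supported in a fixed compact set, the middle term is less than $\varepsilon/3$ for all large $n$. Hence the whole integral is less than $\varepsilon$ for large $n$, and nothing in this argument cares whether the parameter is $1/n$ or a general sequence $t_n \to 0$.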

Nate Eldredge