I had this problem on my midterm exam last week and I still can't solve it:
Is it true that if $f \in L^{1}(\mathbb{R})$ (that is, $f: \mathbb{R} \rightarrow \mathbb{R}$ is Lebesgue measurable and $\int |f| \, d\lambda < \infty$, where $\lambda$ is the Lebesgue measure), and $(a_{n}) \subset \mathbb{R}$ is a sequence such that $a_n \rightarrow 0$, then
$$ \lim_{n \rightarrow \infty} \int \lvert f(x + a_n) - f(x) \rvert \, d\lambda(x) = 0 \, ?$$

This is easy to verify if $f$ is continuous $\lambda$-almost everywhere. For the general case I tried to apply the Dominated Convergence Theorem to $f_n := f(x + a_n) - f(x)$, but it seems that $f_n \rightarrow 0$ $\lambda$-a.e. can fail, even when $f = \chi_E$ is the characteristic function of a Lebesgue measurable set $E \in \mathcal{L}$. Could someone please give me some pointers on this problem? Thanks in advance!
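As a sanity check (my own side computation, for the simplest choice $E = [0,1]$, which is not part of the exam problem), the claim does hold for this particular characteristic function, since the integral can be computed directly rather than via pointwise convergence:
$$ \int \bigl\lvert \chi_{[0,1]}(x + a_n) - \chi_{[0,1]}(x) \bigr\rvert \, d\lambda(x) = \lambda\bigl( [-a_n, 1 - a_n] \,\triangle\, [0,1] \bigr) = 2 \min(|a_n|, 1) \xrightarrow[n \to \infty]{} 0, $$
because $\chi_{[0,1]}(x + a_n) = \chi_{[-a_n,\, 1 - a_n]}(x)$ and the symmetric difference of the two intervals has measure $2|a_n|$ whenever $|a_n| \le 1$. So the difficulty is not that the statement fails for characteristic functions, but that the pointwise convergence needed for DCT can fail.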
My background: we are following Folland's Real Analysis book; the last topic we covered was modes of convergence.