
I have this problem I'm working on. Hints are much appreciated (I don't want complete proof):

In a normed vector space, if $ x_n \longrightarrow x $ then $ z_n = \frac{x_1 + \dots +x_n}{n} \longrightarrow x $

I've been trying to add and subtract inside the norm, but I don't seem to get anywhere.

Thanks!

3 Answers


Given $ \epsilon >0$ there exists $ n_0 $ such that if $ n\geq n_0 $ then $\lVert x_n -x\rVert < \epsilon $,

so

\begin{align*} 0 & \leq \left\lVert \frac{x_1 +\cdots +x_n}{n} -x \right\rVert = \left\lVert \frac{x_1 + \dots + x_n - nx }{n} \right\rVert \\ & \leq \frac{\lVert x_1 - x \rVert}{n} + \dots + \frac{\lVert x_{n_0 - 1} - x \rVert}{n} + \frac{\lVert x_{n_0} - x \rVert}{n} +\dots + \frac{\lVert x_{n} - x \rVert}{n} \\ &\le \frac 1n\sum_{i=1}^{n_0-1} \lVert x_i -x\rVert + \frac{n-n_0+1}{n} \epsilon \le \frac 1n\sum_{i=1}^{n_0-1} \lVert x_i -x\rVert + \epsilon \end{align*}

Each of the first $n_0 -1$ terms $\| x_i -x\|$ can be bounded by some $M$, thus for $n\ge (n_0-1)M/\epsilon=: N_0$ we have $$\frac 1n\sum_{i=1}^{n_0-1} \| x_i -x\| \le \frac 1n (n_0-1)M \le \epsilon$$

Thus $$\left\| \frac{x_1 + \cdots + x_n}{n} - x\right\| \le 2\epsilon$$ when $n\ge \max(n_0, N_0)$.

Thanks a lot @Leonid Kovalev for the inspiration, though my main problem was that I wasn't aware of what to do with the $nx$ (the silliest part :P)
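As a numerical sanity check (illustrative only, not part of the proof): for a convergent sequence in $\mathbb R^2$ with the Euclidean norm, the running averages $z_n$ do approach the same limit, just more slowly. The example sequence $x_n = (1 + 1/n,\ 1/n)$ is my own choice, not from the thread.

```python
import math

def cesaro_averages(xs):
    """Return the running averages z_n = (x_1 + ... + x_n)/n of 2D points."""
    sx, sy = 0.0, 0.0
    out = []
    for n, (a, b) in enumerate(xs, start=1):
        sx, sy = sx + a, sy + b
        out.append((sx / n, sy / n))
    return out

def dist(p, q):
    """Euclidean norm of p - q."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

# x_n = (1 + 1/n, 1/n) converges to x = (1, 0).
xs = [(1 + 1 / n, 1 / n) for n in range(1, 10001)]
zs = cesaro_averages(xs)
x = (1.0, 0.0)

# The averages drift toward x; the error at n = 10000 is much smaller
# than at n = 100 (it decays roughly like log(n)/n here).
print(dist(zs[99], x), dist(zs[9999], x))
```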

  • 6
    This sort of thing is encouraged, I think. – Dylan Moreland Jun 11 '12 at 03:21
  • I think you want "$-nx$" in the first line of the display, and $|x_n - x|$ at the end. I don't think you want to say that the first few terms are $\leq M$; that doesn't seem to be enough. – Dylan Moreland Jun 11 '12 at 03:41
  • @DylanMoreland: Why isn't it enough? Can you explain? – Bouvet Island Jun 11 '12 at 14:56
  • 6
    @Inti: I think you are missing something in the argument. Generally the argument consists of two steps: first choose $n_0$ such that if $n \geq n_0$, $|x_n -x | < \epsilon / 2$. Next choose $N \geq n_0$ such that $M$ (which is at least $\sup_n |x_n|$) satisfies $M / N < \epsilon / 2$. I don't see the second part of the argument implemented in your answer. – Willie Wong Jun 11 '12 at 14:59
  • 5
    Should it not be $(n-n_0+1)\epsilon/n$? – EllipticalInitial Oct 14 '19 at 04:19
  • Regarding "The first $n_0−1$ terms $|x_i - x|$ can be bounded by some $M$" - with respect to $n$, the sum $\sum_{i=1}^{n_0-1}|x_i - x|$ is a constant, so you might as well just call it $M$. –  Dec 12 '20 at 00:09

There is a slightly more general claim:

PROP Let $\langle a_n\rangle$ be a sequence of real numbers, and define $\langle \sigma_n\rangle$ by $$\sigma_n=\frac 1 n\sum_{k=1}^n a_k$$

Then $$\liminf_{n\to\infty}a_n\leq \liminf_{n\to\infty}\sigma_n \leq \limsup_{n\to\infty}\sigma_n\leq \limsup_{n\to\infty}a_n$$

PROOF We prove the leftmost inequality. Let $\ell =\liminf_{n\to\infty}a_n$, and choose $\alpha <\ell$. By definition, there exists $N$ such that $$\alpha <a_{N+k}$$ for any $k=0,1,2,\ldots$ If $m>0$, then $$m\alpha <\sum_{k=1}^m a_{N+k}$$

which is $$m\alpha<\sum_{k=N+1}^{N+m}a_k$$

Adding $N\alpha+\sum_{k=1}^{N}a_k$ to both sides, $$(m+N)\alpha+\sum_{k=1}^{N}a_k<\sum_{k=1}^{N+m}a_k+N\alpha$$

which gives

$$\alpha+\frac{1}{m+N}\sum_{k=1}^{N}a_k<\frac{1}{m+N}\sum_{k=1}^{N+m}a_k+\frac{N}{m+N}\alpha$$

Since $N$ is fixed, taking $\liminf\limits_{m\to\infty}$ gives $$\alpha \leq \liminf\limits_{m \to \infty } \frac{1}{m}\sum\limits_{k = 1}^m {{a_k}} $$ (note that $N+m$ is just a shift, which doesn't alter the value of the $\liminf^{(1)}$). Thus, for each $\alpha <\ell$, $$\alpha \leq \liminf\limits_{m \to \infty } \frac{1}{m}\sum\limits_{k = 1}^m {{a_k}} $$ which means that $$\liminf_{n\to\infty}a_n\leq \liminf_{n\to\infty}\sigma_n$$ The rightmost inequality is proven in a completely analogous manner. $\blacktriangle$.

$(1)$: Note however, this is not true for "non shift" subsequences, for example $$\limsup_{n\to\infty}(-1)^n=1$$ but $$\limsup_{n\to\infty}(-1)^{2n+1}=-1$$

COR If $\lim a_n$ exists and equals $\ell$, so does $\lim \sigma_n$, and it also equals $\ell$. The converse is not true: for $a_n=(-1)^n$ the averages satisfy $\sigma_n\to 0$, yet $\lim a_n$ does not exist.
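A quick numerical illustration of the corollary's converse failing (this check is my own, not part of the proposition): $a_n=(-1)^n$ oscillates forever, yet its Cesàro means tend to $0$, squeezed between $\liminf a_n = -1$ and $\limsup a_n = 1$ as the proposition predicts.

```python
def cesaro_means(a):
    """Return the sequence of partial averages sigma_n = (a_1 + ... + a_n)/n."""
    s, out = 0.0, []
    for n, term in enumerate(a, start=1):
        s += term
        out.append(s / n)
    return out

# a_n = (-1)^n for n = 1..100000: the sequence itself has no limit.
a = [(-1) ** n for n in range(1, 100001)]
sigma = cesaro_means(a)

# The means collapse toward 0 even though a_n keeps jumping between -1 and 1.
print(sigma[0], sigma[99], sigma[-1])
```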


WLOG, the $x_n$ converge to $0$ (otherwise consider the differences $x_n-x$), so given $\epsilon>0$ they stay confined in an $\epsilon$-neighborhood of $0$ after $N=N_\epsilon$ terms.

Then the norm of the average of the first $m$ terms is bounded by

$$\frac{\|x_1\|+\cdots+\|x_N\|+(m-N)\epsilon}m,$$ which converges to $\epsilon$ as $m\to\infty$. Since $\epsilon$ was arbitrary, you can make the average as close to $0$ as you like.
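The bound in this answer can be checked numerically (an illustrative example of my own, with $x_n = 1/n$ and $\epsilon = 0.1$): past the index $N$ where $|x_n|<\epsilon$, the average of the first $m$ terms indeed stays below $(|x_1|+\cdots+|x_N|+(m-N)\epsilon)/m$, a quantity only slightly above $\epsilon$ for large $m$.

```python
# x_n = 1/n converges to 0; pick a tolerance eps and find the index N
# past which every term lies within eps of 0.
eps = 0.1
xs = [1.0 / n for n in range(1, 5001)]
N = next(i for i, v in enumerate(xs) if abs(v) < eps)

m = len(xs)
avg = abs(sum(xs)) / m                                  # |average of first m terms|
bound = (sum(abs(v) for v in xs[:N]) + (m - N) * eps) / m  # the answer's bound

# avg <= bound, and bound is barely above eps = 0.1 for this large m.
print(avg, bound)
```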