28

While looking over my notes, I found that my lecturer had stated the following inequality: $$\|x\|_2 \leq \|x\|_1 \leq \sqrt{n}\,\|x\|_2$$ where $x \in \mathbb{R}^n$. No proof was given, and I've been trying to prove it for a while now. I know the definitions of the $1$- and $2$-norms, and numerically the inequality seems obvious, but I don't know where to start rigorously.

Thank you.

DMcMor
  • 9,407
the man
  • 2,404

5 Answers

27

We will show the more general case, i.e.:

$\| \cdot \|_1$, $\| \cdot \|_2$, and $\| \cdot \|_{\infty}$ are all equivalent on $\mathbb{R}^{n}$, and $$\| x \|_{\infty} \leq \| x \|_{2} \leq \| x \|_{1} \leq n \| x \|_{\infty}.$$

Every $x \in \mathbb{R}^{n}$ has the representation $x = ( x_1 , x_2 , \dots , x_n )$ with respect to the canonical basis $e_1, \dots, e_n$ of $\mathbb{R}^{n}$, where $e_i = (0, \dots , 0 , 1 , 0 , \dots , 0 )$ has a $1$ in the $i^{\text{th}}$ position and $0$ elsewhere. We have $$\| x \|_{\infty} = \max_{1\leq i \leq n} | x_i | = \max_{1\leq i \leq n} \sqrt{ | x_i |^{2} } \leq \sqrt{ \sum_{i=1}^{n} | x_ i |^{2} } = \| x \|_2 .$$ Additionally, since the square root is subadditive, $$ \| x \|_2 = \sqrt{ \sum_{i=1}^{n} | x_i |^{2} } \leq \sum_{i=1}^{n} \sqrt{ | x_ i |^{2} } = \sum_{i=1}^{n} |x_i| = \| x \|_1.$$ Finally, $$ \| x \|_1 = \sum_{i=1}^{n} |x_i| \leq \sum_{i=1}^{n} \max_{1 \leq j \leq n} | x_j | = n \max_{1 \leq j \leq n} | x_j | = n \| x \|_{\infty},$$ showing the chain of inequalities as desired. In particular, if $\| x - x_k \| \to 0$ as $k \to \infty$ in any one of these norms, the inequalities force the same convergence in the other two. This is exactly what it means for the norms to be equivalent.
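As a quick numerical sanity check of the chain above (not part of the proof, and assuming NumPy is available), one can verify $\| x \|_{\infty} \leq \| x \|_{2} \leq \| x \|_{1} \leq n \| x \|_{\infty}$ on random vectors:

```python
import numpy as np

# Check ||x||_inf <= ||x||_2 <= ||x||_1 <= n * ||x||_inf on random vectors.
# A tiny tolerance absorbs floating-point rounding.
rng = np.random.default_rng(0)
tol = 1e-12
for _ in range(1000):
    n = int(rng.integers(1, 21))
    x = rng.standard_normal(n)
    norm_inf = np.abs(x).max()
    norm_2 = np.sqrt(np.sum(x**2))
    norm_1 = np.abs(x).sum()
    assert norm_inf <= norm_2 + tol
    assert norm_2 <= norm_1 + tol
    assert norm_1 <= n * norm_inf + tol
```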

Dragonite
  • 2,388
  • 2
    If I wanted to more explicitly show that $\|\cdot\|_1$ was equivalent to $\|\cdot\|_2$, could I use the fact that $\|x\|_1 \le n\|x\|_\infty \iff \frac1n \|x\|_1 \le \|x\|_\infty$ to deduce that $\frac1n \|x\|_1 \le \|x\|_2 \le \|x\|_1$? Or would I need to do something more explicit? – Jeremy Jeffrey James Mar 13 '18 at 22:05
  • 19
    @JeremyJeffreyJames This should work:

    $$ \| x \|_2 = \sqrt{ \sum_{i=1}^{n} | x_i |^{2} } \leq \sum_{i=1}^{n} \sqrt{ | x_i |^{2} } = \sum_{i=1}^{n} |x_i| = \| x \|_1.$$ Then, using the Cauchy–Schwarz inequality we get for all $x\in\mathbb{R}^n$: $$ \Vert x\Vert_1 = \sum\limits_{i=1}^n|x_i| = \sum\limits_{i=1}^n|x_i|\cdot 1 \leq \left(\sum\limits_{i=1}^n|x_i|^2\right)^{1/2}\left(\sum\limits_{i=1}^n 1^2\right)^{1/2} = \sqrt{n}\,\Vert x\Vert_2. $$

    – Dragonite Mar 19 '18 at 12:50
  • 3
    @Dragonite it is not clear how you get $\sqrt{ \sum_{i=1}^{n} | x_i |^{2} } \leq \sum_{i=1}^{n} \sqrt{ | x_ i |^{2} }$. Could you please offer more details? – johnny09 Apr 20 '19 at 23:42
  • 5
    @johnny09 see clark's answer below, the square root is subadditive. – dasWesen May 10 '20 at 09:05
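The Cauchy–Schwarz step in the comments above is easy to check numerically; the sketch below (assuming NumPy, not part of the original answer) also confirms that the constant $\sqrt{n}$ is attained at the all-ones vector:

```python
import numpy as np

# Check ||x||_1 <= sqrt(n) * ||x||_2 (Cauchy-Schwarz against the
# all-ones vector) on random vectors.
rng = np.random.default_rng(1)
for _ in range(1000):
    n = int(rng.integers(1, 21))
    x = rng.standard_normal(n)
    assert np.abs(x).sum() <= np.sqrt(n) * np.linalg.norm(x) + 1e-12

# Equality holds when x itself is the all-ones vector.
ones = np.ones(5)
assert np.isclose(np.abs(ones).sum(), np.sqrt(5) * np.linalg.norm(ones))
```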
20

The inequality $ \|x\|_1 \leq \sqrt{n} \,\|x\|_2 $ is a consequence of the Cauchy–Schwarz inequality. To see this:

$$\sqrt n\, \|x\|_2 =\sqrt{1+1+\cdots+1}\,\sqrt{\sum_{i} x_i^2 }\geq \|x\|_1$$

For the first inequality, note that the function $f(t)=\sqrt{t}$ is concave with $f(0)=0$, hence $f$ is subadditive.

Therefore $ \|x\|_2 = f\left(\sum_{i} x_i^2 \right)\leq \sum_{i} f(x_i^2) = \sum_i |x_i| = \|x\|_1 $
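The subadditivity of the square root can be illustrated numerically; a short sketch (assuming NumPy, not part of the original answer):

```python
import numpy as np

# sqrt is subadditive on [0, inf): sqrt(a + b) <= sqrt(a) + sqrt(b).
# Applied to the sum of squares, this gives ||x||_2 <= ||x||_1.
rng = np.random.default_rng(2)
for _ in range(1000):
    a, b = rng.uniform(0, 100, size=2)
    assert np.sqrt(a + b) <= np.sqrt(a) + np.sqrt(b) + 1e-12

    x = rng.standard_normal(int(rng.integers(1, 21)))
    assert np.sqrt(np.sum(x**2)) <= np.sum(np.sqrt(x**2)) + 1e-12
```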

clark
  • 15,327
5

Another way to show that $||x||_1 \leq \sqrt{n}||x||_2$ is as follows:

\begin{align*} \begin{array}{c} \displaystyle 0\leq \sum_{i<j}\big(|x_i|-|x_j|\big)^2 = (n-1)\sum_{i=1}^{n} |x_i|^2 - 2\sum_{i<j}|x_i||x_j| \\ \displaystyle\Rightarrow\quad 2\sum_{i<j}|x_i||x_j|\leq (n-1)\sum_{i=1}^{n} |x_i|^2 \\ \displaystyle\Rightarrow\quad \sum_{i=1}^{n} |x_i|^2 + 2\sum_{i<j}|x_i||x_j|\leq n\sum_{i=1}^{n} |x_i|^2 \\ \displaystyle\Rightarrow\quad \left( \sum_{i=1}^n |x_i| \right)^2 \leq n\sum_{i=1}^{n} |x_i|^2 \\ \displaystyle\Rightarrow \quad ||x||_1\leq \sqrt{n}||x||_2 \end{array} \end{align*}
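The key expansion in this argument, $\left(\sum_i |x_i|\right)^2 = \sum_i |x_i|^2 + 2\sum_{i<j}|x_i||x_j|$, and the resulting bound can be checked numerically; a sketch (assuming NumPy, not part of the original answer):

```python
import numpy as np
from itertools import combinations

# Verify (sum a_i)^2 = sum a_i^2 + 2 * sum_{i<j} a_i a_j for a_i = |x_i|,
# and hence (||x||_1)^2 <= n * ||x||_2^2.
rng = np.random.default_rng(3)
n = 6
for _ in range(200):
    a = np.abs(rng.standard_normal(n))
    cross = sum(a[i] * a[j] for i, j in combinations(range(n), 2))
    assert np.isclose(a.sum() ** 2, np.sum(a**2) + 2 * cross)
    assert a.sum() ** 2 <= n * np.sum(a**2) + 1e-12
```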

3

$\lVert x \rVert_2 \le \lVert x \rVert_1$ is equivalent to $\lVert x \rVert_2^2 \le \lVert x \rVert_1^2$ (norms are non-negative) which can be shown in an elementary way:

$$\lVert x \rVert_2^2 = \sum_i \lvert x_i\rvert^2 \le \left( \sum_i \lvert x_i \rvert \right)^2 = \lVert x \rVert_1^2$$

This follows by expanding the square: $\left(\sum_i |x_i| \right)^2 = \sum_i |x_i|^2 + \sum_{i \neq j} |x_i| |x_j|$, where all cross-terms $|x_i| |x_j| \ge 0$.


Intuition for bounds on 2-norm: if $x$ has one component $x_i$ much larger (in magnitude) than the rest, the other components become negligible, so $\lVert x \rVert_2 \approx \sqrt{x_i^2} = |x_i| \approx \lVert x \rVert_1$.

On the other hand, if the components of $x$ are about equal (in magnitude), $\lVert x \rVert_2 \approx \sqrt{n x_i^2} = \sqrt n \lvert x_i \rvert$ while $\lVert x \rVert_1 \approx n \lvert x_i \rvert$, so $\lVert x \rVert_1 \approx \sqrt n \lVert x \rVert_2 $.
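Both extreme cases can be seen numerically; a short sketch (assuming NumPy, not part of the original answer):

```python
import numpy as np

# One dominant component: ||x||_1 / ||x||_2 is close to the lower bound 1.
x = np.array([1e6, 1.0, 1.0, 1.0])
assert np.abs(x).sum() / np.linalg.norm(x) < 1.001

# All components equal: the ratio hits the upper bound sqrt(n) exactly.
n = 16
y = np.ones(n)
assert np.isclose(np.abs(y).sum() / np.linalg.norm(y), np.sqrt(n))
```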


In general, by Hölder's inequality, for $1 \le p \le q$, $$\lVert x \rVert_q \le \lVert x \rVert_p \le n^{1/p - 1/q} \lVert x \rVert_q $$
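The general $p$–$q$ bounds can likewise be checked numerically; a sketch (assuming NumPy, not part of the original answer), with a hypothetical helper `p_norm` computing $\|x\|_p$:

```python
import numpy as np

def p_norm(x, p):
    # ||x||_p = (sum |x_i|^p)^(1/p), for p >= 1.
    return np.sum(np.abs(x) ** p) ** (1.0 / p)

# Check ||x||_q <= ||x||_p <= n^(1/p - 1/q) * ||x||_q for 1 <= p <= q.
rng = np.random.default_rng(4)
for _ in range(500):
    n = int(rng.integers(1, 21))
    x = rng.standard_normal(n)
    p, q = sorted(rng.uniform(1, 5, size=2))
    assert p_norm(x, q) <= p_norm(x, p) + 1e-9
    assert p_norm(x, p) <= n ** (1.0 / p - 1.0 / q) * p_norm(x, q) + 1e-9
```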

See Inequalities in $l_p$ norm

qwr
  • 10,716
0

Hint: Bound $\left \lVert x\right \rVert_2$ by $\left \lVert x'\right \rVert_2$ where $x'$ is the vector with all its components equal to the maximum of the components of $x$.