
We know that the $l_2$-norm is generated from an inner product, which can be shown using the parallelogram law:

proof that l2 norm is generated from inner product
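To be explicit, by the parallelogram law I mean the identity
$$\|x+y\|^2 + \|x-y\|^2 = 2\|x\|^2 + 2\|y\|^2,$$
and (in the real case) the inner product is then recovered from the norm via the polarization identity
$$\langle x, y \rangle = \tfrac{1}{4}\left(\|x+y\|^2 - \|x-y\|^2\right).$$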

What about the square of the $l_2$-norm? Is that also generated from an inner product? I've tried to answer this using the parallelogram law, but I'm not sure of my answer; I think it is no.

But in kernel ridge regression we use the $l_2$-norm squared, and according to the representer theorem the $l_2$-norm squared has to correspond to an inner product.
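Concretely, the objective I have in mind is roughly
$$\min_{w}\; \sum_{i=1}^{n} \big(y_i - \langle w, \phi(x_i)\rangle\big)^2 + \lambda\,\|w\|_2^2,$$
where $\phi$ is the feature map and $\lambda\,\|w\|_2^2$ is the squared-$l_2$ penalty I'm asking about.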

Would anyone be kind enough to show me the proof? Many thanks!

fxl
  • Do you mean the "norm" $\|f\|_4 = \left(\int_{\Omega} |f(x)|^4 \, dx\right)^{\frac{1}{4}}$ ? – Cameron Williams Aug 27 '20 at 19:08
  • I mean the $l_2$ norm squared, as defined in the comment section of this thread (sorry, I don't know how to write maths formulas here):

    https://math.stackexchange.com/questions/883016/gradient-of-l2-norm-squared/883024

    – fxl Aug 27 '20 at 19:09
  • Oh! You mean $\|x\|^2$, rather than $\|x\|$? – Cameron Williams Aug 27 '20 at 19:10
  • You mean $\|(x_1,\dots,x_n)\| := x_1^2 + \cdots + x_n^2$? If this is the case, then $\|\cdot\|$ is not a norm since $\|a(x_1,\dots,x_n)\| \neq |a|\,\|(x_1,\dots,x_n)\|$. – azif00 Aug 27 '20 at 19:14
  • Yes; I'm not too sure of the convention: is $\|x\|$ the $l_2$ or $l_1$ norm? If $\|x\|$ is the $l_2$ norm, then yes, that is what I meant. I'm used to seeing a subscript 2 in the $l_2$ norm equation. – fxl Aug 27 '20 at 19:15
  • @fxl Forget about indices; that is the least important thing. What really matters is whether you mean the "norm" where you add the squares of the entries of the vector. – azif00 Aug 27 '20 at 19:17
  • Yes, azifmedrano, you have the equation correct. – fxl Aug 27 '20 at 19:18

1 Answer


If $V$ is a real vector space, a norm on $V$ is a function $\|\cdot\| : V \to \mathbb R$, where $v \mapsto \|v\|$, such that the following properties hold:

  • For any $v \in V$, $\|v\| \geq 0$; and $\|v\| = 0$ only if $v$ is the zero vector.
  • For any $v \in V$ and $a \in \mathbb R$, $\|av\| = |a|\|v\|$.
  • For every pair of vectors $v$ and $w$ in $V$, $\|v+w\| \leq \|v\|+\|w\|$.

Then, if $V = \mathbb R^n$ and we define $\|(x_1,\dots,x_n)\|$ by $x_1^2 + \cdots + x_n^2$, $\|\cdot\|$ is not a norm on $V$, since the second property doesn't hold for every $a \in \mathbb R$: here $\|a(x_1,\dots,x_n)\| = a^2\,\|(x_1,\dots,x_n)\|$, which agrees with $|a|\,\|(x_1,\dots,x_n)\|$ only when $a = 0$ or $a = \pm 1$.
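For instance, taking $a = 2$ and $v = (1,0,\dots,0)$:
$$\|2v\| = 2^2 + 0 + \cdots + 0 = 4, \qquad |2|\,\|v\| = 2\,(1^2 + 0 + \cdots + 0) = 2,$$
so $\|2v\| \neq |2|\,\|v\|$.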

azif00
  • Thanks for this, azifmedrano. So it does not even make sense to think of the $l_2$ norm squared as a norm in the first place... you are just squaring the norm. I get it now and had read the theorem wrongly. Thanks for the definition, it helps a lot. – fxl Aug 27 '20 at 19:46