2

I am self studying Functional Analysis and came across this proposition.

Suppose $f$ is a continuous linear functional defined on a normed linear space $V$. Then

$$|f(x)| \le \| x\|\| f\|$$

I don't get how this inequality is derived. I know $f : V \to \mathbb{R}$ is a continuous linear map.

So, we have $\| f(x)\| \le \| f\|\| x\|$

Now, are we assuming the norm on $\mathbb{R}$ to be the usual $l^1$ norm in this case?
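
For reference, a sketch of the derivation, assuming the definition $\|f\| := \sup\{\,|f(x)| : \|x\| = 1\,\}$ mentioned in the comments below: for $x \neq 0$,

$$ |f(x)| = \|x\|\,\left| f\!\left(\frac{x}{\|x\|}\right) \right| \le \|x\| \sup_{\|u\| = 1} |f(u)| = \|x\|\,\|f\|, $$

and for $x = 0$ both sides vanish.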

Asinomás
  • 105,651
  • I don't know if it helps, but every norm in $\mathbb{R}$ is of the form $||x||=\lambda|x|$, where $\lambda>0.$ – Victor Jul 07 '21 at 15:14
  • For a continuous linear map $f : V \rightarrow \mathbb R$, we usually use the standard topology on $\mathbb R$ given by the euclidean norm $||x||_2 = |x|$. In fact, it does not matter which norm you use on $\mathbb R$, they are all equivalent (see https://math.stackexchange.com/questions/2890009/all-norms-of-mathbb-rn-are-equivalent ). – Desura Jul 07 '21 at 15:19
  • @Desura: So, this means that for any finite dimensional space, the choice of norm is irrelevant, right? As they are equivalent to each other. – night_crawler Jul 07 '21 at 15:32
  • Could you define $|f|$? Or do you mean $|f| := \sup\{\, |f(x)| : |x| = 1 \,\}$? – Physor Jul 07 '21 at 15:36
  • @Physor: Yes, this is the definition I am using. – night_crawler Jul 07 '21 at 15:37
  • @night_crawler Yes you are right, for finite dimensional normed spaces, the choice of norm does not matter. – Desura Jul 07 '21 at 16:44
  • That's practically the definition of a bounded linear functional. Even for infinite-dimensional spaces it doesn't matter. – Alan Jul 07 '21 at 18:11

3 Answers

0

Yes, I think you're right that we're assuming the standard norm on $\mathbb{R}$. You could also check the definition of the norm of a linear functional given in your text; it may explicitly use the absolute value.
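
For concreteness, one common way that definition is written (a sketch, matching the definition the asker confirms in the comments) is

$$ \|f\| := \sup_{\|x\| = 1} |f(x)| = \sup_{x \neq 0} \frac{|f(x)|}{\|x\|}, $$

where the two suprema agree by homogeneity (assuming $V \neq \{0\}$), and the second form gives $|f(x)| \le \|f\|\,\|x\|$ immediately.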

Rioghasarig
  • 1,789
0

Adding to the answer of Physor, which proves the inequality for all norms on $\mathbb{R}$ (in particular for the usual one), here is an answer to your second question:

Yes, he is assuming that the norm on $\mathbb{R}$ is the usual one. The reason is that $\mathbb{R}$ is a finite dimensional vector space, so all norms on it are equivalent; moreover, every norm on $\mathbb{R}$ has the form $\lambda |x|$, where $\lambda>0$ is a fixed real number. So the inequality you are considering changes only by multiplication by a constant when you change the norm on $\mathbb{R}$, which does not matter in terms of convergence. So, yes, there is no loss of generality in considering only the usual norm in this case.
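
A short justification of the claim that every norm on $\mathbb{R}$ has this form (a sketch using only the absolute homogeneity of the norm): writing $x = x \cdot 1$,

$$ \|x\| = \|x \cdot 1\| = |x|\,\|1\|, \qquad \text{so } \lambda = \|1\| > 0. $$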

Victor
  • 289
-2

This is a direct proof:

Let $f:V \to W$ be a continuous linear map between normed spaces $V,W$, i.e. $$ \forall \varepsilon > 0 \;\exists \delta>0 \;\forall x \in V:\ (\|x\| < \delta \implies \|f(x)\| < \varepsilon). $$ Fix $\varepsilon > 0$ and a corresponding $\delta = \delta(\varepsilon)$. For every $x$ with $0 < \|x\| < \delta(\varepsilon)$ we have $$ \|f(x)\| < \varepsilon \implies \frac{\|f(x)\|}{\|x\|} < \frac{\varepsilon}{\|x\|}. $$ Now, by the linearity of $f$ and the homogeneity of the norms, the ratio $\|f(x)\|/\|x\|$ does not change when $x$ is rescaled; so for any $x \neq 0$ we may rescale $x$ so that its norm is any value $t \in (0, \delta(\varepsilon))$, and taking the infimum of the right-hand side over these rescalings gives $$ \frac{\|f(x)\|}{\|x\|} \le \inf_{0 < t < \delta(\varepsilon)} \frac{\varepsilon}{t} = \frac{\varepsilon}{\delta(\varepsilon)}=:\alpha(\varepsilon) < \infty, $$ since $\delta(\varepsilon) > 0$. Thus, since the right-hand side of the last inequality does not depend on $x$, we have proved $$ \exists\alpha \ge 0 \;\forall x \in V:\ \|f(x)\| \le\alpha \|x\|, $$ which is boundedness of $f$. The infimum of all admissible constants $\alpha$ (those with $\|f(x)\| \le \alpha\|x\|$ for all $x$) is the norm $\|f\|$ of $f$.
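
To connect this with the inequality in the question (a sketch, taking $W = \mathbb{R}$ with the absolute value, and using the standard fact that $\|f\|$ equals the smallest admissible constant): since $|f(x)| \le \alpha\,\|x\|$ holds for every admissible $\alpha$ and every $x$, taking the infimum over such $\alpha$ gives

$$ |f(x)| \le \Big( \inf\{\alpha \ge 0 : |f(y)| \le \alpha\|y\| \text{ for all } y \in V\} \Big)\,\|x\| = \|f\|\,\|x\|, $$

which is the inequality $|f(x)| \le \|x\|\,\|f\|$ from the question.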

Physor
  • 4,586