
Let $v \in \Bbb R^n$, and let $f:\Bbb R^n \to \Bbb R$ be given by $f(x)=\langle x,(vv^T)x\rangle$. Show that $f$ is convex.

I'm looking for different approaches to this (rather simple) problem. All tools are allowed. Here is one solution:

The Hessian of $f$ is constant: $H_f(x)=2vv^T$ for every $x$. Then $\langle y,H_f(x)y\rangle= 2 (v^Ty)^2 \geq 0$ for every $y \in \Bbb R^n$, so $H_f(x)$ is positive semi-definite, and by the second-order condition for convexity it follows that $f$ is convex.
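Not part of the proof, but the convexity inequality can also be checked numerically as a quick sanity check, using the fact that $f(x)=\langle x,(vv^T)x\rangle=(v^Tx)^2$. This is a plain-Python sketch; the dimension, sample count, and tolerance are arbitrary illustrative choices:

```python
import random

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def f(x, v):
    # <x, (v v^T) x> collapses to (v^T x)^2
    return dot(v, x) ** 2

random.seed(0)
n = 5
v = [random.uniform(-1.0, 1.0) for _ in range(n)]
for _ in range(1000):
    x = [random.uniform(-10.0, 10.0) for _ in range(n)]
    y = [random.uniform(-10.0, 10.0) for _ in range(n)]
    t = random.random()
    z = [t * xi + (1 - t) * yi for xi, yi in zip(x, y)]
    # convexity: f(tx + (1-t)y) <= t f(x) + (1-t) f(y), up to float error
    assert f(z, v) <= t * f(x, v) + (1 - t) * f(y, v) + 1e-9
print("convexity inequality held at all sampled points")
```

Of course this only tests finitely many points; the Hessian argument above is what actually proves the claim.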

Hello

1 Answer


Here is a more straightforward way:

Let $x,y \in \Bbb R^n$, write $U = \operatorname{span}(v)$, and let $U^{\perp}$ be the orthogonal complement of $U$, so that $\Bbb R^n = U \oplus U^{\perp}$. Then there exist $\alpha,\beta \in \Bbb R$ and $x^{\perp},y^{\perp} \in U^{\perp}$ such that $x = \alpha v+x^{\perp}$ and $y=\beta v +y^{\perp}$. Now let $t \in [0,1]$; then $tx+(1-t)y=(t\alpha+(1-t)\beta)v+u^{\perp}$, where $u^{\perp} = t x^{\perp}+(1-t) y^{\perp} \in U^{\perp}$. Note that $f(z)=\langle z,v\rangle^2$ for any $z\in \Bbb R^n$, so $f(\gamma v + z^{\perp}) = \gamma^2\|v\|^4$ whenever $z^{\perp} \in U^{\perp}$. Hence $$ tf(x)+(1-t)f(y)-f(tx+(1-t)y) = \|v\|^4\big(t\alpha^2+(1-t)\beta^2-(t\alpha+(1-t)\beta)^2\big) \\ = \|v\|^4\, t(1-t)(\alpha-\beta)^2 \geq 0.$$ It follows that $f$ is convex.
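As a sanity check (again, not part of the proof), the gap $tf(x)+(1-t)f(y)-f(tx+(1-t)y)$ can be compared against the closed form $\|v\|^4\,t(1-t)(\alpha-\beta)^2$, which follows from $f(\gamma v+z^{\perp})=\gamma^2\|v\|^4$. The concrete $v$, $\alpha$, $\beta$, $t$, and orthogonal parts below are arbitrary illustrative choices:

```python
import math

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def f(x, v):
    return dot(v, x) ** 2  # f(z) = <z, v>^2

v = [3.0, 0.0, 4.0]            # ||v|| = 5
alpha, beta, t = 2.0, -1.0, 0.3
x_perp = [0.0, 7.0, 0.0]       # dot(x_perp, v) = 0
y_perp = [-4.0, 1.0, 3.0]      # dot(y_perp, v) = -12 + 12 = 0
x = [alpha * vi + p for vi, p in zip(v, x_perp)]
y = [beta * vi + p for vi, p in zip(v, y_perp)]
z = [t * xi + (1 - t) * yi for xi, yi in zip(x, y)]

gap = t * f(x, v) + (1 - t) * f(y, v) - f(z, v)
closed_form = math.sqrt(dot(v, v)) ** 4 * t * (1 - t) * (alpha - beta) ** 2
assert abs(gap - closed_form) < 1e-9
print("gap matches the closed form")
```

The orthogonal parts drop out of every term, which is exactly why the gap depends only on $\alpha$, $\beta$, $t$, and $\|v\|$.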

Surb