
From "Proving that quadratic form is convex in (vector, matrix) arguments" we know that $$f(Q,x) = x^T Q x$$ is a convex function jointly in $Q$ and $x$ when $Q\succeq 0$. How can I optimize with respect to this in CVX?

Supposedly, trace_inv is related, but I don't see how to use it to implement a minimization of $f(Q,x)$.

Paul Zhang
  • Post this on StackOverflow – not appropriate for MSE. – Igor Rivin Apr 01 '21 at 22:57
  • The question you linked to concerns the function $f(x,Q) = x^T Q^{-1} x$, with $Q$ restricted to be positive definite. Is the fact that $Q^{-1}$ rather than $Q$ appears in the function not important? (See @MichaelGrant's comment about CVX's trace_inv function in the linked question.) – littleO Apr 04 '21 at 01:36

1 Answer


I assume you want to minimize a convex function. Let $\Bbb S_n^+ (\Bbb R)$ denote the set of $n \times n$ symmetric positive definite matrices. Rephrasing, we have the (jointly convex) matrix fractional function $f : \Bbb S_n^+ (\Bbb R) \times \Bbb R^n \to \Bbb R_0^+$ defined by

$$f \left( {\rm X}, {\rm y} \right) := {\rm y}^\top {\rm X}^{-1} {\rm y}$$

Introducing a new optimization variable $z \in \Bbb R$, minimizing $f$ can be written in epigraph form as minimizing $z$ subject to ${\rm y}^\top {\rm X}^{-1} {\rm y} \leq z$. Since ${\rm X} \succ {\rm O}_n$, the quantity $z - {\rm y}^\top {\rm X}^{-1} {\rm y}$ is exactly the Schur complement of the block ${\rm X}$ in the matrix below, so the inequality constraint can be rewritten as the following linear matrix inequality (LMI)

$$\begin{bmatrix} {\rm X} & {\rm y}\\ {\rm y}^\top & z\end{bmatrix} \succeq {\rm O}_{n+1}$$

which defines a spectrahedron. Thus, your (convex) optimization problem can be rewritten as a (convex) semidefinite program (SDP) that should be easy to solve in CVX.