
This question is related to Minimize variance of a linear function, but with an additional constraint.


Given a matrix $A \in \mathbb{R}^{n\times m}$ and a vector $\vec{b} \in \mathbb{R}^{n}$, let $ \vec{m}(\vec{x})=A\vec{x}-\vec{b}$ for $\vec{x} \in \mathbb{R}^{m}$. I want to solve

$$ \min_{\vec{x}} \operatorname {var}[ \vec{m}(\vec{x}) ] \quad \mbox{s.t.} \quad x_i \geq 0 \quad\forall i, \tag{1} $$

where the variance can be written as

$$\operatorname {var} (\vec{m})={\frac {1}{n}}\sum _{i=1}^{n}(m_{i}-\mu )^{2}$$

with $\mu =\frac1n \sum _{i=1}^{n}m_{i}$. Right now I am using the following approach with Matlab's lsqnonneg:

$$ \min_{\vec{x}} \| A\vec{x}-\vec{b} +\delta \cdot \mathbf{1}\|_{2}^{2} \quad \mbox{s.t.} \quad x_i \geq 0 \quad\forall i$$

where I loop through different values of $\delta$ to "probe" the unknown mean of $A\vec{x}-\vec{b}$. I then keep the solution with the smallest objective value over all the $\delta$ tested.
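For concreteness, the probing loop can be sketched as follows. This is a minimal Python sketch using `scipy.optimize.nnls` in place of Matlab's lsqnonneg; the random `A`, `b`, and the $\delta$ grid are placeholders, not part of the original question:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n, m = 8, 3
A = rng.standard_normal((n, m))   # placeholder data
b = rng.standard_normal(n)

best_var, best_x = np.inf, None
for delta in np.linspace(-2.0, 2.0, 41):        # grid of offsets to probe
    # solve  min ||A x - b + delta*1||_2^2  s.t.  x >= 0
    x, _ = nnls(A, b - delta * np.ones(n))
    v = np.var(A @ x - b)                       # variance objective (1) at this x
    if v < best_var:
        best_var, best_x = v, x
```

The grid resolution controls how well the unknown mean is "probed", which is exactly why this approach is wasteful: one NNLS solve per grid point.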

How can I make this optimization more efficient? Is there a way to solve equation (1) directly?

kiara

1 Answer


Let $\operatorname{Var} : \Bbb R^n \to \Bbb R_0^+$ be defined by

$$ \operatorname{Var} ({\bf y}) := \frac1n \sum_{i=1}^n \left( y_i - \bar y \right)^2 = \dots = \frac1n \left\| \left( {\bf I}_n - \frac1n {\bf 1}_n {\bf 1}_n^\top \right) {\bf y} \right\|_2^2 = \frac1n \, {\bf y}^\top \left( {\bf I}_n - \frac1n {\bf 1}_n {\bf 1}_n^\top \right) {\bf y}$$

where ${\bf I}_n - \frac1n {\bf 1}_n {\bf 1}_n^\top$ is the (idempotent) projection matrix that projects onto the $(n-1)$-dimensional linear subspace orthogonal to ${\bf 1}_n$. Hence,

$$ \begin{array}{ll} \underset {{\bf x}} {\text{minimize}} & \operatorname{Var} \left( {\bf A} {\bf x} - {\bf b} \right) \\ \text{subject to} & {\bf x} \geq {\bf 0}_n \end{array} $$

is a convex quadratic program that can be solved using, say, Matlab's quadprog.
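In fact, writing ${\bf C} = {\bf I}_n - \frac1n {\bf 1}_n {\bf 1}_n^\top$, the identity above gives $\operatorname{Var}({\bf A}{\bf x}-{\bf b}) = \frac1n \| {\bf C}{\bf A}{\bf x} - {\bf C}{\bf b} \|_2^2$, so problem (1) is itself a nonnegative least-squares problem on *centered* data and can be handed to lsqnonneg directly, with no $\delta$ loop. A minimal Python sketch (using `scipy.optimize.nnls` as a stand-in for lsqnonneg; the random `A` and `b` are placeholders):

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n, m = 10, 4
A = rng.standard_normal((n, m))        # placeholder data
b = rng.standard_normal(n)

C = np.eye(n) - np.ones((n, n)) / n    # centering projection (idempotent)
assert np.allclose(C @ C, C)

# Var(Ax - b) = (1/n) ||C A x - C b||^2, so center A and b and run NNLS once.
x, _ = nnls(C @ A, C @ b)              # min ||CAx - Cb||_2^2  s.t.  x >= 0
var_opt = np.var(A @ x - b)            # variance objective at the minimizer
```

Equivalently, quadprog can be fed the Hessian $\frac2n {\bf A}^\top {\bf C} {\bf A}$ and linear term $-\frac2n {\bf A}^\top {\bf C} {\bf b}$, but the NNLS route avoids forming the quadratic form explicitly.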