Given the matrix $A$ and the vector $c$, I would like to minimize the variance of $Ax+c$, i.e.,
$$x = \arg \min_x \operatorname{var} ( A x + c )$$
How would you solve this?
Let $y = [1/N; 1/N; \dots; 1/N]$, so that $y^{\mathrm{T}}\left(Ax+c\right)$ is the mean of the entries of $Ax+c$. The variance is then proportional to $\Vert (Ax+c) - \vec{1}\,y^{\mathrm{T}}\left(Ax+c\right) \Vert_2^2$, where $\vec{1}$ is the vector of all 1's. Differentiate this with respect to $x$, set the gradient equal to zero, and solve for $x$.
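A quick NumPy sketch of this derivative-based approach (the random $A$ and $c$ are stand-in inputs, since the actual data isn't specified): subtracting the mean is multiplication by the centering matrix $P = I - \frac{1}{N}\vec{1}\vec{1}^{\mathrm{T}}$, so setting the gradient of $\Vert P(Ax+c)\Vert_2^2$ to zero is the least-squares problem $(PA)x = -Pc$.

```python
import numpy as np

# Random stand-ins for the actual inputs A and c (assumptions).
rng = np.random.default_rng(0)
N = 5
A = rng.random((N, N))
c = rng.random(N)

# P = I - (1/N) * 1 1^T subtracts the mean from a vector, so
# var(Ax + c) is proportional to ||P(Ax + c)||^2.
P = np.eye(N) - np.ones((N, N)) / N

# Setting the gradient of ||P A x + P c||^2 to zero gives the
# least-squares problem (P A) x = -P c; lstsq returns the
# minimum-norm solution, since P A is rank-deficient.
x, *_ = np.linalg.lstsq(P @ A, -P @ c, rcond=None)

print(np.var(A @ x + c))  # a value near machine zero
```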
Edit:
Here's another possible way that just uses the usual least-squares approach. Let $\vec{1}\in\mathbb{R}^N$ be the vector of all 1's and $y\in\mathbb{R}^N$ be as above. Then we want the LS approximation to
$$ Ax = \vec{1}y^{\mathrm{T}}(Ax+c) - c $$
Using the normal equations, this works out to
$$ \begin{array}{c} \left(A^{\mathrm{T}}A - A^{\mathrm{T}}\vec{1}y^{\mathrm{T}}A\right)x = A^{\mathrm{T}} \, \vec{1}y^{\mathrm{T}} c - A^{\mathrm{T}}c \\ \Rightarrow x = \left(A^{\mathrm{T}}A - A^{\mathrm{T}}\vec{1}y^{\mathrm{T}}A\right)^{-1} \left( A^{\mathrm{T}} \,\vec{1}y^{\mathrm{T}} c - A^{\mathrm{T}}c \right) \end{array} $$
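For completeness, here is a NumPy sketch of this closed form (again with random placeholders for $A$ and $c$; the pseudo-inverse handles the rank deficiency of the matrix being inverted):

```python
import numpy as np

# Random stand-ins for the actual inputs A and c (assumptions).
rng = np.random.default_rng(1)
N = 5
A = rng.random((N, N))
c = rng.random(N)

ones = np.ones(N)   # the vector 1 (all ones)
y = ones / N        # the mean-computing vector

# Q = A^T A - A^T 1 y^T A is rank-deficient, so pinv gives the
# minimum-norm solution of Q x = A^T 1 (y^T c) - A^T c.
Q = A.T @ A - np.outer(A.T @ ones, y) @ A
b = (A.T @ ones) * (y @ c) - A.T @ c
x = np.linalg.pinv(Q) @ b

print(np.var(A @ x + c))  # a value near machine zero
```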
You can try each of these; if both are correct, they will drive the variance to the same minimal value.
To test the above, I used the following in Matlab:
A = rand(5,5); % Random input.
c = rand(5,1); % Random input.
y = 1/5 * ones(5,1); % Mean-computing vector.
% Q is the matrix we need to invert. It is not full
% rank, so we use the pseudo-inverse to get the minimum
% norm solution, though any solution is equally correct.
Q = ( A' * A - A' * ones(5,1) * y' * A );
x = pinv(Q) * ( A' * ones(5,1) * y' * c - A' * c );
% See what variance this answer gives.
var( A*x + c )
ans =
1.2942e-31
% Since Q has rank N-1, there are infinitely many
% solutions along a line in R^N. The nullspace has
% dimension 1, so null(Q) returns a single vector.
n = null(Q);
var( A * (x + 100*rand(1)*n) + c )
ans =
2.4297e-28