I want to minimize the following convex function over $\delta$:
$$g(\delta) = \sum_i \sum_j | \Sigma_{ij} - S_{ij} - \delta v_iv_j |$$
Here all variables $\Sigma_{ij}, S_{ij}, v_i, v_j$ are constants. How can I do so?
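As a sanity check, $g$ is easy to evaluate numerically; a minimal sketch with made-up stand-ins for the given constants $\Sigma$, $S$, $v$ (each term $|\Sigma_{ij} - S_{ij} - \delta v_i v_j|$ is the absolute value of an affine function of $\delta$, so $g$ is convex and piecewise linear):

```python
import numpy as np

# Made-up stand-ins for the given constants Sigma, S, v.
rng = np.random.default_rng(0)
Sigma = rng.normal(size=(3, 3))
S = rng.normal(size=(3, 3))
v = rng.normal(size=3)

def g(delta):
    """g(delta) = sum_ij |Sigma_ij - S_ij - delta * v_i * v_j|."""
    return np.abs(Sigma - S - delta * np.outer(v, v)).sum()

# Sum of absolute values of affine functions of delta:
# convex and piecewise linear in delta.
```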
**Attempt**
I found a similar problem whose second answer suggests using $\frac{d|x|}{dx} = \operatorname{sign}(x)$ (valid for $x \neq 0$), so I tried setting the subgradient equal to zero:
$$\frac{\partial}{\partial \delta} g(\delta) = -\sum_i \sum_j \operatorname{sign}(\Sigma_{ij} - S_{ij} - \delta v_iv_j)\, v_iv_j = 0$$
However, I don't know how to solve for $\delta$.
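Not from the question, just a numerical sketch of one possible route: since $g$ is convex and piecewise linear in $\delta$, and each term's sign flips at $\delta = (\Sigma_{ij}-S_{ij})/(v_i v_j)$ (when $v_i v_j \neq 0$), the minimum is attained at one of these finitely many breakpoints, so enumerating them works; made-up data stands in for the given constants:

```python
import numpy as np

# Made-up stand-ins for the given constants Sigma, S, v.
rng = np.random.default_rng(1)
Sigma = rng.normal(size=(4, 4))
S = rng.normal(size=(4, 4))
v = rng.normal(size=4)

A = Sigma - S        # a_ij = Sigma_ij - S_ij
W = np.outer(v, v)   # w_ij = v_i * v_j

def g(delta):
    """g(delta) = sum_ij |a_ij - delta * w_ij|, convex and piecewise linear."""
    return np.abs(A - delta * W).sum()

# The sign of each term flips at delta = a_ij / w_ij, so the minimum of
# this coercive, convex, piecewise-linear function sits at a breakpoint.
mask = W != 0
candidates = (A[mask] / W[mask]).ravel()
delta_star = min(candidates, key=g)
```

(If every $v_i v_j$ were zero, $g$ would be constant and any $\delta$ would do.) This brute-force enumeration is $O(n^2)$ candidates times an $O(n^2)$ evaluation; it is only meant to illustrate where the minimizer lives, not to be the efficient method.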