I'm trying to minimize $\frac{1}{2}||x - d||^2 + \lambda ||x||$ with respect to $x$, where $||\cdot||$ is the $L_2$ norm and $x$ and $d$ are vectors.
I think the answer I should be arriving at is $[1 - \frac{\lambda}{||d||}]_+ d$.
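As a quick numerical sanity check (a sketch using numpy; the objective and the candidate closed form are the ones above, and the function name `prox_l2` is just my label for it), one can verify that $[1 - \frac{\lambda}{||d||}]_+ d$ beats random perturbations of itself in both regimes:

```python
import numpy as np

def objective(x, d, lam):
    # f(x) = 0.5 * ||x - d||^2 + lam * ||x||  (Euclidean norms)
    return 0.5 * np.sum((x - d) ** 2) + lam * np.linalg.norm(x)

def prox_l2(d, lam):
    # Candidate closed form: [1 - lam/||d||]_+ * d
    nd = np.linalg.norm(d)
    return max(0.0, 1.0 - lam / nd) * d if nd > 0 else np.zeros_like(d)

rng = np.random.default_rng(0)
for d, lam in [(np.array([3.0, 4.0]), 2.0),   # ||d|| = 5 > lam: shrunk toward 0
               (np.array([0.3, 0.4]), 2.0)]:  # ||d|| <= lam: exactly 0
    x_hat = prox_l2(d, lam)
    f_hat = objective(x_hat, d, lam)
    # Since the objective is strongly convex, the true minimizer beats
    # every other point; check against random perturbations.
    assert all(f_hat <= objective(x_hat + 0.5 * rng.standard_normal(2), d, lam)
               for _ in range(1000))
```

This doesn't prove the formula, of course, but it is a cheap way to gain confidence in it before working through the subgradient argument.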
EDIT: In an attempt to answer my own question after reading up on subgradients: the optimality condition is $$0 \in x - d + \lambda\, \partial ||x||,$$ where $\partial$ denotes the subdifferential. This branches into two scenarios:
1) If $x = 0$, then $\partial ||x|| = \{g : ||g|| \leq 1\}$, so the optimality condition becomes $$0 \in -d + \lambda \{g : ||g|| \leq 1\}.$$ This says $d = \lambda g$ for some $g$ with $||g|| \leq 1$, which is possible precisely when $||d|| \leq \lambda$. Thus, the minimizer is $\hat{x} = 0$ when $||d|| \leq \lambda$.
2) If $x \neq 0$, then the norm is differentiable with $\partial ||x|| = \{\frac{x}{||x||}\}$, so the optimality condition becomes $$0 = x - d + \lambda \frac{x}{||x||},$$ which implies $x = d - \lambda \frac{x}{||x||}$. The next step is $$x = d - \lambda \frac{x}{||x||} \iff \hat{x} = d - \lambda \frac{d}{||d||}. \tag{*}$$ How does one arrive at and intuit step $(*)$? I can verify that it is true, but I do not know how I would have derived it had I not already known the answer. I can finish the rest, but I would really appreciate help with step $(*)$! Thanks in advance.
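(One possible route to $(*)$ that I have seen sketched elsewhere, starting only from the fixed-point equation above: collect the $x$ terms,
$$\left(1 + \frac{\lambda}{||x||}\right) x = d,$$
so $x$ is a positive scalar multiple of $d$. Writing $x = c\,d$ with $c > 0$ gives $||x|| = c\,||d||$ and hence $\frac{x}{||x||} = \frac{d}{||d||}$. Substituting this back into $x = d - \lambda \frac{x}{||x||}$ yields
$$c\,d = d - \lambda \frac{d}{||d||} \implies c = 1 - \frac{\lambda}{||d||},$$
which is $(*)$; note $c > 0$ is consistent with this branch, since it requires $||d|| > \lambda$. I am not certain this is the intended derivation, so corrections are welcome.)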