I want to try to use Gauss-Newton to estimate a solution to the regression problem with an $\ell_1$ regularization term $$\min_{x \in \mathbb{R}^n} \|y - Ax\|_2^2 + \lambda\|x\|_1.$$
To do this, I have to rewrite this minimization as a non-linear least-squares problem $\min_{x \in \mathbb{R}^n} \|r(x)\|_2^2$, where $r: \mathbb{R}^n \to \mathbb{R}^m$. I'm having some trouble defining the components $r_i$ so that this works out.
I want $r_i$ such that $r_i(x)^2 = (y_i - a_i^\top x)^2 + \frac{\lambda}{m}\|x\|_1$, but finding a residual whose square produces the $\ell_1$ norm of $x$ is proving difficult.
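For concreteness, the direct choice would be $$r_i(x) = \sqrt{(y_i - a_i^\top x)^2 + \frac{\lambda}{m}\|x\|_1},$$ but $\|x\|_1$ is non-differentiable wherever some coordinate $x_i = 0$, so the Jacobian that Gauss-Newton needs is undefined exactly at the sparse points the $\ell_1$ penalty is supposed to promote.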
I'm considering rewriting $$\|x\|_1 = \|B(x)x\|_2^2$$ where $B(x)$ is a diagonal matrix with $B(x)_{ii} = \frac{1}{\sqrt{|x_i|}}$ (the absolute value is needed for the identity to hold when $x_i < 0$, and $B$ is undefined wherever $x_i = 0$), so the problem becomes
$$\min_{x \in \mathbb{R}^n} \|y - Ax\|_2^2 + \lambda\|B(x)x\|_2^2,$$ which can be combined into a single stacked least-squares residual, giving a fairly ugly expression for $r_i$. Does anyone have a good sense of whether Gauss-Newton will be easy to apply here?
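To make the question concrete, here is a rough sketch (my own, not a vetted implementation) of the iteration I have in mind. I add an assumed smoothing constant `eps` to keep $B$ finite at $x_i = 0$, and I freeze $B(x)$ at the current iterate so that each Gauss-Newton step reduces to a linear least-squares solve; as far as I can tell this makes the scheme essentially iteratively reweighted least squares (IRLS) for the lasso:

```python
import numpy as np

def gauss_newton_lasso(A, y, lam, n_iter=50, eps=1e-8):
    """Sketch of Gauss-Newton / IRLS for min ||y - Ax||^2 + lam*||x||_1.

    Stacks the residual r(x) = [y - Ax; sqrt(lam) * B(x) x] with
    B(x)_ii = 1 / sqrt(|x_i| + eps). Freezing B at the current
    iterate makes r linear in x, so each step is one least-squares
    solve. eps (assumed, not from the original derivation) guards
    against division by zero when x_i = 0.
    """
    m, n = A.shape
    x = np.linalg.lstsq(A, y, rcond=None)[0]  # warm start: ordinary LS
    for _ in range(n_iter):
        w = 1.0 / np.sqrt(np.abs(x) + eps)            # diagonal of B(x)
        J = np.vstack([A, np.sqrt(lam) * np.diag(w)])  # stacked Jacobian
        b = np.concatenate([y, np.zeros(n)])           # stacked targets
        # with B frozen, the Gauss-Newton step is min_x ||b - J x||^2
        x = np.linalg.lstsq(J, b, rcond=None)[0]
    return x
```

With $B$ frozen the step never sees the derivative of $1/\sqrt{|x_i|}$ itself, which is what makes me suspect a literal Gauss-Newton Jacobian of the full $r(x)$ would be much uglier than this.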