I have a sequence of $N$ strictly positive real values $y_n$, measured at known positions $x_n$. They form some kind of peak; for simplicity, let's assume the shape is $f(x; A, \mu) = A \exp\left(-(x-\mu)^2\right)$, with $A$ and $\mu$ real (in the end, I want it to work for a peak of fixed but arbitrary shape though). There is some additive Gaussian-distributed noise in the $y_n$, i.e. they don't fit $f$ exactly. My objective is to determine $A$ and $\mu$ according to the optimum given by the least-squares method, i.e. the values minimizing $\chi^2 = \sum_{n=1}^N \left(y_n - f(x_n; A, \mu)\right)^2$.
Of course, the intuitive approach is to use e.g. the Levenberg–Marquardt algorithm and determine those parameters iteratively.
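For reference, a minimal sketch of that iterative route (using `scipy.optimize.curve_fit`, which defaults to Levenberg–Marquardt for unconstrained problems; the data here is synthetic just to make the snippet self-contained):

```python
import numpy as np
from scipy.optimize import curve_fit

def f(x, A, mu):
    # Peak model: A * exp(-(x - mu)^2)
    return A * np.exp(-(x - mu) ** 2)

# Synthetic stand-in for the real y_n at positions x_n.
x = np.linspace(-5.0, 5.0, 101)
rng = np.random.default_rng(0)
y = f(x, 2.0, 0.7) + rng.normal(scale=0.1, size=x.size)

# Rough initial guess, then iterative (Levenberg-Marquardt) refinement.
p0 = (y.max(), x[np.argmax(y)])
(A_hat, mu_hat), pcov = curve_fit(f, x, y, p0=p0)
```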
However, for some classes of problems (esp. those linear in their parameters) explicit solutions do exist. I am wondering whether it is possible to find an explicit solution for this kind of problem as well.
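(To be concrete about the partial linearity: for fixed $\mu$ the model is linear in $A$, so the optimal amplitude has the closed form $\hat A(\mu) = \frac{\sum_n y_n\, e^{-(x_n-\mu)^2}}{\sum_n e^{-2(x_n-\mu)^2}}$, which reduces the problem to a one-dimensional minimization over $\mu$ alone; it is the nonlinear dependence on $\mu$ that I don't see how to solve explicitly.)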
What I tried so far is this: you can calculate the cumulative sum of the $y_n$. The resulting functional relation $g(x_n) = \sum_{j=1}^n y_j$ is strictly monotonic (the $y_n$ are positive), and thus can be inverted. In the inverted data $g^{-1}$, a change in $\mu$ now shows up simply as a constant offset, and for estimating a constant offset an explicit solution of the least-squares problem is easily derived. This works, but it has ugly properties in practice (e.g. the additive noise causes a systematic bias in the resulting parameter estimates).
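Roughly, this is what I mean (a simplified sketch, not my actual code; it estimates only $\mu$, normalizes the cumulative sum so that $A$ drops out, and does the inversion by linear interpolation):

```python
import numpy as np
from scipy.special import erfinv

def estimate_mu(x, y, levels=np.linspace(0.05, 0.95, 19)):
    # Normalised cumulative sum g (monotonic, since the y_n are positive).
    c = np.cumsum(y)
    c = c / c[-1]
    # g^{-1} at the chosen levels, via interpolation of the data.
    x_data = np.interp(levels, c, x)
    # Inverse of the normalised cumulative of exp(-u^2), i.e. (1 + erf(u)) / 2,
    # which plays the role of the template with mu = 0.
    x_template = erfinv(2.0 * levels - 1.0)
    # A shift in mu appears as a constant offset; its least-squares
    # estimate is just the mean difference.
    return np.mean(x_data - x_template)
```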
Any ideas how one could do better?