I want to implement robust line fitting over a set of $n$ points $(x_i,y_i)$ by means of the Least Absolute Deviation method, which minimizes the sum
$$\sum_{i=1}^n |y_i-a-bx_i|.$$
As described for instance in Numerical Recipes, the optimal slope can be found as a zero of the function
$$s(b):=\sum_{i=1}^n x_i\,\operatorname{sign}\bigl(y_i-bx_i-\operatorname{med}_k(y_k-bx_k)\bigr)$$
where $b$ is the unknown. To find the change of sign, a dichotomic search is, appropriately, recommended. However, the starting search interval as well as the termination accuracy are based on purely statistical arguments (using a $\chi^2$ test).
I was wondering whether there is an analytical way to derive safe bounds for the slope (i.e. values of $b$ for which $s(b)$ is guaranteed positive or negative), as well as a termination criterion (a minimum of $|s(b)|$). (In fact, I don't even know whether the function is monotonic.)
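For concreteness, here is a minimal Python sketch of the procedure above (the function names and the bracketing interval are my own; it assumes the caller supplies an interval $[b_{lo},b_{hi}]$ that brackets the sign change, with $s(b_{lo})\ge 0\ge s(b_{hi})$):

```python
# Minimal sketch (my own naming): evaluate s(b) and locate its sign
# change by dichotomy. Assumes [b_lo, b_hi] brackets the change, with
# s(b_lo) >= 0 >= s(b_hi).
import statistics

def s(b, xs, ys):
    """s(b) = sum_i x_i * sign(y_i - b*x_i - med_k(y_k - b*x_k))."""
    med = statistics.median(y - b * x for x, y in zip(xs, ys))
    def sign(v):
        return (v > 0) - (v < 0)
    return sum(x * sign(y - b * x - med) for x, y in zip(xs, ys))

def bisect_slope(xs, ys, b_lo, b_hi, tol=1e-9):
    """Dichotomic search for the sign change of s on [b_lo, b_hi]."""
    while b_hi - b_lo > tol:
        b_mid = 0.5 * (b_lo + b_hi)
        if s(b_mid, xs, ys) > 0:
            b_lo = b_mid
        else:
            b_hi = b_mid
    return 0.5 * (b_lo + b_hi)
```

(The intercept is then recovered as $a=\operatorname{med}_k(y_k-bx_k)$.)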
Update:
I have gained some insight into the problem.
If you sort the points on increasing $x$ and set $b$ larger than the slopes of all segments between pairs of points, the $y_k-bx_k$ appear in decreasing order, and the median sits at index $n/2$. Then $s(b)$ is the sum of the $n/2$ smallest $x_i$ minus the sum of the $n/2$ largest $x_i$, hence it is certainly negative. And larger values of $b$ make no change.
Similarly, setting $b$ smaller than the smallest slope ensures $s(b)$ positive, and this answers my first question. This works well when the $x_i$ are regularly spaced, or nearly so.
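The safe bracket can be sketched as follows (my own naming; pairs with equal $x$ are skipped since their slope is undefined):

```python
# Sketch of the safe bracket: the extreme pairwise slopes bound the
# region where s(b) can still change sign, so they can serve as
# b_min and b_max for the dichotomic search.
from itertools import combinations

def slope_bounds(xs, ys):
    """Return (b_min, b_max), the smallest and largest pairwise slopes."""
    slopes = [(yj - yi) / (xj - xi)
              for (xi, yi), (xj, yj) in combinations(zip(xs, ys), 2)
              if xj != xi]  # skip pairs with equal x (vertical segments)
    return min(slopes), max(slopes)
```

Note that a pairwise slope is a $\Delta x$-weighted average of the slopes of the adjacent segments between the two points (after sorting on $x$), so the extremes are already attained by adjacent points; a sort plus a scan of adjacent pairs would do in $O(n\log n)$. The all-pairs form above just mirrors the argument in the text.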
However, when some of the $x_i$ are equal or very close, the slopes can take very large values, and a dichotomic process based on the full $[b_{\min},b_{\max}]$ interval may be inefficient. The ordering of the $y_k-bx_k$ only changes when $b$ crosses one of these $n(n-1)/2$ slopes. Hence, the dichotomic process can be based on the (sorted) slopes rather than on the continuous values of $b$.
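A sketch of this discrete variant (my own naming; it probes $s$ at midpoints between consecutive candidate slopes, where $s$ is constant, and returns the candidate at which the sign flips):

```python
# Discrete variant: s(b) can only change when b crosses one of the
# n(n-1)/2 pairwise slopes, so bisect over the sorted candidate list
# instead of over a continuous interval.
import statistics
from itertools import combinations

def s(b, xs, ys):
    """s(b) = sum_i x_i * sign(y_i - b*x_i - med_k(y_k - b*x_k))."""
    med = statistics.median(y - b * x for x, y in zip(xs, ys))
    def sign(v):
        return (v > 0) - (v < 0)
    return sum(x * sign(y - b * x - med) for x, y in zip(xs, ys))

def lad_slope_discrete(xs, ys):
    """Return the candidate slope at which s changes sign."""
    cand = sorted({(yj - yi) / (xj - xi)
                   for (xi, yi), (xj, yj) in combinations(zip(xs, ys), 2)
                   if xj != xi})
    # Probe points interleaving the candidates: pts[k] < cand[k] < pts[k+1].
    # s is constant between consecutive candidates, positive below the
    # smallest candidate and negative above the largest.
    pts = ([cand[0] - 1.0]
           + [0.5 * (u + v) for u, v in zip(cand, cand[1:])]
           + [cand[-1] + 1.0])
    lo, hi = 0, len(pts) - 1
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if s(pts[mid], xs, ys) > 0:
            lo = mid
        else:
            hi = mid
    return cand[lo]  # the only candidate between pts[lo] and pts[hi]
```

Each probe costs one median computation, so the search itself takes $O(n\log n\cdot\log n)$ once the candidates are built (building them is the $O(n^2)$ part).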