
I need to find the line with minimal distance to all points. I found linear regression and linear interpolation algorithms, but they measure the distance only along the y-axis: $D_i = y_i - f(x_i)$.

Instead, I need to find $a, b, c$ for the line $ax + by + c = 0$, where the distance is the perpendicular one: $D_i = \dfrac{|ax_i + by_i + c|}{\sqrt{a^2+b^2}}$

Is there any way or algorithm to solve this problem?

kravemir

1 Answer


What you want to do is called total least squares or orthogonal regression. Netlib has a bunch of routines for doing this, and a bit of searching turns up routines for other systems, e.g. MATLAB.
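For what it's worth, here is a minimal sketch (my addition, not part of the routines linked above) of a total least squares line fit via the SVD in NumPy; the function name `fit_line_tls` and the synthetic data are only illustrative.

```python
import numpy as np

def fit_line_tls(x, y):
    """Fit a line a*x + b*y + c = 0 minimizing the sum of squared
    perpendicular (orthogonal) distances to the points."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)

    # Center the data: the best-fit line passes through the centroid.
    xm, ym = x.mean(), y.mean()
    M = np.column_stack((x - xm, y - ym))

    # SVD of the centered data: the right singular vector belonging to the
    # smallest singular value is the line's unit normal (a, b).
    _, _, Vt = np.linalg.svd(M)
    a, b = Vt[-1]          # already unit length, so a^2 + b^2 = 1

    # Choose c so the line passes through the centroid.
    c = -(a * xm + b * ym)
    return a, b, c

# Example usage on noisy points near y = 2x + 1:
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2 * x + 1 + rng.normal(scale=0.1, size=x.size)
a, b, c = fit_line_tls(x, y)
print(a, b, c)             # roughly proportional to (2, -1, 1), up to sign
```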

  • Thanks :) This seems to be what I want to do. – kravemir Sep 23 '11 at 14:48
  • Someone posted a comment about http://en.wikipedia.org/wiki/Principal_component_analysis. Is there any difference from your answer? – kravemir Sep 23 '11 at 14:50
  • Certainly, one in fact uses SVD (the machinery behind PCA) for total least squares. See this for instance. – J. M. ain't a mathematician Sep 23 '11 at 14:58
  • @Miro: I posted that comment, but deleted it after I saw J.M.'s answer. J.M.: Boy, doesn't SVD on covariance-like matrices turn up everywhere? From PCA to principal axes of inertia to ellipsoids to this question. I posted the first related thing that came to my head, but it's nice to know it has a specific name in this particular application. –  Sep 23 '11 at 15:07
  • @Rahul: SVD is just too useful for data analysis, methinks. ;) – J. M. ain't a mathematician Sep 23 '11 at 15:09
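As a rough sketch of why the SVD/PCA machinery mentioned in the comments applies here (my own summary, not from the thread): for a fixed unit normal $(a,b)$ the optimal intercept is $c = -(a\bar{x} + b\bar{y})$, so with the constraint $a^2 + b^2 = 1$ the objective becomes
$$\sum_i D_i^2 \;=\; \sum_i \bigl(a(x_i-\bar{x}) + b(y_i-\bar{y})\bigr)^2 \;=\; \|Mn\|^2, \qquad n = (a,b)^{\mathsf T},$$
where $M$ is the matrix of centered coordinates. Minimizing $\|Mn\|^2$ over unit vectors $n$ picks the right singular vector of $M$ for its smallest singular value, i.e. the minor principal axis of the point cloud.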