
Given a point and a line segment, how do you calculate the barycentric coordinates of the point with respect to the segment?

bool BarycentricCoordinates(const Vector3& point, const Vector3& a, const Vector3& b, float& u, float& v, float epsilon)

Alan Wolfe
MOJOMO
  • Needs more clarification. Are a and b intended to be points defining the line? How are the barycentric coordinates defined relative to those points? Is point assumed to be on the line, or might it be somewhere off the line? What is the intended role of epsilon? – Nathan Reed Jan 29 '17 at 00:04
  • Yes, a and b are intended to be points defining the line. The point can be on the line or somewhere off it, and epsilon is the error range. – MOJOMO Jan 29 '17 at 00:25
  • Would you expect the barycentric coordinates of the point to be the same if the point is projected to the line? – PeteUK Jan 29 '17 at 17:08
  • Yes, the barycentric coordinates of the point should be the same if the point is projected onto the line. – MOJOMO Jan 29 '17 at 20:12
  • Barycentric coordinates are used for shapes like triangles or more complex shapes. Not sure how you can define them with respect to a line? It doesn't make a lot of sense to me. It would be good if you could edit your question and explain in more detail what you are looking for (what's the problem at hand). – user18490 Jan 29 '17 at 21:26
  • Barycentric coordinates apply to all SIMPLICES (plural of simplex). A line is a 1 dimensional simplex. – Alan Wolfe Jan 30 '17 at 19:40

1 Answer


If you have a point $C$ on a line defined by a point $A$ and a point $B$, here is how you calculate the barycentric coordinates $(t_0,t_1)$:

$t_0 = \frac{\sqrt{(C_x-A_x)^2+(C_y-A_y)^2}}{\sqrt{(B_x-A_x)^2+(B_y-A_y)^2}}$

$t_1 = 1.0 - t_0$

In other words, you calculate the length of the vector from $A$ to $C$, and divide that by the length of the vector from $A$ to $B$. This gives you $t_0$, and you can subtract that from 1.0 to get $t_1$ since barycentric coordinates always add up to 1.0.
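
As a quick sanity check with concrete numbers: if $A=(0,0)$, $B=(4,0)$ and $C=(1,0)$, then $t_0 = \frac{1}{4} = 0.25$ and $t_1 = 0.75$, and those are exactly the weights that reproduce $C$ from the endpoints: $0.75 \cdot (0,0) + 0.25 \cdot (4,0) = (1,0) = C$.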

A more mathish style version of the above is this:

$t_0 = \frac{\left\| C-A\right\|}{\left\| B-A\right\|}$

$t_1 = 1.0 - t_0$

But it's important to note that $C$ must be on the line defined by $A$ and $B$. If that isn't the case, let us know what it is you are trying to do specifically. For instance, maybe you want to project some other point $D$ onto the line $\overline{AB}$ and then find the barycentric coordinates of that projected point?
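
If it helps, here is a minimal sketch of what that could look like against the signature in your question. Several things here are assumptions on my part: that u and v are the weights of a and b respectively, that the bool and epsilon are meant to report whether the point actually lies on the line (within epsilon), and the tiny stand-in Vector3, since yours isn't shown. It uses a signed projection, which reduces to the length ratio above when the point sits on the segment.

struct Vector3
{
    float x, y, z;
};

Vector3 operator-(const Vector3& lhs, const Vector3& rhs)
{
    return { lhs.x - rhs.x, lhs.y - rhs.y, lhs.z - rhs.z };
}

float Dot(const Vector3& lhs, const Vector3& rhs)
{
    return lhs.x * rhs.x + lhs.y * rhs.y + lhs.z * rhs.z;
}

bool BarycentricCoordinates(const Vector3& point, const Vector3& a, const Vector3& b,
                            float& u, float& v, float epsilon)
{
    const Vector3 ab = b - a;
    const Vector3 ap = point - a;

    const float abLengthSq = Dot(ab, ab);
    if (abLengthSq == 0.0f)
        return false; // a and b coincide, so the line is degenerate

    // Signed projection of the point onto the line. For a point on the segment
    // this is the same value as t0 = ||C-A|| / ||B-A|| above.
    const float t = Dot(ap, ab) / abLengthSq;

    // Assumed convention: u is the weight of a, v is the weight of b,
    // so that u*a + v*b reproduces the projected point.
    u = 1.0f - t;
    v = t;

    // Report whether the original point was actually on the line: compare the
    // (squared) distance from the point to its projection against epsilon.
    const Vector3 projected = { a.x + ab.x * t, a.y + ab.y * t, a.z + ab.z * t };
    const Vector3 offset = point - projected;
    return Dot(offset, offset) <= epsilon * epsilon;
}

With the numbers from the example above, point = (1,0,0), a = (0,0,0), b = (4,0,0) would give u = 0.75, v = 0.25 and return true.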

Alan Wolfe