Given a point and a line segment, how do you calculate the barycentric coordinates of the point to the line?
bool BarycentricCoordinates(const Vector3& point, const Vector3& a, const Vector3& b, float& u, float& v, float epsilon)
If you have a point $C$ on a line defined by a point $A$ and a point $B$, here is how you calculate the barycentric coordinates $(t_0,t_1)$:
$t_0 = \frac{\sqrt{(C_x-A_x)^2+(C_y-A_y)^2}}{\sqrt{(B_x-A_x)^2+(B_y-A_y)^2}}$
$t_1 = 1.0 - t_0$
In other words, you calculate the length of the vector from $A$ to $C$, and divide that by the length of the vector from $A$ to $B$. This gives you $t_0$, and you can subtract that from 1.0 to get $t_1$ since barycentric coordinates always add up to 1.0.
A more mathish style version of the above is this:
$t_0 = \frac{\left\| C-A\right\|}{\left\| B-A\right\|}$
$t_1 = 1.0 - t_0$
But it's important to note that $C$ must be on the line defined by $A$ and $B$. If that isn't the case, let us know specifically what you are trying to do. For instance, maybe you want to project some other point $D$ onto the line $\overline{AB}$ and then find the barycentric coordinates of that projected point?
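Here is a sketch matching the signature from the question, under some assumed semantics (the question doesn't state them): `u` is the weight on `a` and `v` the weight on `b`, so `point ≈ u*a + v*b` with `u + v = 1`; `epsilon` is treated as a distance tolerance, and the function returns `false` if `point` is farther than `epsilon` from the line or if `a` and `b` coincide. It uses the projection approach from the last paragraph, which reduces to the length-ratio formula when the point is already on the line:

```cpp
#include <cmath>

struct Vector3 { float x, y, z; };

// Assumed semantics: on success, point ~= u*a + v*b with u + v = 1.
// Returns false if the segment is degenerate or point is farther
// than epsilon from the line through a and b.
bool BarycentricCoordinates(const Vector3& point, const Vector3& a,
                            const Vector3& b, float& u, float& v, float epsilon)
{
    const Vector3 ab{b.x - a.x, b.y - a.y, b.z - a.z};
    const Vector3 ap{point.x - a.x, point.y - a.y, point.z - a.z};

    const float abLenSq = ab.x*ab.x + ab.y*ab.y + ab.z*ab.z;
    if (abLenSq <= epsilon * epsilon)
        return false; // a and b coincide; the line is undefined

    // Project point onto the line: t is the parameter along a -> b,
    // i.e. the signed version of |C-A| / |B-A|.
    const float t = (ap.x*ab.x + ap.y*ab.y + ap.z*ab.z) / abLenSq;

    // Distance from point to its projection; reject points off the line.
    const Vector3 proj{a.x + t*ab.x, a.y + t*ab.y, a.z + t*ab.z};
    const float dx = point.x - proj.x;
    const float dy = point.y - proj.y;
    const float dz = point.z - proj.z;
    if (std::sqrt(dx*dx + dy*dy + dz*dz) > epsilon)
        return false;

    v = t;        // weight on b
    u = 1.0f - t; // weight on a
    return true;
}
```

Note that because the projection uses a dot product rather than two lengths, `t` comes out signed, so points "before" $A$ or "past" $B$ on the line give coordinates outside $[0, 1]$ instead of being silently clamped.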
Are `a` and `b` intended to be points defining the line? How are the barycentric coordinates defined relative to those points? Is `point` assumed to be on the line, or might it be somewhere off the line? What is the intended role of `epsilon`? – Nathan Reed Jan 29 '17 at 00:04