1

Could you please explain how to find the equations of two straight lines from their joint equation $ax^2+2hxy+by^2+2gx+2fy+c=0$?

To convert the given pair of straight lines into the joint equation, I would just multiply the two equations as given below:

Let $a_1x+b_1y+c_1=0$ and $a_2x+b_2y+c_2=0$ be two lines. To find the joint equation, I would just multiply them and simplify $(a_1x+b_1y+c_1)(a_2x+b_2y+c_2)=0$. But I wish to know how to do the reverse process, i.e., finding the equations of two lines from the joint equation.
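For concreteness, the forward process can be sketched in a couple of lines of sympy (the two lines below are arbitrary illustrative choices):

```python
from sympy import symbols, expand

x, y = symbols('x y')

# two illustrative lines: x - 4y + 3 = 0 and x + 2y + 1 = 0
L1 = x - 4*y + 3
L2 = x + 2*y + 1

# the joint equation is simply the expanded product
joint = expand(L1 * L2)
print(joint)   # x**2 - 2*x*y - 8*y**2 + 4*x + 2*y + 3, up to term order
```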

Vishnu
  • 1,816
  • 2
    An approach that immediately comes to mind from what you’ve got so far is to expand the product, equate coefficients, and solve the resulting system of equations. – amd Aug 24 '19 at 07:45
  • @amd, Thank you for that method. Is there any other method to do so? – Vishnu Aug 24 '19 at 07:47
  • 1
    Well, if you’re lucky, you can factor the equation directly. That’s unlikely outside of artificially-constructed exercises and exam questions. The lines are the asymptotes of the family of hyperbolas obtained by varying $c$. See https://math.stackexchange.com/q/898005/265466 for a couple of methods of computing them. – amd Aug 24 '19 at 08:57
  • 1
    If you expand the multiplication you get $y^2-m_2yx-c_2y-m_1yx+m_1m_2x^2+m_1c_2x-c_1y+m_2c_1x+c_1c_2$. Grouping terms to look like the equation you have, then comparing the coefficients and solving may lead you somewhere. – NoChance Aug 24 '19 at 12:33

3 Answers

3

In general, the equation $\ ax^2+2hxy+by^2+2gx+2fy+c=0\ $ defines a conic, of which two intersecting straight lines is one (degenerate) special case. One standard way to determine what form of conic the equation represents is to diagonalise the matrix $$ A=\pmatrix{a&h\\h&b}\ $$ by finding its eigenvalues and eigenvectors. Let $\ \lambda_1, \lambda_2\ $ be the eigenvalues, and $\ \boldsymbol{e}_1, \boldsymbol{e}_2\ $ the corresponding normalised eigenvectors, which can be chosen so that $\ \boldsymbol{e}_1=\pmatrix{\cos\theta\\\sin\theta}\ $ and $\ \boldsymbol{e}_2=\pmatrix{-\sin\theta\\\cos\theta}\ $ for some angle $\ \theta\ $. If $\ \Theta\ $ is the matrix with columns $\ \boldsymbol{e}_1\ $ and $\ \boldsymbol{e}_2\ $, then $\ \Theta\ $ is a rotation matrix, with $\ \Theta^{-1} = \Theta^\top\ $, and $$ \Theta^\top A\Theta = \pmatrix{\lambda_1 & 0\\0&\lambda_2}\ . $$

Now, the equation we are interested in can be written as \begin{eqnarray} 0&=& \boldsymbol{x}^\top A\boldsymbol{x} + 2\boldsymbol{g}^\top\boldsymbol{x} + c\\ &=& \left(\Theta^\top \boldsymbol{x}\right)^\top\Theta^\top A\Theta\left(\Theta^\top\boldsymbol{x}\right) + 2\left(\Theta^\top\boldsymbol{g}\right)^\top \Theta^\top\boldsymbol{x}+c\\ &=&\boldsymbol{x}'^\top\Lambda\boldsymbol{x}'+2\boldsymbol{g}'^\top\boldsymbol{x}'+c\ , \end{eqnarray} where $\ \boldsymbol{x}=\pmatrix{x\\y}\ $, $\ \boldsymbol{x}'=\Theta^\top\boldsymbol{x}\ $, $\ \Lambda = \pmatrix{\lambda_1 & 0\\0&\lambda_2}\ $, $\ \boldsymbol{g}=\pmatrix{g\\f}\ $, and $\ \boldsymbol{g}'=\Theta^\top\boldsymbol{g}\ $. The entries of $\ \boldsymbol{x}'\ $, namely $\ x_1'=x\cos\theta+y\sin\theta\ $ and $\ x_2'=-x\sin\theta+y\cos\theta\ $, are the coordinates of a point $P$ with respect to a set of axes that have been rotated anticlockwise through an angle $\ \theta\ $ relative to the original axes, where $\ x\ $ and $\ y\ $ are the coordinates of $P$ with respect to those original axes.
It follows from the above that the equation of the conic with respect to the new axes is \begin{eqnarray} 0 &=& \lambda_1 x_1'^2 + \lambda_2x_2'^2 +2g_1'x_1'+ 2g_2'x_2' + c\\ &=& \lambda_1\left(x_1' +\frac{g_1'}{\lambda_1}\right)^2 + \lambda_2\left(x_2' +\frac{g_2'}{\lambda_2}\right)^2 +c - \frac{g_1'^2}{\lambda_1}-\frac{g_2'^2}{\lambda_2}\ . \end{eqnarray} This is the equation of two intersecting straight lines if and only if $\ \lambda_1\ $ and $\ \lambda_2\ $ are nonzero and of opposite sign, and $\ c - \frac{g_1'^2}{\lambda_1}-\frac{g_2'^2}{\lambda_2}=0\ $. If this is the case, suppose, without loss of generality, that $\ \lambda_1>0\ $ and $\ \lambda_2<0\ $. Then the above equation becomes \begin{eqnarray} 0 &=& \left(\sqrt{\lambda_1}x_1' + \frac{g_1'}{\sqrt{\lambda_1}}\right)^2- \left(\sqrt{-\lambda_2}x_2' - \frac{g_2'}{\sqrt{-\lambda_2}}\right)^2\\ &=& \left(\sqrt{\lambda_1}x_1' + \sqrt{-\lambda_2}x_2'+ \frac{g_1'}{\sqrt{\lambda_1}}-\frac{g_2'}{\sqrt{-\lambda_2}}\right)\\ &&\ \ \ \cdot \left(\sqrt{\lambda_1}x_1' - \sqrt{-\lambda_2}x_2'+ \frac{g_1'}{\sqrt{\lambda_1}}+\frac{g_2'}{\sqrt{-\lambda_2}}\right)\ , \end{eqnarray} and the equations of the two straight lines in the new coordinates are \begin{eqnarray} \sqrt{\lambda_1}x_1' + \sqrt{-\lambda_2}x_2'+ \frac{g_1'}{\sqrt{\lambda_1}}-\frac{g_2'}{\sqrt{-\lambda_2}}&=&0\ \ \mbox{, and}\\ \sqrt{\lambda_1}x_1' - \sqrt{-\lambda_2}x_2'+ \frac{g_1'}{\sqrt{\lambda_1}}+\frac{g_2'}{\sqrt{-\lambda_2}} &=&0\ . 
\end{eqnarray} Finally, substituting $\ x_1'=x\cos\theta+y\sin\theta\ $ and $\ x_2'=-x\sin\theta+y\cos\theta\ $ in these equations, we get the equations of the lines in the original coordinates: $$ \left(\sqrt{\lambda_1}\cos\theta - \sqrt{-\lambda_2}\sin\theta\right)x + \left(\sqrt{\lambda_1}\sin\theta +\sqrt{-\lambda_2}\cos\theta\right)y + \frac{g_1'}{\sqrt{\lambda_1}}-\frac{g_2'}{\sqrt{-\lambda_2}}=0 $$ and $$ \left(\sqrt{\lambda_1}\cos\theta + \sqrt{-\lambda_2}\sin\theta\right)x + \left(\sqrt{\lambda_1}\sin\theta -\sqrt{-\lambda_2}\cos\theta\right)y + \frac{g_1'}{\sqrt{\lambda_1}}+\frac{g_2'}{\sqrt{-\lambda_2}}=0\ . $$
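The whole procedure can also be carried out numerically. Here is a minimal numpy sketch; the conic $x^2-2xy-8y^2+4x+2y+3=0$, which splits as $(x-4y+3)(x+2y+1)=0$, is assumed purely as a test input:

```python
import numpy as np

# coefficients of a x^2 + 2 h x y + b y^2 + 2 g x + 2 f y + c = 0;
# the illustrative example is x^2 - 2xy - 8y^2 + 4x + 2y + 3 = 0
a, h, b, g, f, c = 1.0, -1.0, -8.0, 2.0, 1.0, 3.0

A = np.array([[a, h], [h, b]])
lam, Theta = np.linalg.eigh(A)          # eigenvalues (ascending) and eigenvector columns
order = np.argsort(-lam)                # reorder so that lam[0] > 0 > lam[1]
lam, Theta = lam[order], Theta[:, order]
if np.linalg.det(Theta) < 0:            # make Theta a pure rotation
    Theta[:, 0] *= -1

gp = Theta.T @ np.array([g, f])         # g' = Theta^T g

# degeneracy test: the constant term in the rotated frame must vanish
assert abs(c - gp[0]**2 / lam[0] - gp[1]**2 / lam[1]) < 1e-9

s1, s2 = np.sqrt(lam[0]), np.sqrt(-lam[1])
# lines in rotated coordinates: s1*x1' +/- s2*x2' + (g1'/s1 -/+ g2'/s2) = 0;
# rotating the normal vectors back gives the lines in the original frame
n1, d1 = Theta @ np.array([s1,  s2]), gp[0] / s1 - gp[1] / s2
n2, d2 = Theta @ np.array([s1, -s2]), gp[0] / s1 + gp[1] / s2
print(n1, d1)   # the two (normal, offset) pairs are proportional to
print(n2, d2)   # x - 4y + 3 = 0 and x + 2y + 1 = 0, in some order
```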

lonza leggiera
  • 28,646
  • 2
  • 12
  • 33
  • This is a very advanced and detailed answer! Thank you for sharing. – NoChance Aug 24 '19 at 16:52
  • A well-written answer, but computing the principal axes and half-axis lengths of the conic via finding eigenvalues and eigenvectors is a somewhat roundabout way of doing this. There are more direct ways to do this, including an entirely mechanical algorithm for “splitting” a degenerate conic. – amd Aug 24 '19 at 20:41
2

If you already know that your equation can be factored, you can also solve the equation for one of the variables (e.g. $x$) to find two solutions $x_1$ and $x_2$ (which depend on $y$) and then you have by the factor theorem:

$$ ax^2+2hxy+by^2+2gx+2fy+c=a(x-x_1)(x-x_2). $$

EXAMPLE.

If your equation is: $$ x^2-2xy-8y^2+4x+2y+3=0 $$ collect $x$: $$ x^2-2(y-2)x-8y^2+2y+3=0 $$ then solve for $x$: $$ x=(y-2)\pm\sqrt{(y-2)^2+8y^2-2y-3} =(y-2)\pm(3y-1)= \cases{4y-3\cr-2y-1} $$ and finally: $$ x^2-2xy-8y^2+4x+2y+3=(x-4y+3)(x+2y+1). $$
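The same computation can be scripted with sympy. Below is a sketch of the example above: the discriminant of the quadratic in $x$ is factored into the perfect square $4(3y-1)^2$, and the factor theorem reassembles the joint equation:

```python
from sympy import symbols, expand, factor

x, y = symbols('x y')
expr = x**2 - 2*x*y - 8*y**2 + 4*x + 2*y + 3

# collect x: the equation reads x**2 + B*x + C with B, C depending on y
B = -2*(y - 2)
C = -8*y**2 + 2*y + 3

disc = factor(B**2 - 4*C)     # 4*(3*y - 1)**2, a perfect square
x1 = (y - 2) + (3*y - 1)      # root from the + branch of the square root
x2 = (y - 2) - (3*y - 1)      # root from the - branch

# factor theorem: expr = (x - x1)*(x - x2)
assert expand((x - x1)*(x - x2) - expr) == 0
print(expand(x - x1), expand(x - x2))   # x - 4*y + 3 and x + 2*y + 1
```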

Intelligenti pauca
  • 50,470
  • 4
  • 42
  • 77
1

If you’re clever or lucky, you can spot how to factor the equation into a product of linear terms. That’s not likely outside of artificially-constructed exercises and exam questions. If the general equation does in fact represent a pair of lines, they are the common asymptotes of the family of hyperbolas obtained by varying $c$ in the equation. Indeed, the asymptotes can be considered the degenerate member of this one-parameter family. Several methods to find these asymptotes can be found in the answers to this related question and others. In Perspectives on Projective Geometry, Richter-Gebert gives an algorithm for “splitting” a degenerate conic, which I’ll reproduce briefly here.

First, it might be good to verify that the equation does in fact represent a pair of lines. Writing the equation in matrix form as $$\mathbf x^TQ\mathbf x = \begin{bmatrix}x&y&1\end{bmatrix} \begin{bmatrix}a&h&g\\h&b&f\\g&f&c\end{bmatrix} \begin{bmatrix}x\\y\\1\end{bmatrix} = 0,$$ examine $S = \det Q$ and $\Delta = \det\begin{bmatrix}a&h\\h&b\end{bmatrix} = ab-h^2$: if $\Delta\lt0$, the equation represents a hyperbola, and if $S=0$, it is degenerate, a pair of lines. The matrix $Q$ is then, up to an irrelevant constant factor, a rank-two matrix of the form $lm^T+ml^T$, where $l$ and $m$ are homogeneous coordinate vectors that represent the two lines. The algorithm finds a skew-symmetric matrix $M$ such that $Q+M$ is a rank-one matrix of the form $ml^T$, from which both lines can be read directly.
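A quick numerical check of these two determinant conditions with numpy, assuming the conic $x^2-2xy-8y^2+4x+2y+3=0$ (which does split into two lines) as an example:

```python
import numpy as np

# matrix of the conic for (a, h, b, g, f, c) = (1, -1, -8, 2, 1, 3),
# i.e. x^2 - 2xy - 8y^2 + 4x + 2y + 3 = 0
Q = np.array([[ 1., -1.,  2.],
              [-1., -8.,  1.],
              [ 2.,  1.,  3.]])

S = np.linalg.det(Q)              # vanishes for a degenerate conic
Delta = np.linalg.det(Q[:2, :2])  # ab - h^2; negative for the hyperbolic type
print(S, Delta)                   # S ~ 0 and Delta = -9 < 0: a pair of lines
```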

It turns out that if $p$ is the intersection point of the lines and $\mathcal M_p$ its skew-symmetric “cross-product” matrix, then there is some real $\alpha$ for which the matrix $Q+\alpha\mathcal M_p$ has rank one. The intersection point $p$ is the center of the conic, which can be found using any of several standard methods. Once you have this point, form the matrix $Q+\alpha\mathcal M_p$ and find an $\alpha$ for which all of the $2\times2$ minors vanish. This will involve solving a straightforward quadratic equation in $\alpha$.

That isn’t quite the algorithm Richter-Gebert presents, but when you’re doing this calculation yourself, it can be more convenient than his actual algorithm:

  1. $B = Q^{\tiny\triangle}$.
  2. Let $i$ be the index of a nonzero diagonal entry of $B$.
  3. $\beta = \sqrt{B_{i,i}}$ (multiply $B$ by $-1$ if necessary so that $\beta$ is real).
  4. $p=B_i/\beta$, where $B_i$ is the $i$th column of $B$.
  5. $C=Q+\mathcal M_p$.
  6. Let $(i,j)$ be the index of a nonzero element $C_{i,j}$ of $C$.
  7. $l$ is the $i$th row of $C$; $m$ is the $j$th column of $C$.

Here $Q^{\tiny\triangle}$ is the adjugate of $Q$, i.e., the transpose of its cofactor matrix. (Since $Q$ is symmetric, this is equal to its cofactor matrix.) Applying this algorithm to the general equation, we get $$C = \begin{bmatrix}a & h-\sqrt{h^2-ab} & g+{af-gh\over\sqrt{h^2-ab}} \\ h+\sqrt{h^2-ab} & b & f-{bg-fh\over\sqrt{h^2-ab}} \\ g-{af-gh\over\sqrt{h^2-ab}} & f+{bg-fh\over\sqrt{h^2-ab}} & c\end{bmatrix}.$$ It’s a moderately interesting exercise to verify that, with the assumption that the common element is nonzero, every row/column pair of this matrix represents the same pair of lines and generates the original equation. If you try this, you’ll need to use $S=0$ to do some of the necessary simplification.

This algorithm also works when the conic is a pair of parallel lines: $p$ will be a point at infinity in that case. If the conic is a double line, then $Q$ is already a rank-one matrix of the form $mm^T$, which, if you didn’t spot immediately, you will discover after computing $Q^{\tiny\triangle}$: all of the cofactors of a rank-one matrix vanish, so its adjugate is the zero matrix.
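For reference, the seven steps above transcribe almost verbatim into numpy. This sketch handles the generic case of two distinct intersecting lines, and borrows the example $x^2-2xy-8y^2+4x+2y+3=0$ from another answer as a test input:

```python
import numpy as np

def adjugate(Q):
    """Transpose of the cofactor matrix (equal to the cofactor matrix for symmetric Q)."""
    cof = np.zeros_like(Q)
    for i in range(3):
        for j in range(3):
            minor = np.delete(np.delete(Q, i, axis=0), j, axis=1)
            cof[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return cof.T

def split_conic(Q):
    """Split a degenerate conic Q = l m^T + m l^T into the two lines l, m."""
    B = adjugate(Q)                               # step 1
    i = int(np.argmax(np.abs(np.diag(B))))        # step 2: a nonzero diagonal entry
    if B[i, i] < 0:                               # step 3: flip sign so beta is real
        B = -B
    beta = np.sqrt(B[i, i])
    p = B[:, i] / beta                            # step 4: intersection point
    Mp = np.array([[ 0.0, -p[2],  p[1]],          # step 5: cross-product matrix of p
                   [ p[2],  0.0, -p[0]],
                   [-p[1],  p[0],  0.0]])
    C = Q + Mp
    r, s = np.unravel_index(np.argmax(np.abs(C)), C.shape)  # step 6: a nonzero entry
    return C[r, :], C[:, s]                       # step 7: the two lines

# example: x^2 - 2xy - 8y^2 + 4x + 2y + 3 = 0, i.e. (x - 4y + 3)(x + 2y + 1) = 0
Q = np.array([[ 1., -1.,  2.],
              [-1., -8.,  1.],
              [ 2.,  1.,  3.]])
l, m = split_conic(Q)
print(l / l[0], m / m[0])   # the lines x + 2y + 1 = 0 and x - 4y + 3 = 0
```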

Having said all that, when working this by hand, I find it easiest to compute the conic’s center—the intersection point of the lines—and get the lines’ direction vectors by finding nonzero solutions of $ax^2+2hxy+by^2=0$ (which is equivalent to finding the intersections of the hyperbola with the line at infinity). The latter is usually a matter of treating the above equation as a quadratic in one of the variables and setting the other variable to some convenient value.
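This by-hand recipe is also easy to script. A small numpy sketch, again borrowing the example $x^2-2xy-8y^2+4x+2y+3=0$ and assuming $b\ne0$ so that neither line is vertical:

```python
import numpy as np

# coefficients of a x^2 + 2 h x y + b y^2 + 2 g x + 2 f y + c = 0
# (illustrative example: x^2 - 2xy - 8y^2 + 4x + 2y + 3 = 0)
a, h, b, g, f = 1.0, -1.0, -8.0, 2.0, 1.0

# center = intersection of the lines: where both partial derivatives vanish,
# i.e. a*x + h*y + g = 0 and h*x + b*y + f = 0
cx, cy = np.linalg.solve([[a, h], [h, b]], [-g, -f])

# slopes m of the two directions, from substituting y = m*x into
# a x^2 + 2 h x y + b y^2 = 0, giving b*m^2 + 2*h*m + a = 0 (assumes b != 0)
slopes = np.roots([b, 2 * h, a])

for m in slopes:
    # line through the center with slope m:  m*x - y + (cy - m*cx) = 0
    print(f"{m}*x - y + {cy - m * cx} = 0")
# slopes 1/4 and -1/2 give, after scaling, x - 4y + 3 = 0 and x + 2y + 1 = 0
```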

amd
  • 53,693