Is there a neat method for solving equations of the form: $$\frac {a^2}{x^2}-\frac {b^2}{(1-x)^2}=c^2$$ where $0<x<1$, by exploiting the symmetry, instead of expanding it into a full-blown quartic?
Perhaps one could use the substitution $y=\frac 12-x$, but it doesn't seem to lead very far.
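For reference, under that substitution ($x=\frac 12-y$, $1-x=\frac 12+y$) the equation becomes $$\frac {a^2}{\left(\frac 12-y\right)^2}-\frac {b^2}{\left(\frac 12+y\right)^2}=c^2,$$ which is symmetric in form but still quartic in $y$ after clearing denominators.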
Additional Note: The original equation has $$\begin{align} a^2&=u^2\mu^2\\ b^2&=v^2(1-\mu)^2\\ c^2&=\frac {v^2-u^2}{\gamma^2}\end{align}$$ where $0<\mu <1$. Not sure if this additional information helps.
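Although it doesn't give a closed form, it may help to note that the left-hand side $f(x)=\frac{a^2}{x^2}-\frac{b^2}{(1-x)^2}$ is strictly decreasing on $(0,1)$, falling from $+\infty$ to $-\infty$, so the equation always has exactly one root there. A minimal numerical sketch exploiting this (the values of $a,b,c$ below are illustrative, not from the question):

```python
def solve_symmetric(a, b, c, tol=1e-12):
    """Find the unique root of a^2/x^2 - b^2/(1-x)^2 = c^2 on (0, 1).

    f(x) = a^2/x^2 - b^2/(1-x)^2 - c^2 is strictly decreasing on (0, 1),
    going from +inf near x=0 to -inf near x=1, so bisection converges
    to the single root without needing the quartic expansion.
    """
    f = lambda x: a**2 / x**2 - b**2 / (1 - x)**2 - c**2
    lo, hi = tol, 1 - tol  # f(lo) > 0 and f(hi) < 0 for small tol
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) > 0:
            lo = mid  # root lies to the right of mid
        else:
            hi = mid  # root lies to the left of mid
    return (lo + hi) / 2

# Illustrative values only:
x = solve_symmetric(a=2.0, b=1.0, c=3.0)
print(x)
```

The monotonicity argument is what makes this safe: any bracketing method (bisection, Brent) is guaranteed to find the one admissible root in $(0,1)$, whereas the quartic has up to three spurious roots outside that interval.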