In my physics course, we're covering physical pendulums, and our task is essentially to analyze the range of angles in the interval $\left[0, \frac{\pi}{6}\right]$ to show that the small-angle approximation $\sin\theta \approx \theta$ holds. (I completed my analysis using Desmos.)
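For context, here is roughly what that analysis amounts to outside of Desmos; this is just my own sketch, and the sample points are an arbitrary choice:

```python
import math

# Tabulate the relative error of sin(theta) ~ theta over [0, pi/6],
# mirroring the Desmos analysis (sample points are my own choice).
for i in range(1, 11):
    theta = (math.pi / 6) * i / 10
    error = (theta - math.sin(theta)) / math.sin(theta) * 100
    print(f"theta = {theta:.4f} rad, error = {error:.3f}%")
```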
After creating our analyses, we are to estimate the angle, in radians, at which the error is approximately one percent. The error function I have is $$ E_1(x) = \frac{x - \sin x}{\sin x} \cdot 100, $$ where $x$ is in radians. I then rewrote the right-hand side as $$ E_1(x) = (x\csc x - 1) \cdot 100. $$ Setting $E_1(x) = 1$ for the one-percent threshold gives $(x\csc x - 1) \cdot 100 = 1$, i.e. $x\csc x = 1.01$, so I have been trying to figure out how to solve $$ x \csc x - 1.01 = 0. $$ The best I could do was to graph the function on Desmos and read off the root.
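Since I only solved this graphically, here is a minimal numeric check of that root by bisection (again my own sketch, not part of the assignment); it agrees with what I read off the Desmos graph:

```python
import math

def f(x):
    # f(x) = x*csc(x) - 1.01; the root is where the error hits 1%
    return x / math.sin(x) - 1.01

# Bisection on (0, pi/6]: x/sin(x) is increasing there,
# with f(0+) -> -0.01 < 0 and f(pi/6) > 0.
lo, hi = 1e-9, math.pi / 6
for _ in range(60):
    mid = (lo + hi) / 2
    if f(mid) < 0:
        lo = mid
    else:
        hi = mid

print(f"x ~= {hi:.6f} rad")  # about 0.2441 rad, i.e. roughly 14 degrees
```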
However, I was hoping to get some pointers in the right direction on how to solve this equation algebraically. While searching Math Stack Exchange, I came across a related question, but the best I could gather was that the approach depends on the type of equation you have.
Is there a purely algebraic approach to solving this equation? Any advice and/or pointers to further reading would be appreciated.