Unfortunately, the answer is: subdivide and go on. The rule-of-signs predicate cannot tell you whether there are any roots in such a case, and if you have no additional means of deciding, you have no other option.
However, there are theorems which tell you that this won't happen too often. A short summary of the "easy" cases: if you count the sign variations $v$ of the polynomial which "describes" the interval $[a,b]$ (a sketch of this test follows the list below), then
- $v$ will be 0 if no (complex) root of the input polynomial is contained in the disc in the complex plane that has the segment $\overline{ab}$ as diameter, and
- $v$ will be 1 if exactly one (complex) root of the input is contained in the union of the circumcircles of the two equilateral triangles that have $\overline{ab}$ as a common side, assuming that this root is simple.
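For concreteness, here is a minimal sketch (in Python, with exact rational arithmetic) of how such a sign-variation count is typically computed: shift and scale $p$ so that $(a,b)$ becomes $(0,1)$, apply the Möbius map $x \mapsto 1/(1+x)$ cleared of denominators, and count sign changes in the resulting coefficient sequence. The function names are mine, not from any particular library.

```python
from fractions import Fraction

def descartes_test(coeffs, a, b):
    """Sign variations v of the polynomial that "describes" (a, b).

    coeffs: coefficients of p, lowest degree first (exact rationals advised).
    The number of real roots of p in (a, b) is at most v and of the same
    parity; in particular v == 0 excludes roots and v == 1 certifies one."""
    n = len(coeffs) - 1
    a, b = Fraction(a), Fraction(b)

    # x -> x + a, then x -> (b - a) * x:  roots in (a, b) move to (0, 1)
    c = _taylor_shift(coeffs, a)
    c = [ci * (b - a) ** i for i, ci in enumerate(c)]

    # x -> 1/x (coefficient reversal), then x -> x + 1; altogether
    # q(x) = (1 + x)^n * p(a + (b - a)/(1 + x)), whose positive real
    # roots correspond to the roots of p in (a, b)
    c = _taylor_shift(c[::-1], Fraction(1))

    # Descartes' rule of signs: count sign changes, ignoring zero coefficients
    signs = [ci > 0 for ci in c if ci != 0]
    return sum(1 for s, t in zip(signs, signs[1:]) if s != t)

def _taylor_shift(coeffs, t):
    """Coefficients of p(x + t), via repeated Horner/Ruffini division."""
    c = list(coeffs)
    n = len(c) - 1
    for i in range(n):
        for j in range(n - 1, i - 1, -1):
            c[j] += t * c[j + 1]
    return c
```

For example, with $p(x) = x^2 - 2$ (i.e. `coeffs = [-2, 0, 1]`) this returns 1 on $(1,2)$ and 0 on $(0,1)$, matching the single root $\sqrt 2$.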
There are more detailed versions of these theorems, but in a nutshell it boils down to: you will count the right thing unless there is a cluster of complex roots close to the interval (w.r.t. the scale/precision you are currently considering), and you have to "zoom in" to deblur and resolve the situation.
However, if you have reason to believe that there is a wide range without any roots, note that "subdivide" is not necessarily the same as "bisect". You are free to choose other subdivision strategies; this eventually leads to fancier algorithms such as continued-fraction solvers (VAS; praised in practice, at least on some benchmarks, but their actual merit is disputed) or combinations of Newton iteration and Descartes' rule (see the recent publications by Michael Sagraloff and his colleagues; disclaimer: I'm working in Michael's group).
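For completeness, a bisection-based isolation loop built on the predicate above might look as follows (again only a sketch, reusing `descartes_test` and `Fraction` from the previous snippet; it assumes the input polynomial is squarefree and that $a$, $b$ are not roots). The midpoint split is exactly the place where a different subdivision strategy could be plugged in.

```python
def isolate_roots(coeffs, a, b, result=None):
    """Return a list of intervals (l, r), each isolating exactly one real
    root of p in (a, b).  Assumes p is squarefree; a sketch, not robust."""
    if result is None:
        result = []
    v = descartes_test(coeffs, a, b)
    if v == 0:
        return result                      # provably no root in (a, b)
    if v == 1:
        result.append((a, b))              # provably exactly one root
        return result
    # v >= 2: the predicate is inconclusive -- subdivide and go on
    m = (Fraction(a) + Fraction(b)) / 2
    if eval_poly(coeffs, m) == 0:          # the split point happens to be a root
        result.append((m, m))
    isolate_roots(coeffs, a, m, result)
    isolate_roots(coeffs, m, b, result)
    return result

def eval_poly(coeffs, x):
    """Horner evaluation, lowest-degree coefficient first."""
    y = Fraction(0)
    for c in reversed(coeffs):
        y = y * x + c
    return y
```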