By the Rational Root Theorem, any rational root of $p(x) = x^3 + a x^2 + b x + c$ (with integer coefficients) is of the form $\pm d$, where $d$ is a factor of $c$. So, if we set $c = \pm 1$, then any rational root of $p$ must be $\pm 1$.
On the other hand, if we choose the signs of the coefficients of $p$ to alternate, then Descartes' Rule of Signs gives that all of the real roots are positive. So, if we pick $a$, $b$, and $c = \pm 1$ so that the signs alternate, then $x = 1$ is the only possible rational root, and thus $p$ has no rational roots provided $p(1) = 1 + a + b + c \neq 0$.
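If it helps to experiment, here is a minimal Python sketch of the two ingredients above (the function names and the sample coefficients $a = -2$, $b = 3$, $c = -1$ are just my own illustration): it tests the candidates $\pm 1$ and computes the Descartes sign-change bounds on the positive and negative real roots.

```python
def sign_changes(coeffs):
    """Count sign changes in a coefficient list, skipping zeros; by
    Descartes' Rule of Signs this bounds the number of positive roots."""
    signs = [c > 0 for c in coeffs if c != 0]
    return sum(1 for s, t in zip(signs, signs[1:]) if s != t)

def analyze(a, b, c):
    """For p(x) = x^3 + a*x^2 + b*x + c with c = +-1, return the rational
    roots among the candidates +-1, together with the Descartes bounds on
    the number of positive and negative real roots."""
    p = lambda x: x**3 + a*x**2 + b*x + c
    rational_roots = [x for x in (1, -1) if p(x) == 0]
    pos_bound = sign_changes([1, a, b, c])      # coefficients of p(x)
    neg_bound = sign_changes([-1, a, -b, c])    # coefficients of p(-x)
    return rational_roots, pos_bound, neg_bound

# Alternating signs: a = -2, b = 3, c = -1.  No negative real roots are
# possible, and since p(1) = 1 + a + b + c = 1 != 0 there is no rational root.
print(analyze(-2, 3, -1))   # ([], 3, 0)
```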
For example, if we take $a = 0$, $b < 0$, and $c = 1$, then we deduce that $$p(x) = x^3 + b x + 1$$ has no rational roots provided $b \neq -2$: the only candidates are $\pm 1$, and $p(-1) = -b > 0$ always, while $p(1) = b + 2$ vanishes only when $b = -2$. Now, $p(0) = 1$, so if $p(1) < 0$ (that is, if $b < -2$), the asymptotic behavior of $p$ and the Intermediate Value Theorem imply that $p$ has three real, irrational roots (one in each of $(-\infty, 0)$, $(0, 1)$, $(1, \infty)$). Notice that this family,
$$\color{#df0000}{\boxed{p(x) = x^3 + b x + 1, \qquad b < - 2}},$$
consists precisely of the reciprocal polynomials of those in the family from mathlove's good answer.
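As a quick sanity check, here is a small Python sketch (the helper name `check_family`, the bracketing points, and the sample value $b = -5$ are my own choices, not part of the argument) verifying that a given integer $b < -2$ gives $p(\pm 1) \neq 0$ and a sign change of $p$ on each of the three intervals above:

```python
def check_family(b):
    """For an integer b < -2, verify that p(x) = x^3 + b*x + 1 has no rational
    root (p(1) != 0 and p(-1) != 0) and exhibit three sign changes, so the
    Intermediate Value Theorem gives one real root in each bracketing interval."""
    assert b < -2
    p = lambda x: x**3 + b*x + 1
    assert p(1) != 0 and p(-1) != 0            # the only rational candidates
    M = abs(b) + 2                             # large enough for x^3 to dominate
    samples = [-M, 0, 1, M]
    values = [p(x) for x in samples]
    return [(samples[i], samples[i + 1])       # intervals where p changes sign
            for i in range(3) if values[i] * values[i + 1] < 0]

print(check_family(-5))   # [(-7, 0), (0, 1), (1, 7)]
```

In particular, `check_family(-3)` brackets the three roots of the equation $x^3 - 3x + 1 = 0$ mentioned in the comment below.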
$x^3-3x+1 = 0$ – Catalin Zara Mar 28 '16 at 19:07