I want to solve $e^{-x} < \frac{1}{x^2 + 1}$, assuming $x \in \mathbb{R}^+$. Since both sides are positive, this is equivalent to finding the solutions of $e^x > x^2 + 1$. That form seems less troublesome, but I've found it is very hard as well. I've tried everything I could think of, from logarithmic approaches to remove the exponential function to expanding $e^x$ as $\sum_{k=0}^\infty \frac{x^k}{k!}$. This last approach renders the inequality $x - \frac{x^2}{2} + \sum_{k=3}^\infty \frac{x^k}{k!} > 0$, which isn't any less tricky.
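For reference, here are the two reductions written out explicitly (the reciprocal step is valid because both sides of the original inequality are positive for real $x$):
$$
e^{-x} < \frac{1}{x^2+1}
\;\Longleftrightarrow\;
e^{x} > x^2 + 1
\;\Longleftrightarrow\;
\underbrace{1 + x + \frac{x^2}{2} + \sum_{k=3}^\infty \frac{x^k}{k!}}_{e^x} - \left(x^2 + 1\right)
= x - \frac{x^2}{2} + \sum_{k=3}^\infty \frac{x^k}{k!} > 0 .
$$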
A last, desperate approach was to prove the last inequality via induction for $x \in \mathbb{N}$ and then see whether that could somehow be extended to $\mathbb{R}$. Needless to say, that long shot failed. How would one go about solving this inequality?
P.S.: This inequality is just a particular case of a more general inequality $e^{-\frac{|x|}{\alpha}}<\frac{\alpha}{x^2+\alpha}$, with $\alpha \in \mathbb{R}^+$. The problem presented above is this inequality under the assumptions $\alpha = 1$ and $x \in \mathbb{R}^+$. Any thoughts on this more general, and far more complicated, problem are also welcome, though they are not my main interest.
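In case it helps, applying the same reciprocal step to the general inequality, together with the substitution $u = \frac{|x|}{\alpha}$ (which is just my own rewriting, not part of the original statement), gives an equivalent one-parameter form:
$$
e^{-\frac{|x|}{\alpha}} < \frac{\alpha}{x^2+\alpha}
\;\Longleftrightarrow\;
e^{\frac{|x|}{\alpha}} > \frac{x^2}{\alpha} + 1
\;\Longleftrightarrow\;
e^{u} > \alpha u^2 + 1, \qquad u = \frac{|x|}{\alpha} \ge 0 ,
$$
so the $\alpha = 1$, $x \in \mathbb{R}^+$ case is exactly the problem above.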
Thanks in advance.