I stumbled upon a problem that requires the residue theorem, but there is something I'm not sure I understand. Let $x \in \mathbb{R} \setminus \{0\}$; I want to use the residue theorem to compute $$f(x) := \int_{\mathbb{R}} \frac{dt}{[(x-t)^2+1](t^2+1)}.$$
Now, the way I would do this is to find and classify the singularities, take a circle large enough to contain all of them, and apply the theorem as I normally would. In this case, however, the solution claims that a "decay condition" is met and, without adding anything else, applies the residue theorem only to the singularities in the upper half-plane. There is no trace of this condition in my notes, so I'm not exactly sure what it means. The only decay I can see is that the integrand goes to zero as $t$ gets large, but I still can't grasp the logic behind it. When am I supposed to consider only the singularities with positive imaginary part, as opposed to all of them?
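My best guess at what the decay condition means (please correct me if this is off): extend the integrand to the complex plane as $g(z) := \frac{1}{[(x-z)^2+1](z^2+1)}$ (the notation $g$ is mine), and integrate over the boundary of the half-disk of radius $R$ in the upper half-plane. On the semicircular arc $|z| = R$, $\operatorname{Im} z \geq 0$, the ML inequality would give something like $$\left|\int_{\text{arc}} g(z)\,dz\right| \leq \pi R \cdot \max_{|z|=R,\ \operatorname{Im} z \geq 0} |g(z)| = O\!\left(\frac{1}{R^{3}}\right) \xrightarrow[R \to \infty]{} 0,$$ since $|g(z)| = O(1/R^4)$ for large $|z|$. In the limit, only the residues at the poles with $\operatorname{Im} z > 0$ (here $z = i$ and $z = x + i$) would survive. Is this the decay condition the solution refers to?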
Thank you.