Often in contour integration, we integrate around a singularity by making a small semicircular indent, parametrized by $\theta \mapsto z_0 + re^{i\theta}$, $0 \leq \theta \leq \pi$, around the singularity at $z_0$.
Then one claims that the integral "picks up half a residue" as $r \rightarrow 0$, so we compute the residue, divide by two, and multiply by $2\pi i$ to get the limiting value of the integral over the small semicircle.
I don't see how to justify this rigorously. I tried adapting the proof of the residue theorem, which involves expanding in a Laurent series, but that proof crucially relies on the path being closed.
I also tried checking whether the values of the function on the semicircle all tend to the same value as the semicircle shrinks. But I'm not sure this is even true, since the function need not behave nicely at $z_0$.
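For what it's worth, I can make the claim plausible in the simplest case, a simple pole, where (if I'm expanding correctly) $f(z) = \frac{a_{-1}}{z - z_0} + g(z)$ with $g$ analytic near $z_0$:

$$\int_{C_r} f(z)\,dz = \int_0^\pi \frac{a_{-1}}{re^{i\theta}}\, i r e^{i\theta}\,d\theta + \int_{C_r} g(z)\,dz = \pi i\, a_{-1} + O(r),$$

where the second integral is $O(r)$ because $|g|$ is bounded near $z_0$ and the semicircle $C_r$ has length $\pi r$. But I don't see how (or whether) this argument extends beyond simple poles.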
So my question is this. Under what circumstances can we claim that, as $r \rightarrow 0$, the integral over the small semicircle centered at the pole $z_0$ tends to $\pi i$ times the residue at $z_0$, and how do we prove this?