(Context) This is a long question, so please bear with me. While pondering the integral
$$\int_{0}^{\infty}\frac{1-\frac{1}{x^{2}}}{\left(x+\frac{1}{x}\right)^{2}-1}dx$$
that was asked in this question, one of the questions that came up was whether letting $u = x+\frac{1}{x}$ is valid. I answered,
"No, because on the interval $(0,\infty)$, the function $x + \frac{1}{x}$ decreases then increases, making it not injective over that interval. You need an interval of $u$'s such that each $x \in (0,\infty)$ matches with one and only one $u$."
Then I basically said they could get away with it via piecewise monotonicity: break the integral up into $\int_0^1$ and $\int_1^{\infty}$, on each of which $x+\frac{1}{x}$ is monotonic.
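To make that concrete, here is the sketch I had in mind (assuming for the moment that the split itself is legitimate): with $u = x+\frac{1}{x}$ we have $du = \left(1-\frac{1}{x^{2}}\right)dx$, and $u$ decreases from $+\infty$ to $2$ on $(0,1]$ while it increases from $2$ to $+\infty$ on $[1,\infty)$, so
$$\int_{0}^{1}\frac{1-\frac{1}{x^{2}}}{\left(x+\frac{1}{x}\right)^{2}-1}dx = \int_{\infty}^{2}\frac{du}{u^{2}-1} \quad\text{and}\quad \int_{1}^{\infty}\frac{1-\frac{1}{x^{2}}}{\left(x+\frac{1}{x}\right)^{2}-1}dx = \int_{2}^{\infty}\frac{du}{u^{2}-1},$$
both understood as limits since the integrals are improper. If these steps are valid, the two pieces cancel and the whole integral is $0$.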
However, after clicking on this post, I read,
"However, integration by substitution does not generally/inherently require substitution functions to be invertible/bijective or monotonic—or even injective—on the interval of integration."
That was somewhat shocking to me. I originally thought a substitution was required to be injective over the interval of integration, but now I'm wondering whether my answer is incorrect.
(Attempt) To see whether I had messed up, I looked in my textbook, "An Introduction to Analysis" (3rd ed.) by William R. Wade, and found this theorem (page 130):
Let $\phi$ be continuously differentiable on a closed, nondegenerate interval $[a,b]$. If $f$ is continuous on $\phi([a,b])$, or if $\phi$ is strictly increasing on $[a,b]$ and $f$ is integrable on $[\phi (a), \phi (b)]$, then
$$\int_{\phi (a)}^{\phi (b)}f(t)dt = \int_{a}^{b}f(\phi (x))\phi'(x)dx.$$
With that theorem in mind, I experimented with a simple integral:
$$\int_{0}^{3\pi /2}\sin\left(\cos\left(x\right)\right)\left(-\sin\left(x\right)\right)dx.$$
Intuitively, you can let $\phi(x)=\cos(x)$ without any problem. Thinking about the theorem above, $\phi$ fails the second condition because it is not strictly increasing on all of $[0,3\pi/2]$. But it does seem to satisfy the first condition, "$f$ is continuous on $\phi([a,b])$," since $f(t)=\sin(t)$ is continuous everywhere. So I concluded I could do the substitution with no problem.
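Worked out under that reading, with $\phi(x)=\cos(x)$, $f(t)=\sin(t)$, $a=0$, and $b=3\pi/2$, the theorem would give
$$\int_{0}^{3\pi/2}\sin\left(\cos\left(x\right)\right)\left(-\sin\left(x\right)\right)dx = \int_{\cos(0)}^{\cos(3\pi/2)}\sin(t)dt = \int_{1}^{0}\sin(t)dt = \cos(1)-1,$$
which agrees with evaluating the antiderivative $-\cos(\cos(x))$ at the endpoints directly. So at least in this example, the non-monotonic substitution produces the correct value.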
(Question) For the improper integral in question, can I let $u = x+\frac{1}{x}$ for $x \in (0,\infty)$? The function is certainly not strictly increasing on that whole interval, so the second condition of the theorem is out. But does the first condition hold, and if so, why exactly? Originally I thought it couldn't, because $x+\frac{1}{x}$ is not even defined at $x=0$, while the theorem requires a closed interval $[a,b]$; but I can replace $0$ with $\epsilon \to 0^{+}$ and $\infty$ with $t \to \infty$, which throws me off even more.
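To spell out what I mean, the limit formulation would be (splitting at $x=1$, my own choice of splitting point)
$$\int_{0}^{\infty}\frac{1-\frac{1}{x^{2}}}{\left(x+\frac{1}{x}\right)^{2}-1}dx = \lim_{\epsilon\to 0^{+}}\int_{\epsilon}^{1}\frac{1-\frac{1}{x^{2}}}{\left(x+\frac{1}{x}\right)^{2}-1}dx + \lim_{t\to\infty}\int_{1}^{t}\frac{1-\frac{1}{x^{2}}}{\left(x+\frac{1}{x}\right)^{2}-1}dx,$$
and each of $[\epsilon,1]$ and $[1,t]$ is a closed, nondegenerate interval on which the theorem's hypotheses can be checked. Is that the right way to reconcile the theorem, which is stated for $[a,b]$, with this improper integral?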
P.S. I'm somewhat of a beginner at real/complex analysis, so please include as many details as possible.