I was wondering whether there is any reasonable way (or theory) to do calculations with divergent limits of sequences. I was trying to prove that Euler's constant $$\gamma = \lim_{n \to \infty} \left( \sum_{k=1}^{n} \frac{1}{k} - \log(n)\right)$$ satisfies $\gamma \in [0,1]$. Intuitively, it seems to make sense to rewrite $\gamma$ as $$\gamma = \sum_{k=1}^{\infty} f(k) - \lim_{n \to \infty}\log(n),$$ where $f: [1, \infty) \to \mathbb{R}_{\geq 0}$ is given by $f(x) = \frac{1}{x}$. This step already doesn't work with the theory I learned in Calculus 1, since both terms diverge. But if we keep going, we could now use the integral test for convergence to bound $\gamma$: $$\gamma \leq 1 + \int_1^\infty \frac{1}{x}\, dx - \lim_{n \to \infty}\log(n) = 1 + \lim_{n \to \infty}\log(n) - \log(1) - \lim_{n \to \infty}\log(n) = 1.$$
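For comparison, here is a finite-$n$ version of the same estimate (the integral comparison applied to the partial sums), which avoids the divergent limits entirely; I suspect this is what a rigorous version of the argument would look like: $$\sum_{k=1}^{n} \frac{1}{k} \leq 1 + \int_1^n \frac{1}{x}\, dx = 1 + \log(n) \quad \text{for all } n \geq 1,$$ so every term of the sequence is at most $1$, and the bound survives the limit, giving $\gamma \leq 1$.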
That said, I see why treating $\lim_{n \to \infty}\log(n) - \lim_{n \to \infty}\log(n)$ as $0$ is problematic, since it is an expression of the form $\infty - \infty$. With the same method we can also establish that $\gamma \geq 0$ (see the sketch below). If there were any way to make sense of this idea, I think it would be a neat proof, although I'm aware of the problematic steps. Maybe someone can teach me something new or confirm my concerns.
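The finite-$n$ sketch of the lower bound, again via comparison with the integral: $$\sum_{k=1}^{n} \frac{1}{k} \geq \int_1^{n+1} \frac{1}{x}\, dx = \log(n+1) \geq \log(n) \quad \text{for all } n \geq 1,$$ so every term of the sequence is nonnegative, and hence $\gamma \geq 0$.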
Thanks in advance!