TL;DR: "When $x$ tends to infinity, $4/x$ will be negligible and hence [...]" is not a justified algebraic simplification.
We could just as well say: "When $x$ tends to infinity, $\sqrt{1-\frac 4x}-1$ is negligible..." Indeed, $\sqrt{1-\frac 4x}-1$ could be replaced with any $f(x)$ such that $\lim_{x\to\infty} f(x) = 0$.
With the above adjustments, we could state your reasoning as:
$$\lim_{x\to\infty}f(x) = 0\implies \lim_{x\to\infty} xf(x) = 0.$$
It doesn't take much creativity to come up with a counterexample: take $f(x) = \frac 1x$, for which $xf(x) = 1$ for every $x$, so $\lim_{x\to\infty} xf(x) = 1 \neq 0$.
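For comparison, here is a sketch of how such a limit is handled honestly (assuming the expression in question is $x\left(\sqrt{1-\frac 4x}-1\right)$; adapt to your exact expression): multiply by the conjugate instead of discarding the $\frac 4x$,
$$x\left(\sqrt{1-\frac 4x}-1\right) = x\cdot\frac{\left(1-\frac 4x\right)-1}{\sqrt{1-\frac 4x}+1} = \frac{-4}{\sqrt{1-\frac 4x}+1}\xrightarrow[x\to\infty]{}\frac{-4}{2}=-2,$$
which is not $0$ at all.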
This is similar to another common mistake: concluding that $\lim_n\left(1+\frac 1n\right)^n = \lim_n 1^n = 1$, when in fact the limit is $e$.
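A quick way to see why discarding the $\frac 1n$ fails there: write
$$\left(1+\frac 1n\right)^n = e^{\,n\ln\left(1+\frac 1n\right)},\qquad n\ln\left(1+\frac 1n\right)=\frac{\ln\left(1+\frac 1n\right)}{1/n}\xrightarrow[n\to\infty]{}1,$$
so the limit is $e$, not $1$.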
When taking the limit of $f(x)g(x)$, you cannot simply ignore the fact that $f$ and $g$ change simultaneously, not independently of one another. The rule $$\lim_{x\to a} \bigl(f(x)g(x)\bigr) = \lim_{x\to a} f(x)\cdot \lim_{x\to a} g(x)$$ holds only when both limits on the right-hand side exist (and are finite), and it is a piece of magic that we take for granted. In your expression the factor $x$ tends to $\infty$, so the rule does not apply: the product has the indeterminate form $\infty\cdot 0$.
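Indeed, when one factor tends to $\infty$ and the other to $0$, the product can do anything, which is exactly why $\infty\cdot 0$ is called an indeterminate form (all limits as $x\to\infty$):
$$x\cdot\frac 1{x^2}\to 0,\qquad x\cdot\frac 1x\to 1,\qquad x\cdot\frac 2x\to 2,\qquad x\cdot\frac 1{\sqrt x}\to\infty.$$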