I found this problem in the mean value theorem section of a real analysis book.
I did not know how to use the mean value theorem, but I tried to find $f'$ and I found
$$\frac{1}{(x+1)^{2}}<\frac{1}{1+x}<1\qquad\text{for }x>0.$$
(This chain holds because $(x+1)^{2}>1+x>1$ when $x>0$, and taking reciprocals reverses the inequalities.)
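Written out, these three quantities are exactly the derivatives of $\frac{x}{1+x}$, $\ln(1+x)$, and $x$ (presumably the three parts of the inequality in question):
$$\frac{d}{dx}\,\frac{x}{1+x}=\frac{(1+x)-x}{(1+x)^{2}}=\frac{1}{(x+1)^{2}},\qquad
\frac{d}{dx}\,\ln(1+x)=\frac{1}{1+x},\qquad
\frac{d}{dx}\,x=1.$$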
Clearly, $\frac{1}{(x+1)^{2}}$ decreases faster than $\frac{1}{1+x}$ as $x$ grows larger.
What should I do next if I want to start from this, or is there another way to prove it?
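For reference, the standard statement of the mean value theorem that I am trying to apply: if $f$ is continuous on $[a,b]$ and differentiable on $(a,b)$, then there exists $\xi\in(a,b)$ such that
$$f(b)-f(a)=f'(\xi)(b-a).$$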