Consider the sequence $a_n = \left( 1+ \frac{1}{n} \right)^{n + k}$, where $n \in \mathbb{N}$, $n \geq 1$. Then the sequence is decreasing for $k \geq \frac{1}{2}$, and increasing for $k < \frac{1}{2}$ (at least for all sufficiently large $n$; when $k$ is close to $\frac{1}{2}$ the first few terms can still decrease).
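For context, here is the heuristic that singles out $k=\frac{1}{2}$ (an asymptotic indication only, not a proof of monotonicity for every $n$): expanding the logarithm for large $n$,
$$\ln a_n = (n+k)\ln\left(1+\frac{1}{n}\right) = (n+k)\left(\frac{1}{n}-\frac{1}{2n^2}+O\left(\frac{1}{n^3}\right)\right) = 1+\frac{k-\frac{1}{2}}{n}+O\left(\frac{1}{n^2}\right),$$
so the first-order correction term changes sign exactly at $k=\frac{1}{2}$.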
How could one prove this? I tried considering the ratio $\frac{a_{n+1}}{a_n}$ and its position relative to $1$ by taking logarithms, but I haven't had much luck (see the setup below). I was able to prove that the sequence is decreasing for $k=\frac{1}{2}$ by using Bernoulli's inequality and a binomial approximation, but this method doesn't seem to generalise to arbitrary $k$.
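For reference, the comparison I was attempting reduces to determining the sign of
$$\ln a_{n+1}-\ln a_n=(n+1+k)\ln\frac{n+2}{n+1}-(n+k)\ln\frac{n+1}{n},$$
or, passing to the function $f(x)=(x+k)\ln\left(1+\frac{1}{x}\right)$ on $[1,\infty)$, the sign of
$$f'(x)=\ln\left(1+\frac{1}{x}\right)-\frac{x+k}{x(x+1)},$$
which I was unable to control for arbitrary $k$.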