
Consider the sequence $a_n = \left( 1+ \frac{1}{n} \right)^{n + k}$, where $ 1\leq n \in \mathbb{N}$. Then the sequence is decreasing for $k \geq \frac{1}{2}$, and increasing for $k < \frac{1}{2}$.

How could one prove this? I tried considering the ratio $\frac{a_{n+1}}{a_n}$ and its position relative to $1$ by taking logarithms, but I haven't had much luck. I was able to prove that the sequence is decreasing for $k=\frac{1}{2}$ by using Bernoulli's inequality and a binomial approximation, but this method doesn't seem to generalise to arbitrary $k$.
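For what it's worth, here is a quick numerical sanity check of the claimed threshold $k = \frac{1}{2}$ (a minimal sketch, not a proof; the helper `log_a` and the sampled range of $n$ are just illustrative choices):

```python
import math

# Numerical sanity check (not a proof) of the claimed threshold k = 1/2.
# We compare log a_n = (n + k) * log(1 + 1/n) for consecutive n and report
# whether the sequence looks monotone over the sampled range.

def log_a(n: int, k: float) -> float:
    """log of a_n = (1 + 1/n)^(n + k), via log1p for better accuracy."""
    return (n + k) * math.log1p(1.0 / n)

for k in (0.0, 0.4, 0.5, 0.6, 1.0):
    diffs = [log_a(n + 1, k) - log_a(n, k) for n in range(1, 2000)]
    if all(d > 0 for d in diffs):
        verdict = "increasing on the sampled range"
    elif all(d < 0 for d in diffs):
        verdict = "decreasing on the sampled range"
    else:
        verdict = "not monotone on the sampled range"
    print(f"k = {k}: {verdict}")
```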

  • Seems that the derivatives could help, by considering $f(x) = (1+1/x)^{x+k}$. – xbh Nov 02 '18 at 14:50
  • A related question was asked yesterday. You might be able to modify the proof in the accepted answer here: https://math.stackexchange.com/questions/2979085/any-elementary-proof-of-the-monotonicity-of-a-n-1-frac1nn-frac1?noredirect=1&lq=1 – Umberto P. Nov 02 '18 at 14:51
  • I wasn't able to really generalise that proof. I still think it is based on derivatives, which reduces to $\log\left(1 + \frac{1}{x}\right) \geq \frac{\alpha + x}{x + x^2}$ or the reverse inequality, depending on $\alpha$. – Tanny Sieben Nov 03 '18 at 20:38
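
For reference, here is a sketch of the derivative reduction the comments point to (with $\alpha = k$ in the notation of the last comment); it only sets up the inequality to be proved and is not a complete argument. Writing

$$\ln a_n = (n+k)\ln\left(1+\frac{1}{n}\right), \qquad f(x) := (x+k)\ln\left(1+\frac{1}{x}\right),$$

one computes

$$f'(x) = \ln\left(1+\frac{1}{x}\right) - \frac{x+k}{x(x+1)},$$

so $(a_n)$ is decreasing once $\ln\left(1+\frac{1}{x}\right) \leq \frac{x+k}{x+x^2}$ holds for all $x \geq 1$, and increasing if the reverse inequality holds. Comparing the $\frac{1}{x^2}$ terms of the expansions $\ln\left(1+\frac{1}{x}\right) = \frac{1}{x} - \frac{1}{2x^2} + O\!\left(\frac{1}{x^3}\right)$ and $\frac{x+k}{x+x^2} = \frac{1}{x} - \frac{1-k}{x^2} + O\!\left(\frac{1}{x^3}\right)$ is what singles out $k = \frac{1}{2}$.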
