Let $n\ge 2$ and $1\leq k<n$. Show or disprove that $(1-k/n)^n < e^{-k} < (1-k/n)^{n-1}$ for all large enough n.
The first inequality is true for every $n > k$. According to the proofs below, the second inequality actually fails for all $k \ge 2$ (indeed for every $n > k$, not just large $n$), though it does hold when $k = 1$.
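Before the proofs, a quick numerical sanity check (illustrative only, not part of the argument; the helper name `check` is mine) comparing both sides for a range of $k$ and $n$:

```python
import math

def check(k, n):
    # Returns (first, second) as booleans:
    #   first:  (1 - k/n)^n     < e^{-k}
    #   second: e^{-k}          < (1 - k/n)^{n-1}
    lower = (1 - k / n) ** n
    upper = (1 - k / n) ** (n - 1)
    return lower < math.exp(-k), math.exp(-k) < upper

# First inequality holds for every (k, n) tried.
assert all(check(k, n)[0] for k in range(1, 10) for n in range(k + 1, 200))
# Second inequality holds for k = 1 ...
assert all(check(1, n)[1] for n in range(2, 200))
# ... but fails for every k >= 2, even for large n.
assert not any(check(k, n)[1] for k in range(2, 10) for n in range(k + 1, 200))
```

This matches the claim above: the second inequality is special to $k = 1$.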
Method 1)
Taking logarithms and multiplying by $-1$, the second inequality becomes $k > -(n-1)\ln(1-k/n)$. Using the Taylor expansion $-\ln(1-k/n) = \sum_{i=1}^\infty \dfrac{(k/n)^i}{i}$ (valid since $k<n$), we show instead that the reverse holds for $k\ge 2$: $k < (n-1)\sum_{i=1}^\infty \dfrac{(k/n)^i}{i}$. Indeed, $\dfrac{k}{n-1} = \dfrac{(k/n)}{1-1/n} = \sum_{i=1}^\infty \dfrac{k}{n^i}$, and comparing term by term, $\dfrac{k}{n^i} \le \dfrac{(k/n)^i}{i} = \dfrac{k^i}{i\,n^i}$ amounts to $i \le k^{i-1}$, which holds for every $i\ge 1$ when $k\ge 2$ (strictly for $i\ge 3$). Hence $\dfrac{k}{n-1} < -\ln(1-k/n)$, i.e. $e^{-k} > (1-k/n)^{n-1}$ whenever $k\ge 2$.
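The key comparison in Method 1, $k/(n-1)$ versus $-\ln(1-k/n) = \sum_{i\ge 1}(k/n)^i/i$, can be probed numerically (an illustrative sketch, not a proof):

```python
import math

# Method 1 claims k/(n-1) < -ln(1 - k/n) whenever k >= 2,
# which is the reverse of the second inequality.
for n in range(3, 100):
    for k in range(2, n):
        assert k / (n - 1) < -math.log(1 - k / n)

# For k = 1 the comparison reverses: 1/(n-1) > -ln(1 - 1/n),
# which is why the second inequality does hold for k = 1.
for n in range(2, 100):
    assert 1 / (n - 1) > -math.log(1 - 1 / n)
```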
Method 2)
We have $k/n < \sum_{i=1}^\infty \dfrac{(k/n)^i}{i} = -\ln(1-k/n)$, since $k/n$ is the $i=1$ term and every term with $i\ge 2$ is positive. Multiplying by $n$ gives $k < -n\ln(1-k/n)$, i.e. $(1-k/n)^n < e^{-k}$: this proves the first inequality.
To disprove the second inequality, we show that for real $x>k$ (with $k\ge 2$), $k/(x-1) < -\ln (1-k/x)$, equivalently $f(x) := k/(x-1) + \ln (1-k/x) < 0$. We compute
$$f'(x) = -\frac{k}{(x-1)^2} + \frac{k}{x^2-kx} = -k\cdot\frac{(x^2-kx)-(x^2-2x+1)}{(x^2-kx)(x-1)^2} = \frac{k\bigl((k-2)x+1\bigr)}{(x^2-kx)(x-1)^2},$$
which is positive for all $x>k$ when $k\ge 2$. Moreover $f(x)\to 0$ as $x\to \infty$ and $f(x)\to -\infty$ as $x\to k^+$. Since $f$ is strictly increasing on $(k,\infty)$ and tends to $0$ at infinity, $f(x)<0$ for all $x>k$; hence $e^{-k} > (1-k/x)^{x-1}$ for every $k\ge 2$. (For $k=1$ the numerator $(k-2)x+1 = 1-x$ is negative for $x>1$, so $f$ is strictly decreasing to its limit $0$ and thus $f(x)>0$: the second inequality does hold when $k=1$.)
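The sign behavior of $f$ claimed in Method 2 can be spot-checked numerically (illustrative; the function name `f` mirrors the text):

```python
import math

# f(x) = k/(x-1) + ln(1 - k/x), defined for x > k, as in Method 2.
def f(k, x):
    return k / (x - 1) + math.log(1 - k / x)

# For k >= 2, f is negative on (k, infinity): the second inequality fails.
assert all(f(k, x) < 0 for k in (2, 3, 5) for x in (k + 1, 10, 100, 1000))
# For k = 1, f stays positive: the second inequality holds.
assert all(f(1, x) > 0 for x in (2, 10, 100, 1000))
```

Note that for $k=2$ the gap is tiny ($f(x) \approx -\tfrac{2}{3}x^{-3}$), consistent with $f$ increasing to $0$.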