Let $(x_n)$ be a sequence in $\mathbb{R}$. Show that if $\lim_{n \to \infty} x_n = 0$, it follows that $\lim_{n \to \infty} \left(1+\frac{x_n}{n}\right)^n = 1$.
My idea looks like the following (using the binomial theorem): $$\left(1+\frac{x_n}{n}\right)^n = \sum_{k=0}^{n} \binom{n}{k} \left(\frac{x_n}{n}\right)^k = \sum_{k=0}^{n} \frac{n!}{k!\,(n-k)!} \left(\frac{x_n}{n}\right)^k = 1 + \sum_{k=1}^{n} \frac{n \cdot (n-1) \cdots (n-k+1)}{k!} \left(\frac{x_n}{n}\right)^k$$
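As a quick numerical sanity check (just an illustration, not part of a proof), I tried a sample null sequence such as $x_n = 1/\sqrt{n}$, and the values of $\left(1+\frac{x_n}{n}\right)^n$ do seem to drift toward $1$:

```python
import math

# Sanity check only: pick one particular sequence x_n -> 0
# (here x_n = 1/sqrt(n), an arbitrary choice) and evaluate (1 + x_n/n)^n.
for n in [10, 100, 1000, 10**6]:
    x_n = 1 / math.sqrt(n)           # sample null sequence, not from the problem
    value = (1 + x_n / n) ** n
    print(n, value)
```

The printed values get closer and closer to $1$ as $n$ grows, so the claim looks plausible; I just don't see how to turn the expansion above into an actual proof.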
How do I proceed from here? Am I somehow supposed to split the sum up and then take the limit? Can someone help me out? Thanks in advance!