My solution to this problem is as follows:
Let $(\epsilon_n)$ be a sequence of numbers that converges to $0$, and let $c = \epsilon_n x$, where $x \in \mathbb{R}$. Then, using the limit definition of $e^x$, we can write $$\lim_{n \to \infty}\left(1 + \frac{c}{n}\right)^n = e^c.$$ As $n \to \infty$, $c \to 0$, so $e^c \to 1$. $$\therefore \lim_{n \to \infty}\left(1 + \frac{\epsilon_n x}{n}\right)^n = 1$$
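Whether or not the argument itself is rigorous, a quick numerical check is consistent with the claimed value of the limit. Here is a minimal sketch in Python, assuming the concrete (illustrative) choice $\epsilon_n = 1/n$ and $x = 3$:

```python
# Numerically evaluate (1 + eps_n * x / n)^n for growing n,
# with the illustrative choice eps_n = 1/n and x = 3.
x = 3.0
for n in [10, 1_000, 100_000, 10_000_000]:
    eps_n = 1.0 / n               # a sequence converging to 0
    val = (1.0 + eps_n * x / n) ** n
    print(n, val)
```

The printed values approach $1$ as $n$ grows, matching the claimed limit for this particular choice of $\epsilon_n$.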
Is this solution valid? I'm not sure whether my use of the limit definition of $e^x$ is correct here, since $c$ itself depends on $n$.