I am trying to identify exactly what the flaw is when reasoning about a limit such as the definition of $e$:
$$ \lim_{n\rightarrow \infty}\left(1+\frac{1}{n}\right)^n={e} $$
Now, I know there are ways of proving this limit, such as comparing the binomial expansion of $\left(1+\frac{1}{n}\right)^n$ with the Maclaurin series of $e$. To be clear, I am not looking for a proof of the limit definition of $e$.
I tried searching for "limit laws/rules", but none of the rules I found covered the case above. Hence, I am looking for a specific rule (or perhaps a certain perspective) that will help me see why the expression above does not in fact evaluate to $1$.
My train of thought is as follows: at first glance, if I weren't already familiar with what the limit evaluates to, I would probably evaluate the expression inside the brackets first and then apply the limit to the power. That is, $$\lim_{n\to\infty}\frac{1}{n}=0,\quad\text{and then}$$ $$\lim_{n\rightarrow \infty}\left(1+0\right)^n=\lim_{n\to\infty}1^n=1.$$
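For what it's worth, a quick numerical check (a sketch in Python; the choice of sample values of $n$ is my own) shows the expression climbing toward $e\approx 2.718$ rather than settling at $1$, so something in the step-by-step evaluation above must be off:

```python
import math

# Evaluate (1 + 1/n)^n for increasingly large n.
for n in [10, 100, 10_000, 1_000_000]:
    value = (1 + 1 / n) ** n
    print(f"n = {n:>9}: (1 + 1/n)^n = {value:.6f}")

# The values approach math.e (~2.718282), not 1:
# as n grows, the base tends to 1 but the exponent grows,
# and the two effects balance out at e.
```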
Why is my reasoning flawed?