Given a sequence $\{\frac{(-1)^n}{n}\}$ show directly from the definition that it converges to $0$.
Definition of convergence of a sequence is:
A sequence $\{p_n\}$ converges to $p$ if for every $\epsilon>0$ there exists an $N\in\mathbb N$ so that $n\geq N \implies d(p_n,p)<\epsilon$.
My approach
What we want to show is:
For every $\epsilon>0$ there exists an $N\in\mathbb N$ so that $$n\geq N \implies d(\frac{(-1)^n}{n},0)<\epsilon$$
After this step, the solution I have stops making sense to me:
Take $N$ so large that $N>\frac2\epsilon$. (I have no idea why $2$ is chosen, or why this works.) Then $n\geq N \implies \left| \frac{(-1)^n}n \right| < \frac2n \leq \frac2N < \epsilon$. (I have no idea why this chain is true or how it logically progresses.)
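Here is one way to unpack the quoted chain, step by step (my reading of the argument, not the manual's wording):

```latex
\[
  \left|\frac{(-1)^n}{n}\right| = \frac{1}{n} < \frac{2}{n} \le \frac{2}{N} < \epsilon .
\]
% Step 1: |(-1)^n| = 1, so the absolute value is exactly 1/n.
% Step 2: 1/n < 2/n holds for every n >= 1 (same denominator, larger numerator).
% Step 3: 2/n <= 2/N because n >= N, and dividing by a larger number gives a smaller value.
% Step 4: 2/N < epsilon is exactly the assumption N > 2/epsilon, rearranged;
%         such an N exists by the Archimedean property of the reals.
```

So the $2$ plays no special role; it is just slack built into the estimate, and the inequality $N>\frac2\epsilon$ is chosen precisely so the last step closes the chain below $\epsilon$.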
Edit:
It could be $2/n$ because $\left| \frac{(-1)^n}n \right| = \left| \frac{1}n \right| = \frac1n$, and any constant larger than $1$ gives a strict inequality $\frac1n < \frac2n$; but the constant $2$ itself is not needed.
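In fact the factor $2$ can be dropped entirely. A cleaner version of the same proof, using the definition as stated above, is:

```latex
Fix $\epsilon>0$. By the Archimedean property choose $N\in\mathbb N$ with
$N>\frac1\epsilon$. Then for all $n\ge N$,
\[
  d\!\left(\frac{(-1)^n}{n},\,0\right)
  = \left|\frac{(-1)^n}{n}\right|
  = \frac1n \le \frac1N < \epsilon .
\]
% Since epsilon > 0 was arbitrary, this is exactly the definition of
% convergence, so (-1)^n / n -> 0.
```

The version with $2$ is the same argument with a non-sharp bound; authors often insert such slack so the inequalities are visibly strict at every step.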