For a function $f(x)$ defined on an interval $X$, $f(x)$ is uniformly continuous on $X$ if and only if for all sequences $x_{n}, y_{n}\in X$ with $\lim_{n\rightarrow\infty}(x_{n}-y_{n})=0$, we have $\lim_{n\rightarrow\infty}[f(x_{n})-f(y_{n})]=0$.
I am confused about three points concerning this characterization.
My textbook takes $f(x)=x^2$, defined on $[0,+\infty)$, and chooses $x_{n}=\sqrt{n+1}$, $y_{n}=\sqrt{n}$ (to show that $f$ is not uniformly continuous there). I wonder why one cannot choose more obvious sequences, such as $x_{n}=1/n$ and $y_{n}=1/(2n)$.
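If I follow the textbook correctly, the computation behind that particular choice is

$$x_{n}-y_{n}=\sqrt{n+1}-\sqrt{n}=\frac{1}{\sqrt{n+1}+\sqrt{n}}\longrightarrow 0,\qquad f(x_{n})-f(y_{n})=(n+1)-n=1\not\longrightarrow 0.$$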
My own guess is that this is because $1/n$ and $1/(2n)$ are not defined at zero.
For the function $f(x)=1/x$, defined on $(\xi,1)$ with $0\lt\xi\lt1$, I wonder why it is unwarranted to use $x_{n}=1/n$ and $y_{n}=1/(2n)$.
The way I see it, we need to examine $\lim_{n\rightarrow\infty}[f(x_{n})-f(y_{n})]$, but if $x_{n}$ must lie in $(\xi,1)$ then $n$ has to stay in $(1,1/\xi)$, so $n$ cannot tend to $\infty$.
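In symbols, my reasoning for that restriction on $n$ is

$$\xi<\frac{1}{n}<1\iff 1<n<\frac{1}{\xi},$$

so only finitely many indices $n$ would be allowed.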
I noticed that this characterization quantifies over all sequences $x_{n}$, $y_{n}$. What if we want to prove uniform continuity of a particular function? How can we possibly check every suitable pair of sequences?
Thanks.