It is possible that you are supposed to just calculate, on the assumption that the limit exists. That this is a dangerous thing to do can be illustrated as follows. Let
$$x_{n+1}=x_n^2-2, \qquad x_0=17.$$
It is obvious that the limit of the $x_n$ doesn't exist: the $x_n$ climb very rapidly.
However, suppose we are not paying attention, and let the limit be $a$. Then, letting $n\to\infty$ on both sides of the recurrence, we get $a=a^2-2$, that is, $a^2-a-2=0$, that is, $(a+1)(a-2)=0$. We perhaps reject the root $a=-1$, and come to the horrendously wrong conclusion that the limit is $2$.
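A few iterations on a machine make vivid just how badly the blind calculation fails. Here is a small Python sketch (the script and its variable names are mine; only the recurrence and $x_0=17$ come from above):

```python
# Iterate x_{n+1} = x_n^2 - 2 starting from x_0 = 17: the values explode,
# so no limit exists and the "algebraic" answer 2 is meaningless.
x = 17.0
for n in range(5):
    x = x * x - 2
    print(n + 1, x)
# 1 287.0
# 2 82367.0
# 3 6784322687.0
# ...
```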
We sketch two proofs that everything behaves nicely for our recurrence. The first proof is much more informative, but also more difficult. Assume we start with positive $x_0$ (a minor modification takes care of negative $x_0$).
First proof: Note first that by standard calculus techniques you can show that $\frac{x}{2}+\frac{1}{x}$ reaches, for positive values of $x$, a minimum value of $\sqrt{2}$ at $x=\sqrt{2}$. So apart possibly from $x_0$, all values of $x_n$ will be $\ge \sqrt{2}$.
Now look at $x_{n+1}-\sqrt{2}$. We have
$$x_{n+1}-\sqrt{2}=\frac{x_n}{2}+\frac{1}{x_n}-\sqrt{2}=\frac{x_n^2-2\sqrt{2}x_n+2}{2x_n}=\frac{(x_n-\sqrt{2})^2}{2x_n}.$$
You can see that if $x_n \gt \sqrt{2}$, which happens automatically for every $n\ge 1$ (unless some $x_n$ is exactly $\sqrt{2}$, in which case all later terms equal $\sqrt{2}$ and we are done), and if $x_n-\sqrt{2}\lt 1$, then $x_{n+1}-\sqrt{2}$ is smaller than $\frac{x_n-\sqrt{2}}{2\sqrt{2}}$. Indeed very soon it is much smaller, and $x_n$ gets sucked into $\sqrt{2}$ with extreme rapidity.
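Here is a small Python sketch of that rapidity (the script and the starting value $x_0=3$ are my choices, not part of the argument); the last column is the value $\frac{(x_n-\sqrt{2})^2}{2x_n}$ that the identity above predicts for the next error, and it matches the actual next error up to rounding:

```python
from math import sqrt

# One Newton step per loop iteration; the error is squared (and divided by
# 2*x_n) each time, so the number of correct digits roughly doubles.
x = 3.0
for n in range(5):
    err = x - sqrt(2)
    predicted_next_err = err * err / (2 * x)  # exactly the identity above
    x = x / 2 + 1 / x                         # x_{n+1} = x_n/2 + 1/x_n
    print(n + 1, x, x - sqrt(2), predicted_next_err)
```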
It remains to examine what happens when our second estimate $x_1$ is large. This can come about if we choose $x_0$ to be ridiculously small, like $1/1000$, or ridiculously large, like $x_0=2000$.
For this part we proceed informally. Note that if $x_n-\sqrt{2}\gt 1$, then $x_n\gt 2$, so $\frac{x_n}{2}+\frac{1}{x_n}\lt \frac{x_n}{2}+\frac{1}{2}$. Thus if $x_n$ is big, then $x_{n+1}$ is about $\frac{1}{2}x_n$. This says that if by poor choice of initial estimate we get $x_1$ large, the estimates will shrink by roughly a factor of $2$ with each iteration. And then, as observed before, once the distance to $\sqrt{2}$ becomes less than $1$, convergence is extremely rapid.
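Again informally, here is a Python sketch of that halving phase (the script is mine; $x_0=2000$ is the "ridiculously large" choice mentioned above): while $x_n$ is large the term $\frac{1}{x_n}$ is negligible, so each step is essentially a halving, and once the error drops below $1$ the fast phase from the first part takes over.

```python
from math import sqrt

# Roughly a factor-of-2 shrink per step while x_n is large, then the
# quadratic phase finishes the job in a couple of iterations.
x = 2000.0
for n in range(15):
    x = x / 2 + 1 / x
    print(n + 1, x, x - sqrt(2))
```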
Second proof: Let $f(x)=\frac{x}{2}+\frac{1}{x}$. Note that $x_{n+1}=f(x_n)$ and that $f(\sqrt{2})=\sqrt{2}$. By the Mean Value Theorem, we have
$$\frac{f(x_n)-\sqrt{2}}{x_n-\sqrt{2}}=f'(c)$$
for some $c$ between $\sqrt{2}$ and $x_n$. But $f'(x)=\frac{1}{2}-\frac{1}{x^2}$. In particular, if $x_n\gt \sqrt{2}$ (which it is after the first iteration, see the first proof), then $c\gt\sqrt{2}$, so $f'(c)$ is strictly between $0$ and $1/2$.
It follows that
$$|x_{n+1}-\sqrt{2}|\lt \frac{1}{2}|x_n-\sqrt{2}|,$$
which implies convergence to $\sqrt{2}$, with the error at least halved (so the accuracy at least doubled) at each iteration after the first.
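A quick Python check of this contraction (the script and the starting value $x_0=5$ are my choices): the ratio of successive errors stays below $\frac{1}{2}$, as the Mean Value Theorem bound promises.

```python
from math import sqrt

# Ratio |x_{n+1} - sqrt(2)| / |x_n - sqrt(2)|; it should stay below 1/2
# whenever x_n > sqrt(2), and in fact it shrinks far below 1/2 very quickly.
x = 5.0
for n in range(6):
    err = abs(x - sqrt(2))
    x = x / 2 + 1 / x
    print(n + 1, abs(x - sqrt(2)) / err)
```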
Remark: Take an initial estimate such as $x_0=1.5$ and fool around with your calculator. You will find that $x_3$ is equal to $\sqrt{2}$ to the full display accuracy of your calculator.
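If a calculator is not at hand, here is the same experiment in Python (a sketch; the digits shown depend on double precision rather than on a calculator display, but the point is the same):

```python
from math import sqrt

# Starting from 1.5, the third iterate already agrees with sqrt(2) to about
# twelve significant digits, more than a typical calculator displays.
x = 1.5
for n in range(1, 4):
    x = x / 2 + 1 / x
    print(n, x, x - sqrt(2))
```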
When Newton's Method is good, it is very very good.