Filling in some details omitted from the comments by @Michael Hardy, @zyx, and
myself: suppose $\vec{X} = (X_1, \ldots, X_N)$, where the $X_i$, $1 \leq i \leq N$, are independent $N(\mu, v)$ random variables with known mean $\mu$ and unknown variance $v$. The joint density function is
$$f_{\vec{X}}(\vec{x}) = (2\pi v)^{-N/2}\exp(-a/v) ~\text{where}~ \vec{x} = (x_1, \ldots, x_N)~ \text{and}~ a = \frac{1}{2} \sum_{i=1}^N (x_i - \mu)^2.$$
If $\vec{X}$ is observed to have value $\vec{x}$, the likelihood function is
$L(v) = (2\pi v)^{-N/2}\exp(-a/v)$, with log-likelihood
$\ln L(v) = -\frac{N}{2}\ln(2\pi v) - \frac{a}{v}$. Setting
$\frac{d}{dv}\ln L(v) = -\frac{N}{2v} + \frac{a}{v^2}$ equal to zero shows that $L(v)$
attains its maximum value at
$$v = \frac{2a}{N} = \frac{1}{N}\sum_{i=1}^N (x_i - \mu)^2,$$
and so the maximum likelihood estimator (MLE) of $v$ is $\frac{1}{N}\sum_{i=1}^N (X_i - \mu)^2$. We have
$$
E\left [ \frac{1}{N}\sum_{i=1}^N (X_i - \mu)^2 \right ]
= \frac{1}{N}\sum_{i=1}^N E[(X_i - \mu)^2] = \frac{1}{N}\sum_{i=1}^N v = v,
$$
and thus, contrary to whatever the unspecified textbook in Nikos's possession may claim, the MLE for $v$ is unbiased in this instance.
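For anyone who wants a numerical sanity check, here is a minimal Monte Carlo sketch in numpy (the values of $\mu$, $v$, $N$, and the trial count are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, v, N, trials = 5.0, 4.0, 10, 200_000  # arbitrary illustrative values

# Draw `trials` independent samples of size N from N(mu, v) and compute
# the known-mean MLE (1/N) * sum((x_i - mu)^2) for each sample.
samples = rng.normal(mu, np.sqrt(v), size=(trials, N))
mle_known_mean = np.mean((samples - mu) ** 2, axis=1)

# Averaging over many trials estimates E[MLE]; it should land close to v = 4.0.
print(np.mean(mle_known_mean))
```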
What if $\mu$ is also unknown? The likelihood function is now
$$
L(\mu, v)
= (2\pi v)^{-N/2}\exp\left [ -\frac{1}{2v}\sum_{i=1}^N (x_i - \mu)^2\right ]
$$
and has a global maximum at
$$
(\mu, v) = \left (\frac{1}{N}\sum_{i=1}^N x_i,\;
\frac{1}{N}\sum_{i=1}^N \left (x_i -\frac{1}{N}\sum_{j=1}^N x_j \right)^2 \right )
= \left ( \bar{x},\; \frac{1}{N}\sum_{i=1}^N (x_i -\bar{x})^2 \right ).$$
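(To verify this, set the partial derivatives of the log-likelihood to zero:
$$\frac{\partial}{\partial \mu}\ln L(\mu, v) = \frac{1}{v}\sum_{i=1}^N (x_i - \mu) = 0 \implies \mu = \frac{1}{N}\sum_{i=1}^N x_i = \bar{x},$$
$$\frac{\partial}{\partial v}\ln L(\mu, v) = -\frac{N}{2v} + \frac{1}{2v^2}\sum_{i=1}^N (x_i - \mu)^2 = 0 \implies v = \frac{1}{N}\sum_{i=1}^N (x_i - \bar{x})^2,$$
and check that this stationary point is indeed the global maximizer.)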
The MLE for $v$ is thus $\frac{1}{N}\sum_{i=1}^N (X_i -\bar{X})^2$, where $\bar{X} = \frac{1}{N}\sum_{i=1}^N X_i$,
and it is biased, since
$$
E\left [ \frac{1}{N}\sum_{i=1}^N \left (X_i -\bar{X} \right)^2 \right ] = \left(\frac{N-1}{N}\right)v.
$$
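One quick way to see where the factor $\frac{N-1}{N}$ comes from is the decomposition
$$\sum_{i=1}^N (X_i - \bar{X})^2 = \sum_{i=1}^N (X_i - \mu)^2 - N(\bar{X} - \mu)^2:$$
since $E[(X_i - \mu)^2] = v$ and $E[(\bar{X} - \mu)^2] = \operatorname{Var}(\bar{X}) = v/N$, the right-hand side has expectation $Nv - v = (N-1)v$.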
As noted by Nikos's textbook, however, the MLE for $v$ is asymptotically unbiased: its bias equals $\left(\frac{N-1}{N}\right)v - v = -\frac{v}{N}$, which vanishes as $N \to \infty$. On the other hand, it should be clear from the above that
$\frac{1}{N-1}\sum_{i=1}^N (X_i -\bar{X})^2$, the usual sample variance with Bessel's
correction, is an unbiased estimator of $v$ for all $N \geq 2$.
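The same kind of Monte Carlo sketch (again with arbitrary illustrative parameters) makes both the bias and the effect of the $N-1$ correction visible:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, v, N, trials = 5.0, 4.0, 10, 200_000  # arbitrary illustrative values

samples = rng.normal(mu, np.sqrt(v), size=(trials, N))
xbar = samples.mean(axis=1, keepdims=True)  # per-sample mean, shaped for broadcasting
ss = np.sum((samples - xbar) ** 2, axis=1)  # sum of squared deviations per sample

# The MLE divides by N: its average should sit near ((N-1)/N) * v = 3.6, not v.
mle = ss / N
# Dividing by N - 1 instead (Bessel's correction) gives an average near v = 4.0.
unbiased = ss / (N - 1)

print(np.mean(mle), np.mean(unbiased))
```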