You can get a direct proof by noting that
$$
\sum_{1 \leq i < j \leq n} (x_i - x_j)^2 = (n-1) \sum_{1 \leq i \leq n} x_i^2 - 2 \sum_{1 \leq i < j \leq n} x_i x_j = n \sum_{1 \leq i \leq n} x_i^2 - \left(\sum_{1 \leq i \leq n} x_i\right)^2.
$$
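As a quick numerical sanity check (not part of the proof), one can verify the identity on random data; the helper names below are just illustrative:

```python
import random

def pairwise_square_sum(x):
    # sum over all pairs i < j of (x_i - x_j)^2
    n = len(x)
    return sum((x[i] - x[j]) ** 2 for i in range(n) for j in range(i + 1, n))

def identity_rhs(x):
    # n * sum_i x_i^2 - (sum_i x_i)^2
    n = len(x)
    return n * sum(v * v for v in x) - sum(x) ** 2

random.seed(0)
x = [random.uniform(-10, 10) for _ in range(7)]
assert abs(pairwise_square_sum(x) - identity_rhs(x)) < 1e-9
```

Since the left-hand side is a sum of squares, it is nonnegative, which is exactly the desired inequality.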
Another possibility is to give a probabilistic interpretation of user236182's proof. Let $X$ be a random variable ranging uniformly over the multiset $\{x_1,\ldots,x_n\}$. The variance of $X$ is
$$
V[X] = E[X^2] - E^2[X] = \frac{1}{n} \sum_{i=1}^n x_i^2 - \frac{1}{n^2} \left(\sum_{i=1}^n x_i \right)^2.
$$
Since $V[X] \geq 0$, multiplying through by $n^2$ gives the inequality.
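Again one can check numerically (illustrative sketch) that $n^2\,V[X]$ agrees with $n \sum_i x_i^2 - \left(\sum_i x_i\right)^2$ and is nonnegative:

```python
import random

def variance_uniform(x):
    # variance of a random variable uniform on the multiset x
    n = len(x)
    mean = sum(x) / n
    return sum(v * v for v in x) / n - mean ** 2

random.seed(1)
x = [random.uniform(-5, 5) for _ in range(6)]
n = len(x)
lhs = n * n * variance_uniform(x)
rhs = n * sum(v * v for v in x) - sum(x) ** 2
assert abs(lhs - rhs) < 1e-9
assert variance_uniform(x) >= 0  # nonnegative variance yields the inequality
```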
Yet another possibility is to minimize the right-hand side while keeping the left-hand side fixed. Suppose that $y + \delta \leq z - \delta$ for some $\delta \geq 0$. Then
$$
(y + \delta)^2 + (z - \delta)^2 = y^2 + z^2 + 2\delta (y - z + \delta) \leq y^2 + z^2,
$$
since $\delta \geq 0$ and $y - z + \delta \leq -\delta \leq 0$.
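A quick numeric check of this smoothing step (illustrative only): for $\delta \geq 0$ with $y + \delta \leq z - \delta$, moving the two values toward each other never increases the sum of squares.

```python
import random

random.seed(2)
for _ in range(1000):
    # pick y <= z and a delta that keeps y + delta <= z - delta
    y, z = sorted(random.uniform(-10, 10) for _ in range(2))
    delta = random.uniform(0, (z - y) / 2)
    assert (y + delta) ** 2 + (z - delta) ** 2 <= y * y + z * z + 1e-12
```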
Let $\mu$ be the average of $x_1,\ldots,x_n$. If the vector $x_1,\ldots,x_n$ is not constant, then we can always find some $i,j$ such that $x_i < \mu < x_j$. If $\mu \leq (x_i + x_j)/2$, then $x_i + (\mu - x_i) \leq x_j - (\mu - x_i)$, so applying the above with $y = x_i$, $z = x_j$, $\delta = \mu - x_i$ shows that we can replace $x_i,x_j$ by $\mu,x_i+x_j-\mu$ without changing the left-hand side and without increasing the right-hand side. The case $\mu > (x_i + x_j)/2$ is handled similarly, with $\delta = x_j - \mu$. Either way, we have changed the values of $x_1,\ldots,x_n$ in a way that keeps the left-hand side fixed, doesn't increase the right-hand side, and increases the number of $x_i$s equal to $\mu$.
Repeating this operation at most $n-1$ times, we reach the constant vector whose entries all equal $\mu$, for which the inequality trivially holds (both sides equal $(n\mu)^2$). Since our modifications never changed the left-hand side and never increased the right-hand side, the right-hand side of the original inequality is at least $(n\mu)^2$, which is exactly its left-hand side, and the inequality follows.
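The smoothing procedure above can be sketched in code (one possible implementation, with illustrative names; the tolerance `eps` handles floating-point equality):

```python
def smooth_to_constant(x, eps=1e-12):
    """Repeatedly apply the pair-smoothing step until the vector is constant.

    Each step preserves the sum and never increases the sum of squares.
    """
    x = list(x)
    n = len(x)
    mu = sum(x) / n
    while True:
        lo = [i for i, v in enumerate(x) if v < mu - eps]
        hi = [j for j, v in enumerate(x) if v > mu + eps]
        if not lo or not hi:
            break  # vector is constant (all entries equal mu)
        i, j = lo[0], hi[0]
        if mu <= (x[i] + x[j]) / 2:
            x[i], x[j] = mu, x[i] + x[j] - mu  # x_i moves up to mu
        else:
            x[i], x[j] = x[i] + x[j] - mu, mu  # x_j moves down to mu
    return x

# demo: sum is preserved, sum of squares doesn't increase, result is constant
x0 = [1.0, 2.0, 6.0]
y = smooth_to_constant(x0)
assert abs(sum(y) - sum(x0)) < 1e-9
assert sum(v * v for v in y) <= sum(v * v for v in x0) + 1e-9
```

Each iteration fixes at least one more entry at $\mu$, so the loop runs at most $n-1$ times, matching the argument above.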