Suppose we have some numbers $x_1, x_2, \ldots, x_n$. By shuffling them around we can assume that they are ordered such that $x_1 \leq x_2 \leq \ldots \leq x_n$.
The mean of these numbers is given by
$$
\bar{x} = \frac{1}{n} \sum_{i=1}^n x_i.
$$
It is easy to prove that $\bar{x}$ lies between the smallest number $x_1$ and the largest $x_n$ since
$$
x_1 = \frac{1}{n} \sum_{i=1}^n x_1 \leq \frac{1}{n}\sum_{i=1}^n x_i \leq \frac{1}{n}\sum_{i=1}^n x_n = x_n.
$$
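If you want a quick numerical sanity check of this sandwich inequality (a small Python sketch, not part of the proof; the sample numbers are made up):

```python
# Check that the mean of a sample lies between its min and max.
xs = [3.0, 1.5, 4.0, 1.5, 5.0]
mean = sum(xs) / len(xs)

assert min(xs) <= mean <= max(xs)
print(min(xs), mean, max(xs))  # 1.5 3.0 5.0
```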
As you mention, $\bar{x}$ is close to all of the individual points in a certain sense. The precise sense in which this is true is that $\bar{x}$ minimizes the expression
$$
\sum_{i=1}^n (x_i - \bar{x})^2.
$$
One way to prove this is to note that this expression is a quadratic in the variable $\bar{x}$ with positive leading coefficient $n$, so its unique minimum is at the point where its derivative is zero. Setting the derivative to zero gives
$$
-2\sum_{i=1}^n x_i+2n\bar{x}=0,
$$
so that we indeed find that
$$
\bar{x} = \frac{1}{n} \sum_{i=1}^n x_i.
$$
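The same conclusion can be checked numerically: scanning candidate centres over a grid, the sum of squared differences is smallest exactly at the mean. This is an illustrative Python sketch with made-up data, not a proof:

```python
# The mean should minimize the sum of squared differences.
xs = [1.0, 2.0, 4.0, 7.0]
mean = sum(xs) / len(xs)  # 3.5

def sq_loss(c, xs):
    """Sum of squared differences from a candidate centre c."""
    return sum((x - c) ** 2 for x in xs)

# Grid search over candidate centres in [0, 8]; the mean wins.
candidates = [i / 100 for i in range(0, 801)]
best = min(candidates, key=lambda c: sq_loss(c, xs))
print(mean, best)  # 3.5 3.5
```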
Edit: In response to your comment: indeed, $\bar{x}$ minimizes the squared differences $\sum_{i=1}^n (x_i - \bar{x})^2$ rather than the absolute differences $\sum_{i=1}^n |x_i - \bar{x}|$, which you might have initially expected. The minimizer of $\sum_{i=1}^n |x_i - \bar{x}|$ turns out to be the median, which you have probably heard of. After some searching, I found this answer, which gives a precise argument that the median minimizes this expression. Note, however, that the median is often non-unique.
For example, in the case of two numbers $x_1 < x_2$, any $\bar{x} \in [x_1,x_2]$ minimizes $\sum_{i=1}^2 |x_i - \bar{x}|$. The fact that the squared differences always have a unique, easily expressible minimizer is convenient, but which measure is appropriate depends on the application.
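The non-uniqueness in the two-point case is easy to see numerically as well (an illustrative Python sketch with made-up numbers):

```python
# With two points, every centre between them gives the same
# minimal sum of absolute differences, namely x2 - x1.
xs = [1.0, 5.0]

def abs_loss(c, xs):
    """Sum of absolute differences from a candidate centre c."""
    return sum(abs(x - c) for x in xs)

# Any c in [1, 5] attains the minimal loss of 4.
for c in [1.0, 2.5, 3.0, 4.2, 5.0]:
    print(c, abs_loss(c, xs))  # loss is 4.0 for each

# Outside the interval the loss is strictly larger.
print(abs_loss(0.0, xs), abs_loss(6.0, xs))  # 6.0 6.0
```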
It might also be helpful to know that in the context of machine learning, the sum of the squared differences is called the $\ell^2$-loss and the sum of the absolute differences the $\ell^1$-loss.