Let us assume that $s_1,\dots,s_n$ are some given non-negative integers. Let $M=\max\{s_1,\dots,s_n\}$ and for any $k$ let us denote $c_k=|\{i=1,\dots,n; s_i\ge k\}|$. (In other words, $c_k$ is the number of the given integers which are at least $k$.) Then $$\sum\limits_{i=1}^n \sum\limits_{j=1}^n \min\{s_i,s_j\}=\sum_{k=1}^M c_k^2.\tag{*}$$ (Notice that $c_0=n$; however, the term $c_0^2$ is not included on the RHS.)
How can we prove this?
This equality seems to be rather elementary (and I have included a proof by induction below). Still, I suppose there are many interesting ways to prove it, so it seems reasonable to ask here in order to collect various approaches to this sum and various ways to formulate the arguments for why this equality holds.
Motivation. This expression appears naturally in some considerations about commuting matrices. See, for example, Robert Israel's answer to Dimension of a subspace of $M_n(\mathbb C)$. (The sum here is the formula he mentions in the last paragraph, in the case when there is only a single eigenvalue.)
Example. Just to make sure that we are not trying to prove something which does not hold, let's check at least one example. (After all, having some examples in front of us might also be useful when thinking about proofs.)
Let us try the numbers $1$, $1$, $2$, $3$, $3$, i.e., $s_1=s_2=1$, $s_3=2$, $s_4=s_5=3$.
We can write down the following table, where we have $\min\{s_i,s_j\}$ in the position $(i,j)$ and the last row/column contain the sums. $$ \begin{array}{ccccc|c} 1 & 1 & 1 & 1 & 1 & 5 \\ 1 & 1 & 1 & 1 & 1 & 5 \\ 1 & 1 & 2 & 2 & 2 & 8 \\ 1 & 1 & 2 & 3 & 3 &10 \\ 1 & 1 & 2 & 3 & 3 &10 \\\hline 5 & 5 & 8 &10 &10 &38 \end{array} $$ At the same time we get $c_1^2+c_2^2+c_3^2=5^2+3^2+2^2=25+9+4=38$.
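If one wants to double-check $(*)$ on further examples, a small script does the job. Here is a minimal Python sketch (only a sanity check, not a proof; the helper names `lhs` and `rhs` are just labels chosen here) which evaluates both sides; for the numbers above it prints $38$ twice, matching the table.

```python
# Numerical check of (*): lhs(s) is the double sum of minima,
# rhs(s) is the sum of c_k^2 for k = 1, ..., M.

def lhs(s):
    return sum(min(a, b) for a in s for b in s)

def rhs(s):
    M = max(s, default=0)
    return sum(sum(1 for x in s if x >= k) ** 2 for k in range(1, M + 1))

s = [1, 1, 2, 3, 3]
print(lhs(s), rhs(s))  # prints: 38 38
```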
A special case. In particular, if we take $s_i=i$, then $c_k=n-k+1$ and we get the sum $$\sum\limits_{i=1}^n \sum\limits_{j=1}^n \min\{i,j\}=\sum_{k=1}^n (n-k+1)^2 = \sum_{k=1}^n k^2 = \frac{n(n+1)(2n+1)}6.\tag{⋄}$$ This sum seems to me a rather natural exercise to give students for practising manipulations with sums and for trying out combinatorial proofs.
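The special case $(\diamond)$ can likewise be checked mechanically for small $n$; again this is only a sanity check, under the assumption $s_i=i$ as above.

```python
# Check (⋄) for small n: the double sum of min(i, j) over 1 <= i, j <= n
# should equal n(n+1)(2n+1)/6.

for n in range(1, 11):
    double_sum = sum(min(i, j) for i in range(1, n + 1) for j in range(1, n + 1))
    closed_form = n * (n + 1) * (2 * n + 1) // 6
    assert double_sum == closed_form
print("identity verified for n = 1, ..., 10")
```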
I have tried searching whether there is some question on this site about $(\diamond)$. I did not find such a question; the closest one I've seen was this one: How prove this identity $\sum\limits_{i=1}^{n}\sum\limits_{j=1}^{n}\min{\{i,j\}}(a_{i}-a_{i+1})(b_{j}-b_{j+1})=\sum\limits_{i=1}^{n}a_{i}b_{i}$?
EDIT: After searching a bit more I found Combinatorial proof of $\sum_{1\le i\le n,\ 1\le j\le n}\min(i,j)=\sum_{i=1}^ni^2=\frac{n(n+1)(2n+1)}6$ and A Combinatorial proof for the identity $\sum_i \sum_j \min(i,j) = \sum_k k^2$
Proof by induction. There are certainly various more clever proofs, but as a starting point we can try induction on $M$.
$1^\circ$ If $M=0$, then both sides of $(*)$ are equal to zero.
$2^\circ$ Let us assume that the claim is true whenever the given numbers have maximum less than $M$.
W.l.o.g. we can assume that $s_i\ge1$ for each $i$. (It suffices to notice that by including also $s_i$'s for which $s_i=0$ we add zero to both sides. The RHS is not influenced at all, since such $s_i$'s do not count towards $c_k$ for $k\ge1$. On the LHS we are only adding/omitting terms of the form $\min\{0,s_i\}=0$.)
With this assumption we can use $s'_i=s_i-1$. We can then apply the induction hypothesis to the numbers $s'_i$, whose maximum is $M-1$. By noticing that $\min\{s_i,s_j\}=\min\{s'_i,s'_j\}+1$ and $c_k=c'_{k-1}$ we get \begin{align*} S &=\sum\limits_{i=1}^n \sum\limits_{j=1}^n \min\{s_i,s_j\} \\ &=\sum\limits_{i=1}^n \sum\limits_{j=1}^n (1+\min\{s'_i,s'_j\}) \\ &=n^2+\sum\limits_{i=1}^n \sum\limits_{j=1}^n \min\{s'_i,s'_j\} \\ &=n^2+(c'_1)^2+\dots+(c'_{M-1})^2 \\ &=n^2+c_2^2+\dots+c_M^2 \\ &=c_1^2+c_2^2+\dots+c_M^2, \end{align*} where the last step uses $c_1=n$, which holds since $s_i\ge1$ for each $i$.
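Just to illustrate the induction step, here is a short Python sketch (again only a check, not part of the proof; the function names are made up for this post) which computes the sum via the recursion $S=n^2+S'$ used above and compares it with the direct double sum.

```python
# The induction step says: after discarding zeros, subtracting 1 from every s_i
# decreases the double sum by exactly n^2, where n is the number of remaining terms.

def min_sum_direct(s):
    return sum(min(a, b) for a in s for b in s)

def min_sum_recursive(s):
    s = [x for x in s if x > 0]          # w.l.o.g. step: drop the zeros
    if not s:                            # base case M = 0
        return 0
    n = len(s)
    return n * n + min_sum_recursive([x - 1 for x in s])  # n^2 = c_1^2, then recurse

s = [1, 1, 2, 3, 3]
print(min_sum_direct(s), min_sum_recursive(s))  # prints: 38 38
```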