I will surmise that what you mean is the variance of the values of $n$ random variables pooled together, each variable having $k$ equally likely possible values.
$$\text{It's } \frac{\sigma_1^2 + \cdots + \sigma_n^2} n + \frac{(\mu_1-\overline \mu)^2 + \cdots +(\mu_n-\overline \mu)^2} n \text{ where } \overline\mu = \frac{\mu_1 + \cdots + \mu_n} n.$$
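If a numerical sanity check helps, here is a minimal Python/NumPy sketch with made-up values; it assumes each of the $k$ values of each variable is equally likely and reads the formula above as the variance of all $nk$ values pooled together.

```python
import numpy as np

# Made-up example: n = 3 random variables, each with k = 4 equally likely values.
x = np.array([[1.0, 2.0, 2.0, 5.0],
              [0.0, 3.0, 4.0, 4.0],
              [2.0, 2.0, 6.0, 7.0]])

mu = x.mean(axis=1)         # mu_1, ..., mu_n
sigma2 = x.var(axis=1)      # sigma_1^2, ..., sigma_n^2 (population variances)
mu_bar = mu.mean()          # overall mean, the same as the mean of all n*k values

formula = sigma2.mean() + ((mu - mu_bar) ** 2).mean()
pooled = x.var()            # variance of all n*k values pooled together

print(formula, pooled)      # both print 3.9722..., so the two agree
```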
Let $x_{i,1},\ldots ,x_{i,k}$ be the $k$ equally likely possible values of the $i$th random variable, for $i=1,\ldots,n.$ Then
\begin{align}
& \mu_i = \frac{x_{i,1} + \cdots + x_{i,k}} k \\[10pt]
& \sigma_i^2 = \frac{(x_{i,1} - \mu_i)^2 + \cdots + (x_{i,k} - \mu_i)^2} k \\[10pt]
& \sigma_i^2 + (\mu_i-\overline \mu)^2 = \frac 1 k \sum_{j=1}^k \big( (x_{i,j} - \mu_i)^2 + (\mu_i - \overline\mu)^2 \big) \\[10pt]
= {} & \frac 1 k \sum_{j=1}^k \big( (x_{i,j} - \mu_i)^2 + 2(x_{i,j} - \mu_i) (\mu_i - \overline\mu) + (\mu_i - \overline\mu)^2 \big) \tag 1 \\
& \textbf{Why? See below.}
\end{align}
This last equality is true because
\begin{align}
& \sum_{j=1}^k \big( (x_{i,j} - \mu_i) (\mu_i - \overline\mu) \big) \\[8pt]
= {} & (\mu_i-\overline \mu)\sum_{j=1}^k (x_{i,j} - \mu_i)\\
& \text{since $\mu_i-\overline\mu$ does not change as $j$ goes from $1$ to $k$} \\[10pt]
= {} & (\mu_i-\overline\mu)\cdot0 = 0, \\
& \text{since the deviations $x_{i,j} - \mu_i$ from the mean $\mu_i$ sum to zero.}
\end{align}
Then line $(1)$ above becomes
\begin{align}
& \frac 1 k \sum_{j=1}^k \big( (x_{i,j} - \mu_i)^2 + 2(x_{i,j} - \mu_i) (\mu_i - \overline\mu) + (\mu_i - \overline\mu)^2 \big) \\[8pt]
= {} & \frac 1 k \sum_{j=1}^k \big( (x_{i,j} - \mu_i) + (\mu_i - \overline\mu) \big)^2 = \frac 1 k \sum_{j=1}^k (x_{i,j} - \overline \mu)^2.
\end{align}
Averaging this over $i=1,\ldots,n$ gives the formula at the top, since the average of the right side over $i$ is $\frac 1 {nk} \sum_{i=1}^n \sum_{j=1}^k (x_{i,j}-\overline\mu)^2,$ the variance of all $nk$ values taken together.
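The same kind of sketch (same made-up array as above, purely for illustration) checks the identity just displayed, row by row, along with the fact that the cross terms sum to zero:

```python
import numpy as np

# Same made-up n = 3 by k = 4 array of equally likely values as in the earlier snippet.
x = np.array([[1.0, 2.0, 2.0, 5.0],
              [0.0, 3.0, 4.0, 4.0],
              [2.0, 2.0, 6.0, 7.0]])
mu = x.mean(axis=1, keepdims=True)   # column vector of the mu_i
mu_bar = x.mean()                    # overall mean

# The cross terms sum to zero for every i ...
cross = ((x - mu) * (mu - mu_bar)).sum(axis=1)
print(np.allclose(cross, 0.0))       # True

# ... so sigma_i^2 + (mu_i - mu_bar)^2 equals the mean of (x_{i,j} - mu_bar)^2 over j.
lhs = x.var(axis=1) + (mu.ravel() - mu_bar) ** 2
rhs = ((x - mu_bar) ** 2).mean(axis=1)
print(np.allclose(lhs, rhs))         # True
```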