One way to see the equality is to evaluate the expected value of the unbiased estimator $s^2$ directly. I use $n$ instead of $m$ and write $\overline X$ for the (unbiased) estimator of $\mu$.
$E(s^2)=E\left[\frac{1}{n-1}\sum_{i=1}^n (X_i-\overline X )^2\right]$
$=\frac{1}{n-1}E\left[\sum_{i=1}^n (X_i-\overline X)^2 \right]$

Adding and subtracting $\mu$ inside the square gives
$=\frac{1}{n-1}E\left[\sum_{i=1}^n \left[(X_i-\mu)-(\overline X-\mu) \right]^2 \right] \quad$
Multiplying out the square:
$=\frac{1}{n-1}E\left[\sum_{i=1}^n \left[(X_i-\mu)^2-2(\overline X-\mu)(X_i-\mu)+(\overline X-\mu)^2 \right]\right] \quad$
Distributing the sum over the three terms:
$=\frac{1}{n-1}E\left[\sum_{i=1}^n (X_i-\mu)^2-2(\overline X-\mu)\sum_{i=1}^n(X_i-\mu)+\sum_{i=1}^n(\overline X-\mu)^2 \right] \quad$
Since $(\overline X-\mu)^2$ does not depend on $i$, the last sum equals $n(\overline X-\mu)^2$:

$=\frac{1}{n-1}E\left[\sum_{i=1}^n (X_i-\mu)^2-2(\overline X-\mu)\color{blue}{\sum_{i=1}^n(X_i-\mu)}+n(\overline X-\mu)^2 \right] \quad$
Transforming the blue term, using $\sum_{i=1}^n X_i=n\cdot \overline X$:

$\sum_{i=1}^n(X_i-\mu)=\sum_{i=1}^n X_i-n\cdot \mu=n\cdot \overline X-n\cdot \mu$
Thus $2(\overline X-\mu)\color{blue}{\sum_{i=1}^n(X_i-\mu)}=2(\overline X-\mu)\cdot (n\cdot \overline X-n\cdot \mu)=2n( \overline X- \mu)^2$
$=\frac{1}{n-1}E\left[\sum_{i=1}^n (X_i-\mu)^2-2n( \overline X- \mu)^2+n(\overline X-\mu)^2 \right] \quad$
$=\frac{1}{n-1}E\left[\sum_{i=1}^n (X_i-\mu)^2-n( \overline X- \mu)^2\right] \quad$
$=\frac{1}{n-1}\left[\sum_{i=1}^n E\left[(X_i-\mu)^2\right]-nE\left[( \overline X- \mu)^2\right]\right] \quad$
We know that $E\left[(X_i-\mu)^2\right]=\operatorname{Var}(X_i)=\sigma^2$ and $E\left[( \overline X- \mu)^2\right]=\operatorname{Var}(\overline X)=\sigma_{\overline X}^2=\frac{\sigma^2}{n}$ (a short sketch of the second identity follows below the boxed result). Thus we get
$=\frac{1}{n-1}\left[n \cdot \sigma ^2-n \cdot \frac{\sigma ^2}{n}\right]=\frac{1}{n-1}\cdot (n-1)\cdot \sigma^2=\sigma^2,$ so $\boxed{E(s^2)=\sigma ^2}$
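
For completeness, here is a short sketch of why $E\left[(\overline X-\mu)^2\right]=\frac{\sigma^2}{n}$, assuming (as is usual in this setting, though not stated explicitly above) that the $X_i$ are independent with common variance $\sigma^2$:

$E\left[(\overline X-\mu)^2\right]=\operatorname{Var}(\overline X)=\operatorname{Var}\left(\frac{1}{n}\sum_{i=1}^n X_i\right)=\frac{1}{n^2}\sum_{i=1}^n \operatorname{Var}(X_i)=\frac{1}{n^2}\cdot n\cdot \sigma^2=\frac{\sigma^2}{n}$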
If you instead compute $E(\hat \sigma^2)= E\left[\frac{1}{n}\sum_{i=1}^n (X_i-\overline X )^2\right]$, the calculation is identical except that the prefactor is $\frac1n$ instead of $\frac1{n-1}$, so the bracket $(n-1)\cdot\sigma^2$ is divided by $n$ instead of $n-1$:

$E(\hat \sigma^2)=\frac{n-1}{n}\cdot \sigma^2,$

so this estimator is biased; multiplying it by $\frac{n}{n-1}$ recovers the unbiased $s^2$.
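
As a quick numerical sanity check (not part of the proof), one can simulate both estimators, e.g. with numpy; the true variance, the sample size $n$ and the number of replications below are arbitrary illustrative choices:

```python
import numpy as np

# Monte Carlo check of E[s^2] = sigma^2 and E[sigma_hat^2] = (n-1)/n * sigma^2.
rng = np.random.default_rng(seed=42)
sigma2 = 4.0        # true variance of the X_i (arbitrary choice)
n = 10              # sample size
reps = 200_000      # number of simulated samples

X = rng.normal(loc=0.0, scale=np.sqrt(sigma2), size=(reps, n))
Xbar = X.mean(axis=1, keepdims=True)        # sample mean of each replication
SS = ((X - Xbar) ** 2).sum(axis=1)          # sum of squared deviations per sample

print("mean of s^2:        ", SS.mean() / (n - 1))  # should be close to sigma2
print("mean of sigma_hat^2:", SS.mean() / n)        # should be close to (n-1)/n * sigma2
print("(n-1)/n * sigma2:   ", (n - 1) / n * sigma2)
```

The first average should land close to $\sigma^2$ and the second close to $\frac{n-1}{n}\cdot\sigma^2$, matching the two results above.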