(a)
You have that $S = \sqrt{\frac{1}{n-1}\sum_{i=1}^n(x_i - \bar{x})^2}$.
Taking the expectation of this we get $E[S] = E[\sqrt{\frac{1}{n-1}\sum_{i=1}^n(x_i - \bar{x})^2}]$. The square root is a concave function, so you can use Jensen's inequality to upper bound the expectation as:
\begin{equation}
\begin{split}
E[S] & = E\left[\sqrt{\frac{1}{n-1}\sum_{i=1}^n(x_i - \bar{x})^2}\right]\\
& \leq \sqrt{E\left[\frac{1}{n-1}\sum_{i=1}^n(x_i - \bar{x})^2\right]}\\
& = \sqrt{\sigma^2}\\
& = \sigma
\end{split}
\end{equation}
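As a numerical sanity check of the Jensen bound (a sketch, not part of the proof; numpy and the choice $X_i \sim N(0,1)$, $n = 10$ are assumptions here), one can simulate the sample standard deviation and compare its average to $\sigma = 1$:

```python
import numpy as np

# Monte Carlo check of the Jensen bound E[S] <= sigma.
# Assumption (not from the text): X_i ~ N(0, 1), so sigma = 1.
rng = np.random.default_rng(0)
n, reps = 10, 100_000
samples = rng.standard_normal((reps, n))
s = samples.std(axis=1, ddof=1)  # sample standard deviation S for each replicate
print(s.mean())  # strictly below sigma = 1, as the bound predicts
```

For a non-degenerate distribution the inequality is strict, which is consistent with the average landing noticeably below 1 here.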
We also have that $Var[S] = E[S^2] - E[S]^2$. Since $S^2$ is unbiased for the variance, $E[S^2] = \sigma^2$. By assumption $E[S] = \sigma$, so $E[S]^2 = \sigma^2$, which implies $Var[S] = 0$.
Using Markov's inequality we have:
\begin{equation}
P[S\geq \sigma]\leq \frac{E[S]}{\sigma}\Leftrightarrow P[S\geq \sigma]\sigma \leq E[S]
\end{equation}
With $E[S] = \sigma$ this bound is vacuous ($P[S \geq \sigma] \leq 1$), and the complement only gives $P[S < \sigma] = 1 - P[S \geq \sigma] \geq 1 - \frac{E[S]}{\sigma} = 0$. The useful fact is $Var[S] = 0$: by Chebyshev's inequality, $P[|S - \sigma| \geq \epsilon] \leq \frac{Var[S]}{\epsilon^2} = 0$ for every $\epsilon > 0$, so $S \overset{a.s.}{=} \sigma$.
To show the distribution is degenerate we can do the following:
- Specify $S_n'^{2} = \frac{1}{n}\sum_{i=1}^n(X_i - \mu)^2$ (dividing by $n$ rather than $n-1$, since $\mu$ is known, makes this unbiased: $E[(X_i - \mu)^2] = \sigma^2$). It can be shown that its variance is bounded above by $Var[S^2]$, which means that $Var[S_n'^{2}] = 0$ as well.
- Set $A_i = (X_i - \mu)^2 - \sigma^2$; note that $\frac{1}{n}\sum_{i=1}^n A_i = S_n'^{2} - \sigma^2$
- By Chebyshev's inequality: for every $\epsilon > 0$ and every $n \in \mathbb{N}^+$, $P[|S_n'^{2} - \sigma^2| \geq \epsilon] \leq \frac{Var[S_n'^{2}]}{\epsilon^2} = 0$
Note that this holds in particular for $n = 1$: for each $i$ and every $\epsilon > 0$, $P[|(X_i - \mu)^2 - \sigma^2| \geq \epsilon] = 0$. Now apply the Borel–Cantelli lemma to the events $B_i = \{|(X_i - \mu)^2 - \sigma^2| \geq \epsilon\}$; each $P[B_i]$ is bounded above by $\frac{1}{i^2}$, so:
\begin{equation}
\sum_{i=1}^\infty P[B_i] \leq \sum_{i=1}^\infty \frac{1}{i^2} < \infty
\end{equation}
So by the Borel–Cantelli lemma these events occur only finitely often almost surely; since $\epsilon > 0$ was arbitrary and the $X_i$ are identically distributed, $(X - \mu)^2 - \sigma^2\overset{a.s.}{=}0$.
Rearranging this equation gives $|X - \mu| \overset{a.s.}{=} \sigma$, so $X \in \{\mu - \sigma, \mu + \sigma\}$ almost surely. If $\sigma > 0$, then since $E[X] = \mu$ each value has probability $\frac{1}{2}$, and with positive probability all $n$ observations coincide, giving $S^2 = 0 \neq \sigma^2$ and contradicting $S^2 \overset{a.s.}{=} \sigma^2$. Hence $\sigma = 0$, which gives $X\overset{a.s.}{=}\mu$: the distribution is degenerate at $\mu$.
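To see the degenerate case concretely (an illustration only; numpy and the value of $\mu$ are assumed): if $X$ is a point mass at $\mu$, every sample gives $S = 0 = \sigma$, so $E[S] = \sigma$ holds with equality:

```python
import numpy as np

# Degenerate case from part (a): X is a point mass at mu, so sigma = 0.
# Then S = 0 for every sample and E[S] = sigma holds exactly.
mu, n, reps = 3.0, 10, 1_000
samples = np.full((reps, n), mu)      # every draw equals mu
s = samples.std(axis=1, ddof=1)       # sample standard deviation S
print(s.max())  # 0.0 -- S is identically sigma = 0
```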
(b)
We have that $\frac{(n-1)S^2}{\sigma^2} \sim \chi^2_{n-1}$ (for normal samples). Note that the expectation of the right-hand side is $(n-1)$. Multiplying both sides by $\frac{\sigma^2}{(n-1)}$ gives:
\begin{equation}
S^2 \sim \frac{\sigma^2}{(n-1)}\chi^2_{n-1} \Rightarrow S \sim \frac{\sigma}{\sqrt{n-1}}\chi_{n-1}
\end{equation}
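The scaled chi-squared claim can be checked by simulation (a sketch; normality of the $X_i$ and the particular $n$, $\sigma$ below are illustrative assumptions, not from the text). A $\chi^2_{n-1}$ variable has mean $n-1$ and variance $2(n-1)$:

```python
import numpy as np

# Check that (n-1)S^2/sigma^2 behaves like chi-squared_{n-1}:
# mean n-1 and variance 2(n-1). Assumes X_i ~ N(0, sigma^2).
rng = np.random.default_rng(0)
n, sigma, reps = 8, 2.0, 200_000
samples = rng.normal(0.0, sigma, size=(reps, n))
stat = (n - 1) * samples.var(axis=1, ddof=1) / sigma**2
print(stat.mean(), stat.var())  # approx n-1 = 7 and 2(n-1) = 14
```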
And by linearity of expectation under scalar multiplication, the right-hand side has the appropriate expectation. This suggests that if we start with an unstandardized $S$ (with $S^2 = \sum_{i=1}^n(x_i - \bar{x})^2$), we should transform it by $\frac{1}{\sqrt{n-1}}$, going from $S$ to $S'$ where $S' = \frac{S}{\sqrt{n-1}}$.
This gets us from $S^2 \sim \sigma^2\chi^2_{n-1}$ to $S'^2 = \frac{S^2}{n-1} \sim \frac{\sigma^2}{n-1}\chi^2_{n-1}$, whose expectation is $\frac{\sigma^2}{n-1}(n-1) = \sigma^2$, as required.
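The rescaling step can also be checked numerically (a sketch under assumed $N(0, \sigma^2)$ data; $n$ and $\sigma$ are illustrative): with the unstandardized sum of squares $T = \sum_{i=1}^n(x_i - \bar{x})^2$, the transformed $T' = T/(n-1)$ should have mean $\sigma^2$:

```python
import numpy as np

# Check the rescaling in part (b): T = sum (x_i - xbar)^2 satisfies
# T/sigma^2 ~ chi-squared_{n-1} (mean n-1), and T' = T/(n-1) has mean sigma^2.
rng = np.random.default_rng(1)
n, sigma, reps = 6, 3.0, 200_000
samples = rng.normal(0.0, sigma, size=(reps, n))
t = ((samples - samples.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)
t_prime = t / (n - 1)
print(t.mean() / sigma**2, t_prime.mean())  # approx n-1 = 5 and sigma^2 = 9
```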