
Let $S$ be the sample standard deviation, based on a random sample of size $n$ from a distribution with pdf $f(x;\mu,\sigma^2)$ with mean $\mu$ and variance $\sigma^2$.

a) Show that $E(S) \le \sigma$, where equality holds iff $f(x;\mu,\sigma^2)$ is degenerate at $\mu$, i.e. $P[X = \mu] = 1$. Hint: consider $Var(S)$.

b) If $X_{i} \sim N(\mu,\sigma^2)$, find a constant $c$ such that $cS$ is an unbiased estimator of $\sigma$. Hint: use the fact that $\frac{(n-1)S^2}{\sigma^2} \sim \chi^2(n-1)$ and $S = (S^2)^{\frac{1}{2}}$.

For part a), here is what I did:

$$Var(S) = E(S^2) - [E(S)]^2 \ge 0.$$ For the equality case we use the fact that if $$Var(S) = 0,$$ then $$E[(S- \mu_{S})^2] = 0,$$ so $$S = \mu_{S} \text{ a.s.}$$

so $$E(S^2) \ge [E(S)]^2,$$ and since $E(S^2) = \sigma^2$, $$\sigma^2 \ge [E(S)]^2, \quad \text{i.e.} \quad \sigma \ge E(S).$$

They are equal when $Var(S) = 0$, which happens when $X$ is degenerate.

Can anyone check my answer for part a) and help me with part b)?

  • https://math.stackexchange.com/q/946013/321264, https://math.stackexchange.com/q/1059938/321264, https://math.stackexchange.com/q/2777526/321264 – StubbornAtom Jun 27 '20 at 18:03

1 Answer


(a)

You have that $S = \sqrt{\frac{1}{n-1}\sum_{i=1}^n(X_i - \bar{X})^2}$.

Taking the expectation of this we get $E[S] = E\left[\sqrt{\frac{1}{n-1}\sum_{i=1}^n(X_i - \bar{X})^2}\right]$. The square root is a concave function, so you can use Jensen's inequality to upper-bound the expectation:

\begin{equation} \begin{split} E[S] & = E\left[\sqrt{\frac{1}{n-1}\sum_{i=1}^n(X_i - \bar{X})^2}\right]\\ & \leq\sqrt{E\left[\frac{1}{n-1}\sum_{i=1}^n(X_i - \bar{X})^2\right]}\\ & = \sqrt{\sigma^2}\\ & = \sigma \end{split} \end{equation}
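(As a quick sanity check, and not part of the proof, here is a small Python simulation of $E[S]$ for a normal sample; the choices $n=5$ and $\sigma=2$ are arbitrary.)

```python
# Quick numerical sanity check (not part of the proof): for a non-degenerate
# distribution, the average sample standard deviation falls below sigma.
import numpy as np

rng = np.random.default_rng(0)
n, reps, sigma = 5, 200_000, 2.0   # arbitrary illustrative values

samples = rng.normal(loc=0.0, scale=sigma, size=(reps, n))
s = samples.std(axis=1, ddof=1)    # sample standard deviation S (ddof=1 gives 1/(n-1))

print(f"mean of S  : {s.mean():.4f}")  # noticeably below sigma for small n
print(f"true sigma : {sigma:.4f}")
```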

We also have that $Var[S] = E[S^2] - E[S]^2$. Note that $S^2$ is unbiased for the variance, so $E[S^2] = \sigma^2$. Now suppose equality holds, i.e. $E[S] = \sigma$; then $E[S]^2 = \sigma^2$, which implies $Var[S] = 0$.

Since $Var[S] = 0$, Chebyshev's inequality gives, for every $\epsilon > 0$,

\begin{equation} P[|S - E[S]|\geq \epsilon]\leq \frac{Var[S]}{\epsilon^2} = 0, \end{equation}

so $S \overset{a.s.}{=} E[S] = \sigma$. In particular $S^2 \overset{a.s.}{=} \sigma^2$, and therefore $Var[S^2] = 0$.

To show the distribution is degenerate we can do the following:

  1. Specify $S_n'^{2} = \frac{1}{n}\sum_{i=1}^n(X_i - \mu)^2$. This estimator is unbiased for $\sigma^2$, and its variance is bounded above by $Var[S^2]$ (the known-mean estimator is at least as efficient as $S^2$), so $Var[S_n'^{2}]=0$ as well. A quick simulation after this argument illustrates the variance comparison.
  2. Set $A_i = (X_i - \mu)^2 - \sigma^2$, and note that $\frac{1}{n}\sum_{i=1}^n A_i = S_n'^{2} - \sigma^2$.
  3. By Chebyshev's inequality, for every $\epsilon > 0$ and every $n \in \mathbb{N}^+$: $P[|S_n'^{2} - \sigma^2|\geq \epsilon]\leq \frac{Var[S_n'^{2}]}{\epsilon^2} = 0$.

Note that $Var[S_n'^{2}] = \frac{1}{n}\,Var[A_1]$, so $Var[A_1] = 0$ and Chebyshev's inequality applies to each $A_i$ individually: $P[|A_i| \geq \epsilon] = 0$ for every $\epsilon > 0$. Applying the Borel–Cantelli lemma to the events $B_i = \{|A_i| \geq \epsilon\}$:

\begin{equation} \sum_{i=1}^\infty P[B_i] = 0 < \infty, \end{equation}

so with probability $1$ only finitely many of the $B_i$ occur; since the $A_i$ are i.i.d., this is only possible if $P[|A_i| \geq \epsilon] = 0$ for every $i$ and every $\epsilon > 0$. So by the BC lemma, $(X - \mu)^2 - \sigma^2\overset{a.s.}{=}0$.
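(As a side check on step 1, here is a small simulation comparing the variances of $S^2$ and the known-mean estimator $S_n'^{2}$; the normal distribution and parameter values are placeholder choices, not part of the argument.)

```python
# Side check on step 1: the known-mean estimator of sigma^2 has variance
# no larger than that of the sample variance S^2.
import numpy as np

rng = np.random.default_rng(1)
n, reps, mu, sigma = 10, 200_000, 3.0, 2.0   # arbitrary illustrative values

x = rng.normal(mu, sigma, size=(reps, n))
s2 = x.var(axis=1, ddof=1)               # sample variance S^2
s2_known = ((x - mu) ** 2).mean(axis=1)  # known-mean estimator S_n'^2

print(f"Var[S^2]    : {s2.var():.4f}")
print(f"Var[S_n'^2] : {s2_known.var():.4f}")  # smaller, as claimed
```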

Re-arranging this equation gives us $X\overset{a.s.}{=}\mu \pm \sigma$. If $\sigma > 0$, $X$ would put mass on the two points $\mu - \sigma$ and $\mu + \sigma$, and then $S^2$ would not be almost surely constant (a sample can be all one point, giving $S^2 = 0$, or mixed, giving $S^2 > 0$), contradicting $Var[S^2] = 0$. Hence $\sigma = 0$, which gives $X\overset{a.s.}{=}\mu$.
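(To see concretely why $\sigma > 0$ is impossible, this sketch simulates the hypothetical two-point law $X = \mu \pm \sigma$ and shows that $S^2$ is then not constant; the specific numbers are arbitrary.)

```python
# If X were two-point at mu +/- sigma with sigma > 0, the sample variance S^2
# would not be a.s. constant, contradicting Var[S^2] = 0.
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, n, reps = 1.0, 2.0, 5, 100_000   # arbitrary illustrative values

x = mu + sigma * rng.choice([-1.0, 1.0], size=(reps, n))
s2 = x.var(axis=1, ddof=1)

print(f"Var[S^2] under the two-point law: {s2.var():.4f}")  # clearly positive
```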

(b) We have that $\frac{(n-1)S^2}{\sigma^2} \sim \chi^2_{n-1}$, so $S = \frac{\sigma}{\sqrt{n-1}}\sqrt{Y}$ with $Y \sim \chi^2_{n-1}$. Compute $E[\sqrt{Y}]$ directly from the chi-squared density:

\begin{equation} E[\sqrt{Y}] = \int_0^\infty y^{1/2}\, \frac{y^{\frac{n-1}{2}-1} e^{-y/2}}{2^{\frac{n-1}{2}}\,\Gamma\!\left(\frac{n-1}{2}\right)}\, dy = \frac{2^{\frac{n}{2}}\,\Gamma\!\left(\frac{n}{2}\right)}{2^{\frac{n-1}{2}}\,\Gamma\!\left(\frac{n-1}{2}\right)} = \sqrt{2}\,\frac{\Gamma\!\left(\frac{n}{2}\right)}{\Gamma\!\left(\frac{n-1}{2}\right)}. \end{equation}

Therefore

\begin{equation} E[S] = \frac{\sigma}{\sqrt{n-1}}\, E[\sqrt{Y}] = \sigma\,\sqrt{\frac{2}{n-1}}\,\frac{\Gamma\!\left(\frac{n}{2}\right)}{\Gamma\!\left(\frac{n-1}{2}\right)}, \end{equation}

and the constant

\begin{equation} c = \sqrt{\frac{n-1}{2}}\,\frac{\Gamma\!\left(\frac{n-1}{2}\right)}{\Gamma\!\left(\frac{n}{2}\right)} \end{equation}

makes $cS$ unbiased for $\sigma$: $E[cS] = \sigma$.
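(A numerical check of the constant: the following sketch computes $c$ from the Gamma formula above and verifies $E[cS] \approx \sigma$ by simulation; the parameter values are arbitrary.)

```python
# Numerical check of the unbiasing constant for normal samples:
# c = sqrt((n-1)/2) * Gamma((n-1)/2) / Gamma(n/2) should give E[c*S] = sigma.
import math
import numpy as np

rng = np.random.default_rng(3)
n, reps, sigma = 5, 500_000, 2.0   # arbitrary illustrative values

c = math.sqrt((n - 1) / 2) * math.gamma((n - 1) / 2) / math.gamma(n / 2)

x = rng.normal(0.0, sigma, size=(reps, n))
s = x.std(axis=1, ddof=1)          # sample standard deviation S

print(f"c           : {c:.4f}")
print(f"E[cS] (MC)  : {(c * s).mean():.4f}")  # should be close to sigma
print(f"sigma       : {sigma:.4f}")
```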

Ryan Warnick
  • Wow I did not expect the proof for part a) to be so long! Thank you Ryan for the help! – Itsnhantransitive Dec 30 '17 at 23:47
  • For part b) you have to find a constant $c$ such that $cS$ is an unbiased estimator of $\sigma$. Where is the value of $c$? – Itsnhantransitive Dec 30 '17 at 23:49
  • There are probably a lot of ways to solve the first problem. I tried a couple of things, but I'm not super happy with how long the answer is.

    In part b), the constant comes from the mean of a $\chi_{n-1}$ variable: for $Y \sim \chi^2_{n-1}$, $E[\sqrt{Y}] = \sqrt{2}\,\Gamma(\frac{n}{2})/\Gamma(\frac{n-1}{2})$, which gives $c = \sqrt{\frac{n-1}{2}}\,\Gamma(\frac{n-1}{2})/\Gamma(\frac{n}{2})$; I've spelled this out at the end of the answer.

    – Ryan Warnick Dec 31 '17 at 00:40