
In the following paper (https://iopscience.iop.org/article/10.1088/0026-1394/41/3/004/pdf), on page 133, I read: "The conditional standard deviation (...) is necessarily an underestimate of its unconditional standard deviation." I am trying to understand why this statement is true.

I will now outline my understanding of this in 3 parts.

Part 1: In this link (Proving that Sample Variance is an unbiased estimator of Population Variance), a proof is given that shows the sample variance is an unbiased estimator of the population variance:

$$E(S^2) = \frac{n-1}{n}E(X_1-Y_1)^2 = \frac{n-1}{n}\text{var}(X_1-Y_1) = \frac{n-1}{n}\left(\sigma^2 + \frac{\sigma^2}{n-1}\right) = \sigma^2$$

Here $Y_1$ can be read as the mean of the remaining $n-1$ observations, $Y_1 = \frac{1}{n-1}\sum_{i=2}^{n} X_i$, so that $X_1$ and $Y_1$ are independent with equal means and $\text{var}(X_1 - Y_1) = \sigma^2 + \frac{\sigma^2}{n-1}$.
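To see this numerically, here is a minimal Monte Carlo sketch (Python/NumPy; the standard normal population, the sample size $n = 10$, and the number of replications are my own arbitrary choices, not taken from the linked proof) showing that the average of $s^2$ across many samples is close to $\sigma^2 = 1$:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 1.0      # population variance of a standard normal (assumption for this sketch)
n = 10            # sample size (arbitrary choice)
reps = 100_000    # number of simulated samples

# Draw `reps` independent samples of size n and compute each sample variance
# with the 1/(n-1) divisor (ddof=1).
samples = rng.normal(0.0, 1.0, size=(reps, n))
s2 = samples.var(axis=1, ddof=1)

print("average of s^2 over replications:", s2.mean())  # should be close to sigma^2 = 1
```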

Part 2: In this link (https://stats.stackexchange.com/questions/496424/how-to-prove-s2-is-a-consistent-estimator-of-sigma2), a proof is given that shows the sample variance is a consistent estimator of the population variance:

\begin{align*}
\mathbb{P}(\mid s^2 - \sigma^2 \mid > \varepsilon )
&= \mathbb{P}(\mid s^2 - \mathbb{E}(s^2) \mid > \varepsilon )\\
&\leqslant \dfrac{\text{var}(s^2)}{\varepsilon^2}\\
&= \dfrac{1}{(n-1)^2}\cdot \text{var}\left[\sum (X_i - \overline{X})^2\right]\\
&= \dfrac{\sigma^4}{(n-1)^2}\cdot \text{var}\left[\frac{\sum (X_i - \overline{X})^2}{\sigma^2}\right]\\
&= \dfrac{\sigma^4}{(n-1)^2}\cdot\text{var}(Z_n)\\
&= \dfrac{\sigma^4}{(n-1)^2}\cdot 2(n-1) = \dfrac{2\sigma^4}{n-1} \stackrel{n\to\infty}{\longrightarrow} 0
\end{align*}

Here $Z_n = \sum (X_i - \overline{X})^2/\sigma^2 \sim \chi^2_{n-1}$ for normal data, so $\text{var}(Z_n) = 2(n-1)$.

Thus, $ \displaystyle\lim_{n\to\infty} \mathbb{P}(\mid s^2 - \sigma^2 \mid > \varepsilon ) = 0$ , i.e. $ s^2 \stackrel{\mathbb{P}}{\longrightarrow} \sigma^2 $ as $n\to\infty$ , which tells us that $s^2$ is a consistent estimator of $\sigma^2$ .
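Again as a rough illustration rather than a proof, a short simulation (the standard normal population, the tolerance $\varepsilon = 0.1$, and the replication count are arbitrary choices) estimates $\mathbb{P}(\mid s^2 - \sigma^2 \mid > \varepsilon)$ for increasing $n$ and shows it shrinking toward 0:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma2 = 1.0    # population variance of a standard normal (assumption for this sketch)
eps = 0.1       # arbitrary tolerance
reps = 2_000    # simulated samples per value of n

for n in (10, 100, 1_000, 10_000):
    samples = rng.normal(0.0, 1.0, size=(reps, n))
    s2 = samples.var(axis=1, ddof=1)
    # Empirical estimate of P(|s^2 - sigma^2| > eps) at this sample size.
    prob = np.mean(np.abs(s2 - sigma2) > eps)
    print(f"n = {n:6d}   estimated P(|s^2 - sigma^2| > {eps}) = {prob:.3f}")
```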

Part 3: Using some algebraic manipulation, I can see that the sample variance appears to always be less than the population variance:

The sample variance is defined as:

$$s^2 = \frac{1}{n-1} \sum_{i=1}^n (x_i - \bar{x})^2$$

where $n$ is the sample size, $x_i$ are the individual observations, and $\bar{x}$ is the sample mean; the population variance is denoted by $\sigma^2$.

OLS (Ordinary Least Squares) tells us that $\bar{x}$ minimizes the sum of squared deviations, so:

$$\sum_{i=1}^n (x_i - \bar{x})^2 \leq \sum_{i=1}^n (x_i - \mu)^2$$
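Before continuing the derivation, here is a quick numerical sanity check of this inequality (a sketch only; the standard normal data with $\mu = 0$ and the sample size of 50 are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
mu = 0.0                        # population mean of a standard normal (assumption for this sketch)
x = rng.normal(mu, 1.0, size=50)
xbar = x.mean()

ss_about_xbar = np.sum((x - xbar) ** 2)  # squared deviations about the sample mean
ss_about_mu = np.sum((x - mu) ** 2)      # squared deviations about the population mean

# The sum of squares about the sample mean is never larger than the sum of
# squares about any other fixed point, including the population mean.
print(ss_about_xbar <= ss_about_mu)  # True
```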

Dividing both sides of this inequality by $n-1$:

$$\frac{1}{n-1} \sum_{i=1}^n (x_i - \bar{x})^2 \leq \frac{1}{n-1} \sum_{i=1}^n (x_i - \mu)^2$$

Further simplifying and substituting (for large enough $n$):

$$\sigma^2 = \frac{1}{n} \sum_{i=1}^n (x_i - \mu)^2$$

$$\frac{1}{n-1} \sum_{i=1}^n (x_i - \bar{x})^2 \leq \frac{1}{n-1} n\sigma^2 = \sigma^2$$

$$s^2 = \frac{1}{n-1} \sum_{i=1}^n (x_i - \bar{x})^2 \leq \sigma^2$$

This proves that the sample variance $s^2$ is always less than or equal to the population variance $\sigma^2$. On another note, an informal argument can be made along the same lines: a sample might not include extreme outliers, whereas the population does include them. Extreme outliers have large deviations from the mean, so their presence increases the variance calculation. Thus, the sample variance can technically never be larger than the population variance.
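As a quick empirical check of this claim, the following sketch (again assuming a standard normal population with $\sigma^2 = 1$ and an arbitrary sample size $n = 10$) records the fraction of simulated samples in which $s^2$ exceeds $\sigma^2$:

```python
import numpy as np

rng = np.random.default_rng(3)
sigma2 = 1.0     # population variance of a standard normal (assumption for this sketch)
n = 10           # sample size (arbitrary choice)
reps = 100_000   # number of simulated samples

samples = rng.normal(0.0, 1.0, size=(reps, n))
s2 = samples.var(axis=1, ddof=1)

# Fraction of simulated samples whose sample variance is larger than the
# population variance.
print("fraction of samples with s^2 > sigma^2:", np.mean(s2 > sigma2))
```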

My Question: How can all three parts be correct at the same time? If the sample variance estimates the population variance without any bias (Part 1), and it converges to the population variance for large samples (Part 2), then how can the sample variance always be guaranteed to be less than the population variance (Part 3)? Is this not a contradiction?

Thanks!

Comments:

  • “The sample variance is always less than the population variance.” This is not true. – Andrew Jun 22 '23 at 06:15
  • Your "proof" in Part 3 is wrong on many levels. The essence of a sample is that it has a finite size; yet you make a leap to "large enough $n$", which is not allowed in the context of what you are trying to prove. The equality that follows is then simply not true (except by chance) for any finite $n$. When you make a statement like that, I suggest building some examples and seeing whether it holds. For instance, you could generate some random numbers from a standard normal with your preferred software, and you will see that the sample variance is sometimes bigger than 1. – nicola Jun 22 '23 at 07:41
  • @nicola: thank you for your reply! This was exactly the kind of analysis I was looking for! – stats_noob Jun 22 '23 at 11:20
  • You claim that $\frac{1}{n-1} n\sigma^2 = \sigma^2$. The only way this can be true is if $\sigma^2=0$. On top of that, you are using $n$ for both the size of the sample and the size of the population in the same "proof". But the sources you cite are discussing standard deviation, which is the square root of the variance and therefore has different statistical properties. Did you even bother to read these sources before citing them? – David K Jun 27 '23 at 05:41
