
If $Z_1, Z_2, \ldots, Z_n$ is a random sample from a standard normal distribution, then:

$\bar Z$ and $\sum_{i=1}^n (Z_i - \bar Z)^2$ are independent

Karim

1 Answer


This question asks for a basic property of the sample mean and sample variance of a normal random sample, namely that $$ \overline{X} \text{ and } S^2 \text{ are independent random variables.}$$ It can be proven as follows, without MGFs. Before that, we need two important theorems.

  1. Let $X_1,\dots,X_n$ be random vectors. They are mutually independent if and only if there exist functions $g_i(x_i)$ such that the joint pdf of $(X_1,\dots,X_n)$ can be written as $f(x_1,\dots,x_n)=\prod_{i=1}^n g_i(x_i)$.
  2. Let $X_1,\dots,X_n$ be mutually independent random vectors and let $g_i(x_i)$ be a function of $x_i$ only. Then $U_i=g_i(X_i)$, $i=1,\dots,n$, are mutually independent as well.

First, note $$ \begin{aligned} S^2 &= \frac{1}{n-1}\sum_{i=1}^{n}(X_i-\overline{X})^2 \\ &= \frac{1}{n-1}\left((X_1-\overline{X})^2+\sum_{i=2}^{n}(X_i-\overline{X})^2\right)\\ &= \frac{1}{n-1}\left(\left[\sum_{i=2}^{n}(X_i-\overline{X})\right]^2+\sum_{i=2}^{n}(X_i-\overline{X})^2\right), \text{ since } \sum_{i=1}^{n}(X_i-\overline{X})=0 \end{aligned} $$ Hence $S^2$ is a function of $(X_2-\overline{X},\dots,X_n-\overline{X})$ only. We want to show that this vector is independent of $\overline{X}$.
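As a quick numerical sanity check of this identity (a NumPy sketch, not part of the proof; the sample size $n = 10$ is arbitrary), one can verify that $S^2$ rebuilt from the deviations $X_2-\overline{X},\dots,X_n-\overline{X}$ alone matches the usual formula:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10
x = rng.standard_normal(n)
xbar = x.mean()

# Usual sample variance, with the 1/(n-1) normalization (ddof=1).
s2_direct = x.var(ddof=1)

# Rebuild S^2 from the deviations X_2 - Xbar, ..., X_n - Xbar only,
# using (X_1 - Xbar)^2 = (sum_{i=2}^n (X_i - Xbar))^2.
d = x[1:] - xbar
s2_from_tail = (d.sum() ** 2 + (d ** 2).sum()) / (n - 1)

print(np.isclose(s2_direct, s2_from_tail))  # True
```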

Next, the joint pdf of $X_1,\dots,X_n$ is $$ f(x_1,\dots,x_n)=\prod_{i=1}^n f_i(x_i)=\frac{1}{(2\pi)^{n/2}}e^{-(1/2)\sum_i x_i^2},\quad -\infty < x_i < \infty $$ Apply the multivariate transformation $$ \begin{aligned} y_1 &= \overline{x},\\ y_2 &= x_2-\overline{x},\\ &\ \,\vdots\\ y_n &= x_n-\overline{x} \end{aligned} $$ This is a linear transformation with Jacobian $1/n$; its inverse is $x_1 = y_1 - \sum_{i=2}^{n} y_i$ and $x_i = y_i + y_1$ for $i \ge 2$, so the change of variables contributes a factor of $n$. So we have
$$ f(y_1,\dots,y_n) = \frac{n}{(2\pi)^{n/2}}\, e^{-(1/2)\left(y_1-\sum_{i=2}^{n}y_i\right)^2}\, e^{-(1/2)\sum_{i=2}^{n}(y_i+y_1)^2},\quad -\infty < y_i < \infty $$ $$ = \left[\left(\frac{n}{2\pi}\right)^{1/2} e^{-n y_1^2/2}\right] \left[\frac{n^{1/2}}{(2\pi)^{(n-1)/2}}\, e^{-(1/2)\left[\sum_{i=2}^{n}y_i^2 + \left(\sum_{i=2}^{n}y_i\right)^2\right]}\right],\quad -\infty < y_i < \infty $$ The joint pdf of $Y_1,\dots,Y_n$ therefore factors as $g(y_1)\,h(y_2,\dots,y_n)$, so by Theorem 1, $Y_1$ is independent of $(Y_2,\dots,Y_n)$.
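The Jacobian value used above can be checked numerically (a small NumPy sketch, with an arbitrary $n = 5$): writing the map $(x_1,\dots,x_n)\mapsto(\overline{x},\,x_2-\overline{x},\dots,x_n-\overline{x})$ as a matrix, its determinant is $1/n$.

```python
import numpy as np

n = 5
# Matrix of the linear map (x_1,...,x_n) -> (xbar, x_2 - xbar, ..., x_n - xbar):
# each row of (I - J/n) computes x_i - xbar; the first row is replaced by
# the row that computes xbar.
A = np.eye(n) - np.ones((n, n)) / n
A[0] = 1.0 / n

det = np.linalg.det(A)
print(det)  # ≈ 1/n = 0.2
```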
Furthermore, since $S^2$ is a function of $(Y_2,\dots,Y_n)$ only, Theorem 2 gives that $\overline{X}=Y_1$ is independent of $S^2$.
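A Monte Carlo consistency check (a NumPy sketch with arbitrary sample size and replication count): independence implies that $\overline{X}$ and $S^2$ are uncorrelated, so their sample correlation over many simulated normal samples should be near zero. The converse is false, so this illustrates the result rather than proving it.

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 5, 200_000
x = rng.standard_normal((reps, n))  # each row is one sample Z_1, ..., Z_n

xbar = x.mean(axis=1)               # sample mean of each row
s2 = x.var(axis=1, ddof=1)          # sample variance of each row

# Independence implies zero correlation (the converse does not hold,
# so this is only a consistency check, not a proof).
r = np.corrcoef(xbar, s2)[0, 1]
print(abs(r))  # should be close to 0
```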

This argument is essentially from Statistical Inference by Casella, G. & Berger, R. L.

Jesse