
Let $X_1,X_2,\ldots,X_n$ be iid random samples from $N_p(0,\Sigma)$, and let $b = \sum_{i=1}^n c_iX_i$ where $\sum_{i=1}^n c_i^2 = 1$.

I am trying to show that $\sum_{i=1}^n X_iX_i^T - bb^T$ is independent of $bb^T$.

---solution---

$$b = X_1c_1+X_2c_2+\cdots+X_nc_n$$

$$c_1^2+c_2^2+\cdots+c_n^2 = 1$$

$$\sum_{i=1}^nX_iX_i^T = X_1X_1^T+X_2X_2^T+\cdots+X_nX_n^T$$

Can I say $bb^T = \sum_i c_i^2X_iX_i^T$, and then, since $c_1^2+c_2^2+\cdots+c_n^2 = 1$, conclude that $bb^T = \sum_i X_iX_i^T$?

Can you help me continue from here?


1 Answer


The proof is similar to that of the univariate result:

If $X_1,\ldots, X_n$ are iid $N(0,1)$ random variables, then $\sum_{i=1}^n X_i^2- (\sqrt n \bar X)^2$ is independent of $(\sqrt n\bar X)^2$.
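The algebraic identity behind the univariate result can be checked numerically: the total sum of squares splits into the mean part $(\sqrt n\,\bar X)^2$ and the residual part. A minimal sketch, assuming numpy is available:

```python
# Sketch (not part of the proof): for any sample x_1,...,x_n,
# sum(x_i^2) = (sqrt(n)*xbar)^2 + sum((x_i - xbar)^2).
import numpy as np

rng = np.random.default_rng(0)
n = 8
x = rng.standard_normal(n)          # stand-in for iid N(0,1) draws
xbar = x.mean()

total = np.sum(x**2)
mean_part = (np.sqrt(n) * xbar)**2
resid_part = np.sum((x - xbar)**2)

# The two pieces recombine to the total sum of squares.
assert np.isclose(total, mean_part + resid_part)
```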

Here is an outline for the proof of the $p$-dimensional result: There exists an orthogonal $n\times n$ matrix $A$ of constants whose first row is $(c_1,\ldots, c_n)$. Orthogonality means $AA^T=A^TA=I_n$, and by construction $A_{1,i}=c_i$ for $i=1,\ldots,n$.
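Such a matrix $A$ can be produced concretely by completing the unit vector $c$ to an orthonormal basis, e.g. via a QR factorization. A sketch, assuming numpy (the random columns used to pad the basis are an arbitrary choice):

```python
# Sketch: build an orthogonal n x n matrix A whose FIRST ROW is a given
# unit vector c, by completing c to an orthonormal basis via QR.
import numpy as np

rng = np.random.default_rng(1)
n = 5
c = rng.standard_normal(n)
c /= np.linalg.norm(c)              # enforce sum(c_i^2) = 1

# QR of a matrix whose first column is c gives Q with first column +/- c.
M = np.column_stack([c, rng.standard_normal((n, n - 1))])
Q, _ = np.linalg.qr(M)
if Q[:, 0] @ c < 0:                 # fix the sign ambiguity of QR
    Q[:, 0] = -Q[:, 0]
A = Q.T                             # transpose so c becomes the first row

assert np.allclose(A[0], c)                 # first row is c
assert np.allclose(A @ A.T, np.eye(n))      # A is orthogonal
```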

Define random $p$-vectors $Z_1,\ldots, Z_n$ by $$ Z_i := \sum_k A_{i,k}X_k. $$ Now verify the following:

  1. The $Z$'s are iid $N_p(0,\Sigma)$, so they have the same joint distribution as the $X$'s.

  2. $Z_1 = b$.

  3. $\sum_{i=1}^n Z_iZ_i^T = \sum_{i=1}^n X_iX_i^T$.

Conclude using (3) and (2) that $$\sum_{i=1}^n X_iX_i^T - bb^T = \sum_{i=1}^n Z_iZ_i^T - Z_1Z_1^T = \sum_{i=2}^n Z_iZ_i^T$$ which by (1) is independent of $Z_1Z_1^T$, which equals $bb^T$ by (2).
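Claims (2) and (3) are linear-algebra identities and can be verified numerically. A sketch assuming numpy, with the sample vectors stored as the rows of a matrix `X` (so that $\sum_i X_iX_i^T = X^TX$):

```python
# Sketch verifying claims (2) and (3): with Z_i = sum_k A[i,k] X_k,
# Z_1 equals b, and the sums of outer products of the Z's and X's agree.
import numpy as np

rng = np.random.default_rng(2)
n, p = 6, 3
X = rng.standard_normal((n, p))     # row i is the sample vector X_i
c = rng.standard_normal(n)
c /= np.linalg.norm(c)              # enforce sum(c_i^2) = 1

# Orthogonal A with first row c, via QR completion of c to an orthonormal basis.
M = np.column_stack([c, rng.standard_normal((n, n - 1))])
Q, _ = np.linalg.qr(M)
if Q[:, 0] @ c < 0:
    Q[:, 0] = -Q[:, 0]
A = Q.T

Z = A @ X                           # row i is Z_i = sum_k A[i,k] X_k
b = c @ X                           # b = sum_i c_i X_i

assert np.allclose(Z[0], b)         # claim (2): Z_1 = b
assert np.allclose(Z.T @ Z, X.T @ X)  # claim (3): sums of outer products match
```

Claim (3) is just orthogonality: $Z^TZ = X^TA^TAX = X^TX$.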
