
I was reading this question: Why is the eigenvector of a covariance matrix equal to a principal component? In the top answer, the poster mentions that if the covariance matrix of the original data points $x_i$ is $\Sigma$, then the variance of the projected data points $u^T x_i$ is just $u^T \Sigma u$. I'm not sure why this is the case. Could someone enlighten me?

Eagle1992

1 Answer


Suppose your data set consists of N vectors, each of length T. Let's denote them $\vec{x}_i$ ($i=1, \ldots, N$). The new data set consists of $\vec{y}_i$ ($i=1, \ldots, N$), where $\vec{y}_i = \sum_{k=1}^N p_{ik} \vec{x}_k$.

Then the covariance matrix of the new data set has elements \begin{eqnarray*} C_{ij} &=& {1 \over T} \vec{y}_i \cdot \vec{y}_j \\ &=& {1 \over T} \left(\sum_{k=1}^N p_{ik} \vec{x}_k\right) \cdot \left(\sum_{l=1}^N p_{jl} \vec{x}_l\right) \\ &=& \sum_{k=1}^N \sum_{l=1}^N p_{ik} p_{jl} C'_{kl} \\ C &=& P C' P^T \end{eqnarray*} where P is the matrix with components $p_{ik}$ and $C'$ is the covariance matrix of the $\vec{x}_i$. In particular, when P is a single row $u^T$, this reduces to the scalar $u^T C' u$, which is exactly the claim $\mathrm{Var}(u^T x) = u^T \Sigma u$ from the question.
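As a quick numerical sanity check, here is a small NumPy sketch (variable names are mine) that verifies both the general identity $C = P C' P^T$ and the special case where P is a single row $u^T$:

```python
import numpy as np

rng = np.random.default_rng(0)

# N mean-zero data vectors of length T, stacked as the rows of X (N x T)
N, T = 4, 1000
X = rng.standard_normal((N, T))
X -= X.mean(axis=1, keepdims=True)  # center each row

# Covariance of the original data: C'_kl = (1/T) x_k . x_l
C_prime = (X @ X.T) / T

# An arbitrary linear transformation P; new data y_i = sum_k p_ik x_k
P = rng.standard_normal((N, N))
Y = P @ X

# Covariance of the new data equals P C' P^T
C = (Y @ Y.T) / T
assert np.allclose(C, P @ C_prime @ P.T)

# Special case from the question: project onto a single unit vector u;
# the variance of the scalar projections u^T x_i is u^T C' u
u = rng.standard_normal(N)
u /= np.linalg.norm(u)
proj = u @ X  # length-T vector of projections
assert np.isclose(proj @ proj / T, u @ C_prime @ u)
```

(The check uses the same $1/T$ normalization as the derivation above; with the unbiased $1/(T-1)$ convention the identity holds identically, since the constant cancels on both sides.)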

Sinbaski