I am coming from reading the selected answer to this question. I have a question about the following bit:
It’s not hard to show that if the covariance matrix of the original data points $x_i$ was $\Sigma$, the variance of the new data points is just $u^T \Sigma u$.
I have been playing around with projecting two-dimensional data for random variables $(x_1,x_2)$ onto the horizontal axis corresponding to $x_1$. As expected, with $u$ in the direction of the horizontal axis, the result is $u^T \Sigma u = \operatorname{var}[x_1]$, since only $x_1$ is taken into account. However, when setting $u$ in the direction of the identity line, the result is $$u^T \Sigma u = \operatorname{var}[x_1] + \operatorname{var}[x_2] + 2\operatorname{cov}[x_1,x_2].$$ I don’t know much about statistics and don’t understand how this represents the variance of the data when projected onto the identity line. Is there a more formal proof of why the quoted bit is true? An intuitive explanation of the result would be appreciated as well.
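For reference, the experiment I describe can be reproduced numerically. Here is a minimal NumPy sketch (the covariance matrix and sample size are illustrative choices of mine) that compares the sample variance of the projected points with $u^T \Sigma u$, using $u=(1,1)$ along the identity line as above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 2-D data with correlated coordinates.
n = 100_000
X = rng.multivariate_normal(mean=[0.0, 0.0],
                            cov=[[2.0, 0.8],
                                 [0.8, 1.0]],
                            size=n)

# Sample covariance matrix of the data (rows are observations).
Sigma = np.cov(X, rowvar=False)

# Direction of the identity line (not unit length, matching the formula above).
u = np.array([1.0, 1.0])

# "Project" each point onto u: the new 1-D data points are u . x_i.
proj = X @ u

# These two numbers agree exactly (up to floating point),
# and both equal var[x1] + var[x2] + 2 cov[x1, x2].
print(np.var(proj, ddof=1))  # sample variance of the new data points
print(u @ Sigma @ u)         # the quadratic form u^T Sigma u
```

(`ddof=1` makes `np.var` use the same normalization as `np.cov`, so the two quantities match exactly rather than just approximately.)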
Edit: My question is why $u^T \Sigma u$ is the variance of the new data points as stated in the question linked above.