
I read in this answer that:

If the covariance matrix is $\Sigma$, the covariance after projecting in $u$ is $u^T \Sigma u$.

I fail to see this. How do I get the covariance of a set of points, after projecting those points along the direction $u$, as a function of $u$ and $\Sigma$?

  • More generally, if $X\in\mathbb{R}^n$ and $Y\in\mathbb{R}^m$ are random vectors and $\operatorname{cov}(X,Y)=\Sigma\in\mathbb{R}^{n\times m}$, and $A\in\mathbb{R}^{k\times n}$ and $B\in\mathbb{R}^{\ell\times m}$ are constant (i.e. non-random) matrices, then $\operatorname{cov}(AX,BY)=A\Sigma B^T\in\mathbb{R}^{k\times \ell}$. More tersely, $\operatorname{cov}(AX,BY)=A(\operatorname{cov}(X,Y))B^T$. – Michael Hardy Jul 23 '12 at 01:21
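As a quick numerical sanity check of the identity in this comment, here is a minimal NumPy sketch (the sizes, the data, and the helper `cross_cov` are arbitrary illustrative choices); the rule holds exactly for sample cross-covariances as well, since they are bilinear in their arguments.

```python
import numpy as np

rng = np.random.default_rng(1)


def cross_cov(X, Y):
    # Sample cross-covariance; rows of X and Y are paired observations.
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    return Xc.T @ Yc / (len(X) - 1)


# Paired samples: X in R^4, Y in R^3, with some dependence (illustrative data).
X = rng.normal(size=(500, 4))
Y = X[:, :3] + 0.5 * rng.normal(size=(500, 3))

A = rng.normal(size=(2, 4))   # constant k x n matrix (k = 2, n = 4)
B = rng.normal(size=(5, 3))   # constant l x m matrix (l = 5, m = 3)

Sigma = cross_cov(X, Y)       # n x m

# cov(AX, BY) versus A Sigma B^T, both k x l
print(np.allclose(cross_cov(X @ A.T, Y @ B.T), A @ Sigma @ B.T))  # True
```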

1 Answer


The covariance matrix for a vector quantity $x$ is $\langle xx^\top\rangle-\langle x\rangle\langle x^\top\rangle$. The covariance for the projection $u^\top x$ is

$$\langle u^\top xx^\top u\rangle-\langle u^\top x\rangle\langle x^\top u\rangle=u^\top\langle xx^\top\rangle u-u^\top\langle x\rangle\langle x^\top\rangle u=u^\top\left(\langle xx^\top\rangle-\langle x\rangle\langle x^\top\rangle\right)u\;.$$

The point is basically that you can pull $u$ out of all the expectation values because it's a constant.
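As a sanity check, here is a minimal NumPy sketch (the data and the direction $u$ below are arbitrary illustrative choices): the sample variance of the projections $u^\top x_i$ coincides, up to floating-point error, with $u^\top \hat\Sigma u$ computed from the sample covariance matrix $\hat\Sigma$.

```python
import numpy as np

rng = np.random.default_rng(0)

# A cloud of 1000 correlated points in R^3 (illustrative data).
X = rng.normal(size=(1000, 3)) @ rng.normal(size=(3, 3))

u = np.array([1.0, 2.0, -1.0])       # a fixed direction (need not be unit length)
Sigma = np.cov(X, rowvar=False)      # sample covariance matrix (divides by N - 1)

proj = X @ u                         # the scalar projections u^T x_i
var_proj = np.var(proj, ddof=1)      # sample variance with the same N - 1 normalization

print(np.allclose(var_proj, u @ Sigma @ u))  # True
```

The agreement is exact rather than approximate because the sample covariance is itself of the form $\langle xx^\top\rangle-\langle x\rangle\langle x^\top\rangle$ with averages taken over the data, so the same pulling-out of $u$ applies.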

joriki
  • By $\langle . \rangle$ do you mean expectation? i.e. when you say $\langle xx^\top\rangle$ do you mean $E\left[xx^\top\right]$? – Amelio Vazquez-Reina Jul 23 '12 at 00:58
  • @roseck: Yes. – joriki Jul 23 '12 at 01:41
  • @user815423426: The $<u,v>$ denotes the inner product. In this case it is a normal dot-product between vectors. – raphexion May 08 '13 at 09:45
  • @user2155919: No. a) The notation uses angled brackets $\langle\cdot\rangle$ (which you can produce using \langle and \rangle, respectively), not less/greater symbols $\lt\cdot\gt$. b) You rightly placed a comma between $u$ and $v$ in the notation for the inner product; note that there are no commas in my post. c) I had already replied to the OP's question that their interpretation of the angled brackets as denoting expectation was correct. The dot products are not explicitly reflected in the notation and arise through the matrix multiplication implied by juxtaposition. – joriki May 08 '13 at 10:16
  • @joriki Isn't $u^Tx$ a scalar? Shouldn't the projection of a vector $x$ onto a unit vector $u$ be $(u^Tx)x$? – Shobhit Jan 06 '16 at 16:23
  • @Shobhit: I'd inferred from the OP's formulation "the covariance after projecting in $u$ is $u^T \Sigma u$" that the term "projection" as used in the question refers to the scalar length of what you're referring to as the "projection" (since otherwise it wouldn't have a scalar covariance). As far as I'm aware, both of these uses of the term "projection" are in common use. – joriki Jan 07 '16 at 11:43
  • @joriki That makes sense. However, allow me to be nit-picky by pointing out that your answer seems to use the formula for the covariance of a vector $x$, but $u^\top x$ is a scalar. You should rather use $\operatorname{Cov}(s) = \langle s^2 \rangle - \langle s \rangle^2$ for a scalar $s$, which again reduces to $\langle u^\top xx^\top u\rangle-\langle u^\top x\rangle\langle x^\top u\rangle$ for $u^\top x$, because $(u^\top x)^2=(u^\top x)(x^\top u)$. – Shobhit Jan 07 '16 at 14:56