1

If I have a mean-zero random vector $X$ with covariance matrix $\Sigma = E(XX^T)$, and the eigenvalues of $\Sigma$ are $\lambda_1, \dots, \lambda_d$, how would I represent the covariance matrix in terms of its eigenvalues and eigenvectors? Also, what is the relation between the covariance matrix and the variance?

wasp
  • Diagonalize $\Sigma$. You can do this, since it's a real symmetric matrix. See https://en.wikipedia.org/wiki/Diagonalizable_matrix, https://math.stackexchange.com/questions/357340/importance-of-eigenvalues/357368#357368, ... – Eman Yalpsid Oct 01 '16 at 13:54

1 Answer

0

The covariance matrix is symmetric, so it has an orthonormal eigenvector basis by the spectral theorem. To recover $\Sigma$ from normalised eigenvectors and eigenvalues, form a matrix $P$ whose columns are the eigenvectors, in order, and a diagonal matrix $D$ with the eigenvalues along the diagonal, in the corresponding order. Since $P$ is orthogonal (its columns are orthonormal), $P^{-1}=P^T$, and $\Sigma = PDP^{-1} = PDP^T$. Also see https://en.wikipedia.org/wiki/Matrix_decomposition#Eigendecomposition. The only statistical fact used here is that $\Sigma^T=\Sigma$; everything else is general linear algebra.
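A minimal NumPy sketch of this reconstruction, using a made-up symmetric matrix as $\Sigma$ (any symmetric positive semi-definite matrix would do):

```python
import numpy as np

# A hypothetical 3x3 covariance matrix (symmetric positive semi-definite).
Sigma = np.array([[4.0, 1.0, 0.5],
                  [1.0, 3.0, 0.2],
                  [0.5, 0.2, 2.0]])

# np.linalg.eigh is for symmetric matrices; it returns eigenvalues in
# ascending order and an orthogonal matrix whose columns are eigenvectors.
eigvals, P = np.linalg.eigh(Sigma)
D = np.diag(eigvals)

# Since P is orthogonal, P^{-1} = P^T, so Sigma = P D P^T.
Sigma_rec = P @ D @ P.T
print(np.allclose(Sigma, Sigma_rec))  # True
print(np.allclose(P.T @ P, np.eye(3)))  # True: columns are orthonormal
```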

Re terminology: some authors call $\Sigma$ the variance because it is the natural generalisation of univariate variance to multivariate distributions: the variance of each component of your random vector sits on the diagonal of $\Sigma$, while the covariances are off-diagonal. In the univariate case the 1-by-1 "covariance matrix" is trivially diagonal, so it is just equal to the variance. (See "Conflicting nomenclatures and notations" on https://en.wikipedia.org/wiki/Covariance_matrix)
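To make the diagonal/off-diagonal point concrete, here is a small sketch estimating $\Sigma = E(XX^T)$ from simulated mean-zero samples (the 2-d distribution and its parameters are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical mean-zero 2-d Gaussian with variances 2.0 and 1.0
# and covariance 0.7 between the components.
true_cov = np.array([[2.0, 0.7],
                     [0.7, 1.0]])
X = rng.multivariate_normal([0.0, 0.0], true_cov, size=100_000)

# For mean-zero X, E[X X^T] is estimated by the average outer product.
Sigma_hat = (X.T @ X) / len(X)

# Diagonal entries estimate the per-component variances (2.0 and 1.0);
# the off-diagonal entries estimate the covariance (0.7).
print(Sigma_hat.round(2))
```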