Standard deviation is usually computed from a one-dimensional list of numbers. But how would one evaluate the standard deviation of a set of points with $N$-dimensional coordinates?
- Covariance matrices contain information about variability in each component as well as dependence between components. – angryavian Dec 29 '20 at 04:56
- @angryavian: I would say it contains information on correlation between components rather than the more general concept of dependence. – tommik Dec 29 '20 at 05:11
- Does this answer your question? Correct Intuition? Standard Deviation and distance in $n$ dimensional space. – cngzz1 Dec 29 '20 at 05:14
- Are you sure this isn't a tensor instead of a matrix in the more general case? Or, if there are two coordinates, would I use 2x2 matrices? – PhiEarl Dec 29 '20 at 05:16
- @PhiEarl Covariance matrices only contain information about pairwise dependence between components, so it is an $N \times N$ matrix. For higher-order dependence involving more than two variables, I suppose the generalization would involve higher-order moments and would result in a tensor. – angryavian Dec 29 '20 at 05:19
2 Answers
It's defined as $\sqrt{|\Sigma|}$, where $|\Sigma|$ is the determinant of the variance-covariance matrix: an $N\times N$ matrix with the individual variances on the main diagonal and the covariances elsewhere.
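For concreteness, here is a minimal numpy sketch (the sample data and names are illustrative) that estimates $\Sigma$ from a cloud of points and takes $\sqrt{|\Sigma|}$:

```python
import numpy as np

# illustrative data: 500 sample points in N = 3 dimensions
rng = np.random.default_rng(0)
points = rng.normal(size=(500, 3))

# np.cov expects one variable per row, hence the transpose;
# Sigma is the N x N variance-covariance matrix
Sigma = np.cov(points.T)

# generalized standard deviation: square root of det(Sigma)
gen_std = np.sqrt(np.linalg.det(Sigma))
print(gen_std)
```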

Suppose $X$ is a random point in $\mathbb R^{n\times1}.$ Then $\mu = \operatorname E(X) \in \mathbb R^{n\times1}$ is defined componentwise and then $$ \Sigma = \operatorname{var}(X) = \operatorname E\big((X-\mu)(X-\mu)^\top\big) \in \mathbb R^{n\times n} $$ is an $n\times n$ nonnegative-definite symmetric matrix.
It can be shown via the (finite-dimensional version of the) spectral theorem that $\Sigma$ has a nonnegative-definite symmetric square root $\Sigma^{1/2}$ (definitely not defined componentwise).
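As a sketch, that square root can be computed numerically through the eigendecomposition the spectral theorem provides (one standard construction among several):

```python
import numpy as np

def sqrtm_psd(Sigma):
    """Nonnegative-definite symmetric square root via the spectral theorem."""
    # Sigma = Q diag(w) Q^T with orthonormal columns in Q, since Sigma is symmetric
    w, Q = np.linalg.eigh(Sigma)
    w = np.clip(w, 0.0, None)  # guard against tiny negative eigenvalues from round-off
    return Q @ np.diag(np.sqrt(w)) @ Q.T

Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
root = sqrtm_psd(Sigma)
print(np.allclose(root @ root, Sigma))  # True
```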
If $A\in\mathbb R^{k\times n}$ then $$ \operatorname{var}(AX) = A\Sigma A^\top \in\mathbb R^{k\times k}. $$ From this it follows that if $\operatorname{var}(Y) = I_n$ (the $n\times n$ identity matrix) and $\Sigma$ is any nonnegative-definite symmetric real matrix, then $\Sigma^{1/2} Y$ has variance $\Sigma^{1/2} I_n \big(\Sigma^{1/2}\big)^\top = \Sigma,$ and therefore every nonnegative-definite symmetric real matrix is a variance.
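A quick empirical check of both claims (the particular matrices below are arbitrary choices, and sample covariances only approximate the population values):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 200_000

Sigma = np.array([[4.0, 1.0, 0.0],
                  [1.0, 2.0, 0.5],
                  [0.0, 0.5, 1.0]])

# symmetric square root of Sigma via eigendecomposition (as in the previous sketch)
w, Q = np.linalg.eigh(Sigma)
root = Q @ np.diag(np.sqrt(w)) @ Q.T

Y = rng.normal(size=(n, m))       # var(Y) = I_n, approximated by the sample
X = root @ Y                      # so var(X) should be close to Sigma
print(np.allclose(np.cov(X), Sigma, atol=0.1))           # True up to sampling error

A = rng.normal(size=(2, n))       # an arbitrary 2 x 3 matrix
# the identity var(AX) = A Sigma A^T, checked empirically
print(np.allclose(np.cov(A @ X), A @ Sigma @ A.T, atol=0.1))
```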
If $\Sigma$ is invertible then it can be shown that a random vector whose density is $$ \frac 1 {(2\pi)^{n/2}} \frac 1 {\sqrt{\det\Sigma}} \exp\left( - \frac 1 2 (x-\mu)^\top \Sigma^{-1} (x-\mu) \right) $$ has expected value $\mu$ and variance $\Sigma.$
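If scipy is available, the formula can be sanity-checked against scipy.stats.multivariate_normal (the particular $\mu$ and $\Sigma$ below are arbitrary):

```python
import numpy as np
from scipy.stats import multivariate_normal

mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.3],
                  [0.3, 1.0]])

def density(x, mu, Sigma):
    """The multivariate normal density written exactly as in the formula above."""
    n = len(mu)
    diff = x - mu
    norm_const = (2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(Sigma))
    # solve(Sigma, diff) computes Sigma^{-1} (x - mu) without forming the inverse
    return np.exp(-0.5 * diff @ np.linalg.solve(Sigma, diff)) / norm_const

x = np.array([0.5, -1.0])
print(np.isclose(density(x, mu, Sigma),
                 multivariate_normal(mean=mu, cov=Sigma).pdf(x)))  # True
```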