I understand that a matrix is a linear transformation of a vector space and that matrix multiplication applies one transformation after another. What I cannot get is a geometric intuition for why multiplying a matrix by its transpose gives a covariance matrix.
- Is this post helpful to you? – Anton Vrdoljak Oct 26 '23 at 12:29
- I am still not convinced: why does it have anything to do with variance? – Jay Oct 26 '23 at 12:35
- Ok. Did you see the other post as well? – Anton Vrdoljak Oct 26 '23 at 12:41
- Yes. It is not what I am looking for. I want to understand why variance comes into the picture. – Jay Oct 26 '23 at 12:46
- Any visual/geometric interpretation is helpful. – Jay Oct 26 '23 at 12:46
- Every symmetric positive semidefinite matrix is a covariance matrix, and every matrix of the form $A^\top A$ is obviously symmetric and PSD. – Kurt G. Oct 26 '23 at 12:51
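(To spell out the claim in the comment above, a short sketch assuming only that $A$ has real entries: symmetry and positive semidefiniteness both follow directly from
$$
(A^\top A)^\top = A^\top (A^\top)^\top = A^\top A,
\qquad
x^\top (A^\top A)\, x = (Ax)^\top (Ax) = \lVert Ax \rVert^2 \ge 0 \quad \text{for every } x.)
$$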
- @Jay Also, it's not clear what you mean by a "geometric intuition" for something yielding a covariance matrix. A geometric explanation would require a geometric representation of a covariance matrix, or perhaps of the concept of covariance itself, and it's not clear what that would be. – Ben Grossmann Oct 26 '23 at 14:38
- @Jay A correction to my earlier comment: can you see that if the rows of an $m \times n$ matrix $A$ have mean zero, then $\frac{1}{n-1} A A^\top$ gives the usual estimate of the covariances, with each row treated as the $n$ samples of one variable? – Ben Grossmann Oct 26 '23 at 14:45
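A minimal numerical check of that last comment, as a sketch in NumPy (the sizes and variable names here are arbitrary, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

m, n = 3, 500                               # m variables, n samples per variable
A = rng.normal(size=(m, n))
A = A - A.mean(axis=1, keepdims=True)       # center each row so it has mean zero

# Sample covariance estimate built from the matrix product
cov_from_product = (A @ A.T) / (n - 1)

# NumPy's estimator treats each row as one variable by default
cov_numpy = np.cov(A)

print(np.allclose(cov_from_product, cov_numpy))   # True
```

Each entry $(i, j)$ of $A A^\top$ is the dot product of row $i$ with row $j$, i.e. the sum of products of centered samples, so dividing by $n-1$ reproduces the usual sample covariance between variables $i$ and $j$.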