
I was looking at a post regarding the orthogonality of the eigenvectors of a symmetric matrix, and wanted to see if the following statement is true, and why?

A matrix is symmetric if and only if its eigenspaces are orthogonal.

Is this true and why?

mdcq

2 Answers


In this form it is not exactly true. For example, we can take a matrix that has no real eigenvalues, and hence no eigenspaces at all, like $$ A=\left(\begin{array}{ll}0&1\\-1&0\end{array}\right). $$ It is not symmetric, and yet technically all of its eigenspaces are orthogonal ))
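A quick numerical sanity check of this counterexample (using NumPy; not part of the original answer): the matrix has no real eigenvalues, so over $\mathbb{R}$ it has no eigenspaces, and it is visibly not symmetric.

```python
import numpy as np

# The matrix from the answer: rotation by 90 degrees.
A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])

eigvals = np.linalg.eigvals(A)  # characteristic polynomial x^2 + 1, roots +i and -i
has_real_eigenvalue = np.any(np.isclose(eigvals.imag, 0.0))
is_symmetric = np.allclose(A, A.T)

print(has_real_eigenvalue)  # False: no real eigenvalues, hence no real eigenspaces
print(is_symmetric)         # False: A is not symmetric
```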

However, the following is true: a real $n\times n$ square matrix $A$ is symmetric if and only if all of its eigenspaces are orthogonal and the sum of these eigenspaces is the whole $\mathbb{R}^n$. This condition is equivalent to saying that there is an orthonormal basis consisting of eigenvectors of $A$, and this is the statement from the post that you mentioned.
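The corrected statement can be checked numerically for the forward direction (again NumPy, added for illustration): a real symmetric matrix admits an orthonormal basis of eigenvectors, which is exactly what `numpy.linalg.eigh` returns.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
S = B + B.T                     # B + B^T is always real symmetric

# eigh is the routine for symmetric/Hermitian matrices: it returns
# eigenvalues and an orthonormal basis of eigenvectors (columns of Q).
vals, Q = np.linalg.eigh(S)

print(np.allclose(Q.T @ Q, np.eye(4)))          # True: the eigenbasis is orthonormal
print(np.allclose(S, Q @ np.diag(vals) @ Q.T))  # True: S = Q D Q^T, eigenspaces sum to R^4
```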

UPDATE: If you're interested in the same question but for matrices over $\mathbb{C}$ and using orthogonality that arises from the standard inner product $(a,b) = \sum a_i \overline{b}_i$, then the statement isn't true. The counterexample is the same matrix $A$ as above. It is not symmetric and not Hermitian, but it has two eigenspaces generated by vectors $(1, i)^T$ and $(1, -i)^T$ which are orthogonal.
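The complex counterexample is also easy to verify (NumPy check added for illustration; `np.vdot` computes the standard inner product, conjugating one argument):

```python
import numpy as np

A = np.array([[0, 1], [-1, 0]], dtype=complex)

v1 = np.array([1, 1j])    # eigenvector for eigenvalue  i
v2 = np.array([1, -1j])   # eigenvector for eigenvalue -i

# They really are eigenvectors...
print(np.allclose(A @ v1, 1j * v1), np.allclose(A @ v2, -1j * v2))
# ...orthogonal under the standard inner product...
print(np.isclose(np.vdot(v2, v1), 0))
# ...yet A is neither symmetric nor Hermitian.
print(np.allclose(A, A.T), np.allclose(A, A.conj().T))
```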

Dan Shved
  • What do you mean by "technically"? I left the word "real" out of the original question because that was the part I wasn't really sure about - no pun intended. So my second question is why is that true? – Robert S. Barnes Oct 31 '12 at 08:29
  • Well, you can safely omit the word "technically". I just mean that the set of all eigenspaces is empty, therefore it is true that for any distinct elements $U$ and $V$ of this set $U$ is orthogonal to $V$. There are other counterexamples, I see one of them in the comments to your question, where the set of eigenspaces isn't empty. Maybe these examples look more convincing than mine. – Dan Shved Oct 31 '12 at 08:35
  • 2
    @Robert: Suppose that $A$ has no eigenspaces. Then it’s automatically true that if $x$ and $y$ are vectors in different eigenspaces of $A$, then $x\cdot y=0$. However, it’s true on a technicality: it’s true because there are no such $x$ and $y$, and an implication with a false premise is automatically true. – Brian M. Scott Oct 31 '12 at 08:35
  • As for the word "real" - without it I can't answer your question, because then it isn't immediately clear to me what the word "orthogonal" would mean. – Dan Shved Oct 31 '12 at 08:35
  • 1
    @DanShved: Are you sure you don't want to use the standard inner product on $\mathbb{C}^n$? – wj32 Oct 31 '12 at 08:39
  • @wj32: Well, I wasn't sure that we were talking about $\mathbb{C}$ and not some other field. As for $\mathbb{C}$, I don't mind using the standard inner product, but it looks like then the statement from the original post isn't true at all. Maybe it could be true with some other notion of orthogonality. – Dan Shved Oct 31 '12 at 08:49
  • 1
    @DanShved: What I meant was in the case of $\mathbb{C}$, you could immediately "upgrade" your statement by performing the following substitutions: real -> complex; symmetric -> Hermitian; $\mathbb{R}^n$ -> $\mathbb{C}^n$, and adding a real eigenvalues requirement – wj32 Oct 31 '12 at 08:55
  • So limiting ourselves to $\mathbb{R}^n$, what is the proof? – Robert S. Barnes Oct 31 '12 at 09:02
  • @wj32: You have a point )) – Dan Shved Oct 31 '12 at 09:04
  • 1
    @RobertS.Barnes: If a matrix $M$ has orthogonal eigenspaces that sum to $\mathbb{R}^n$, then choose an orthonormal basis for each eigenspace, resulting in an orthonormal basis for $\mathbb{R}^n$ and an orthogonal matrix $P$. We have $M=PDP^T$ where $D$ is diagonal, so $M^T=PD^T P^T=PDP^T=M$. For the other direction, see http://en.wikipedia.org/wiki/Spectral_theorem#Hermitian_maps_and_Hermitian_matrices – wj32 Oct 31 '12 at 09:08
  • So does the same proof work over $\mathbb{C}^n$ if we use the Hermitian inner product? – Robert S. Barnes Oct 31 '12 at 09:38
  • 1
    @RobertS.Barnes: Yes. As I mentioned before, you need to add in the condition that all eigenvalues must be real. – wj32 Oct 31 '12 at 10:35
  • @wj32 So the statement should be modified to be that a matrix over $\mathbb{C}$ is symmetric iff it has real eigenvalues and its eigenspaces are orthogonal, correct? – Robert S. Barnes Nov 03 '12 at 17:56
  • 1
    @RobertS.Barnes: Yes, but you need to replace "symmetric" with "Hermitian". – wj32 Nov 03 '12 at 20:56
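The construction wj32 sketches in the comments ($M = PDP^T$ with $P$ orthogonal, so $M^T = PD^TP^T = M$) can be checked directly; the particular $P$ and $D$ below are arbitrary choices for illustration:

```python
import numpy as np

# Hand-picked orthonormal eigenbasis as columns of P, eigenvalues on the
# diagonal of D. Any such choice yields a symmetric M.
P = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)  # orthogonal: P^T P = I
D = np.diag([3.0, -2.0])

M = P @ D @ P.T
print(np.allclose(P.T @ P, np.eye(2)))  # True: P is orthogonal
print(np.allclose(M, M.T))              # True: M = P D P^T is symmetric
```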

A definition of "symmetric" which should prove useful here is: if $\langle \cdot, \cdot \rangle$ is a canonically chosen inner product, we say $T$ is symmetric if and only if for all vectors $u,v \in V$ we have $\langle u, Tv \rangle = \langle Tu, v \rangle$.

Assume that the dimension of $V$ is equal to the sum of the dimensions of the eigenspaces $E_1,\dots,E_k$ associated to the eigenvalues $\lambda_1,\dots,\lambda_k$. Then we can write any vectors $u,v \in V$ as $u = \sum_{i=1}^k u_i$ and $v = \sum_{i=1}^k v_i$ with $u_i, v_i \in E_i$. The expression $\langle u, Tv \rangle$ can then be rewritten as $$\Big\langle \sum_i u_i, \, T\sum_i v_i \Big\rangle = \Big\langle \sum_i u_i, \, \sum_i T v_i \Big\rangle = \Big\langle \sum_i u_i, \, \sum_i \lambda_i v_i \Big\rangle.$$

Now we use orthogonality. If the eigenspaces are orthogonal to each other, then this expression becomes $$\sum_i \langle u_i, \lambda_i v_i \rangle = \sum_i \lambda_i \langle u_i, v_i \rangle = \sum_i \langle \lambda_i u_i, v_i \rangle = \Big\langle \sum_i \lambda_i u_i, \, \sum_i v_i \Big\rangle = \langle Tu, v \rangle.$$

The reverse implication isn't hard: simply consider the equality $\langle u, Tv \rangle = \langle Tu, v \rangle$ when $u, v$ are eigenvectors corresponding to two different eigenvalues $\lambda \neq \mu$. It gives $\mu \langle u, v \rangle = \lambda \langle u, v \rangle$, which forces $\langle u, v \rangle = 0$.
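Both directions of this argument can be checked numerically (NumPy, added for illustration): the defining identity holds for a random symmetric matrix, and its eigenvectors for distinct eigenvalues come out orthogonal.

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((3, 3))
T = B + B.T                      # a real symmetric operator

u = rng.standard_normal(3)
v = rng.standard_normal(3)

# Forward direction: <u, Tv> = <Tu, v> for symmetric T.
print(np.isclose(np.dot(u, T @ v), np.dot(T @ u, v)))  # True

# Reverse flavor: eigh returns an orthonormal set of eigenvectors,
# so eigenvectors for distinct eigenvalues are orthogonal.
vals, Q = np.linalg.eigh(T)
print(np.allclose(Q.T @ Q, np.eye(3)))  # True
```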

  • What does $=.$ mean? – Robert S. Barnes Oct 31 '12 at 09:51
  • So the concept of symmetry in matrices is tied to the definition of the inner product. So you can only have symmetric matrices over inner product spaces? Over any inner product space? – Robert S. Barnes Oct 31 '12 at 09:53
  • 1
    @RobertS.Barnes: The notion of a symmetric matrix does not require an inner product to be defined. However, if you have an inner product space (over $\mathbb{R}$ or $\mathbb{C}$) then by definition a self-adjoint (i.e. "symmetric") operator $T$ is one that satisfies $\langle u, Tv \rangle = \langle Tu,v \rangle$ for all $u,v$. Without a traditional inner product, there are various things you could do. See this question for more details: http://math.stackexchange.com/questions/58146/transpose-of-a-linear-mapping – wj32 Oct 31 '12 at 10:45
  • What are $u_i$ and $v_i$? – mdcq Apr 20 '18 at 17:05