
I am attempting to learn more about the adjacency matrix (graph theory), but since I have forgotten a lot of linear algebra, I can't see why this is true, i.e. why a real symmetric matrix has a complete set of orthogonal eigenvectors. Can someone give me a proof?

  • Google is your friend: http://www.quandt.com/papers/basicmatrixtheorems.pdf – Alex R. Jun 09 '15 at 02:50
  • Because if $A$ is the matrix in question, an $A$ invariant subspace is also $A^T$ invariant. – copper.hat Jun 09 '15 at 02:54
  • Perhaps this helps: http://math.stackexchange.com/questions/82467/eigenvectors-of-real-symmetric-matrices-are-orthogonal – Hans Lundmark Jun 09 '15 at 05:41
  • @HansLundmark OK. So we know that "eigenvectors corresponding to different eigenvalues are orthogonal to each other". But doesn't the key phrase "a complete set" mean that there are $n$ eigenvalues and one eigenvector corresponding to each of the eigenvalues? I do not think the post says that, but it does say that you can find $n$ orthogonal eigenvectors. – tintinthong Jul 24 '15 at 02:57
  • There are $n$ eigenvalues if you count them with multiplicity. If they are all distinct, then you at once have a basis of $n$ orthogonal eigenvectors. The difficult thing to show is that if the matrix happens to have a (say) triple eigenvalue, then there really must be a corresponding eigenspace of dimension three (not just one or two), so that you can get three basis eigenvectors from this eigenspace; see the numerical sketch after these comments. This is the point of the answer by user level1807 in the question that I linked to. – Hans Lundmark Jul 24 '15 at 08:11
  • ... and also of the last theorem in the note that Alex R linked to above. – Hans Lundmark Jul 24 '15 at 08:21
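
As a concrete illustration of the point made in the comments, here is a hedged NumPy sketch (the matrix, its eigenvalues, and the random seed are made up for the example): a symmetric matrix with a triple eigenvalue still comes with a three-dimensional eigenspace, so `np.linalg.eigh` returns three orthonormal eigenvectors for it.

```python
import numpy as np

# Hypothetical example: a 4x4 symmetric matrix with eigenvalue 2 of
# multiplicity 3, built as Q D Q^T with Q orthogonal so the spectrum
# is known by construction.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # random orthogonal matrix
D = np.diag([2.0, 2.0, 2.0, 5.0])                 # triple eigenvalue 2
A = Q @ D @ Q.T

w, V = np.linalg.eigh(A)        # eigh: eigensolver for symmetric matrices
print(np.round(w, 6))           # -> [2. 2. 2. 5.]

# eigh returns an orthonormal basis of eigenvectors, so the eigenspace for
# the triple eigenvalue 2 really is three-dimensional: V^T V = I.
print(np.allclose(V.T @ V, np.eye(4)))  # -> True
```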

1 Answer


If a matrix $A$ is diagonalizable, then its minimal polynomial is $$ m(\lambda)=(\lambda-\lambda_1)(\lambda-\lambda_2)\cdots(\lambda-\lambda_k) $$ where $\lambda_j$ are the distinct eigenvalues.
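
Before the converse, a quick numerical sanity check of this direction (an illustrative NumPy sketch; the matrix is a made-up example): a diagonalizable matrix with a repeated eigenvalue is already annihilated by the product of its distinct linear factors.

```python
import numpy as np

# Illustrative check: A is diagonalizable with distinct eigenvalues 1 and 3
# (eigenvalue 1 repeated), so m(lambda) = (lambda - 1)(lambda - 3) already
# annihilates A, even though the characteristic polynomial has degree 3.
A = np.diag([1.0, 1.0, 3.0])
I = np.eye(3)
print(np.allclose((A - 1.0 * I) @ (A - 3.0 * I), np.zeros((3, 3))))  # -> True
```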

Conversely, suppose that the minimal polynomial $m$ for a matrix $A$ factors into distinct linear factors, as written above. Then $$ (A-\lambda_l I)\prod_{j\ne l}(A-\lambda_j I)=0. $$ So the non-zero vectors in the range of $\prod_{j\ne l}(A-\lambda_j I)$ are eigenvectors of $A$ with eigenvalue $\lambda_l$. And every vector $x$ can be written as a sum of such vectors because $$ 1 \equiv \sum_{l=1}^{k}\frac{\prod_{j\ne l}(\lambda-\lambda_j)}{\prod_{j\ne l}(\lambda_l-\lambda_j)} $$ (both sides are polynomials of degree at most $k-1$ that agree at the $k$ points $\lambda_1,\dots,\lambda_k$; this is Lagrange interpolation of the constant $1$) and, hence, $$ I = \sum_{l=1}^{k}\frac{1}{\prod_{j\ne l}(\lambda_l-\lambda_j)} \prod_{j\ne l}(A-\lambda_j I). $$ Therefore, a matrix is diagonalizable (equivalently, has a basis of eigenvectors) iff its minimal polynomial factors into a product of distinct linear factors.
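
The last display can be checked numerically. Here is an illustrative NumPy sketch (the eigenvalues and the random orthogonal conjugation are assumptions for the example) verifying that the scaled products sum to the identity and that each one maps every vector into the corresponding eigenspace:

```python
import numpy as np

# Illustrative diagonalizable matrix with distinct eigenvalues 1, 2, 4,
# conjugated by a random orthogonal matrix so A is not simply diagonal.
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
lams = [1.0, 2.0, 4.0]
A = Q @ np.diag(lams) @ Q.T
I = np.eye(3)

# P_l = prod_{j != l} (A - lam_j I) / prod_{j != l} (lam_l - lam_j)
projs = []
for l, lam_l in enumerate(lams):
    num, den = I.copy(), 1.0
    for j, lam_j in enumerate(lams):
        if j != l:
            num = num @ (A - lam_j * I)
            den *= lam_l - lam_j
    projs.append(num / den)

print(np.allclose(sum(projs), I))              # the P_l sum to the identity
print(all(np.allclose(A @ P, lam * P)          # the range of each P_l lies in
          for P, lam in zip(projs, lams)))     # the corresponding eigenspace
```

The matrices $P_l$ built this way are exactly the spectral projections onto the eigenspaces of $A$.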

A real symmetric $A$ satisfies $A^{\star}=A$, where $A^{\star}$ denotes the conjugate transpose. Such an $A$ is a special case of a normal $A$, one for which $A^{\star}A=AA^{\star}$. A normal matrix has the property that $$ \|Ax\|^{2}=(Ax,Ax)=(A^{\star}Ax,x)=(AA^{\star}x,x)=\|A^{\star}x\|^{2}. $$ Applying this identity to the vector $Ax$ shows that $A^{2}x=0$ iff $A^{\star}Ax=0$, which in turn implies $Ax=0$ because $$ 0 = (A^{\star}Ax,x)=(Ax,Ax)=\|Ax\|^{2}. $$ Therefore, the minimal polynomial of a normal $A$ has no repeated factors, because $$ (A-\lambda_{k}I)^{2}(\cdots)x = 0 \iff (A-\lambda_{k}I)(\cdots)x =0. $$ (Note: $A-\lambda_{k}I$ is normal iff $A$ is normal.) Combined with the first part, this shows that a symmetric matrix is diagonalizable; since eigenspaces for distinct eigenvalues are orthogonal and an orthogonal basis can be chosen within each eigenspace, this yields the complete set of orthogonal eigenvectors.
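
To connect this back to the question, a final illustrative NumPy sketch (the rotation angle, matrix sizes, and random seed are arbitrary choices for the example): it checks the norm identity for a normal but non-symmetric matrix, and then that a real symmetric matrix indeed has a complete orthonormal set of eigenvectors.

```python
import numpy as np

rng = np.random.default_rng(2)

# A rotation matrix is normal (A* A = A A* = I) but not symmetric; for a
# real matrix the conjugate transpose A* is just the transpose.
theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
x = rng.standard_normal(2)
print(np.isclose(np.linalg.norm(A @ x), np.linalg.norm(A.T @ x)))  # -> True

# For a real symmetric matrix, eigh returns a complete orthonormal set of
# eigenvectors -- the fact the original question asks about.
S = rng.standard_normal((4, 4))
S = (S + S.T) / 2                          # symmetrize
w, V = np.linalg.eigh(S)
print(np.allclose(V.T @ V, np.eye(4)))     # eigenvectors are orthonormal
print(np.allclose(S @ V, V @ np.diag(w)))  # and they are genuine eigenvectors
```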

Disintegrating By Parts