
The symmetric matrix A below has eigenvalues 3 and 6 (multiplicity 2). Find an orthonormal basis B of R3 consisting of eigenvectors of A.

$$A = \begin{bmatrix}5&1&-1\\ 1&5&1\\ -1&1&5 \end{bmatrix}$$

So far I have tried $3I-A$, which has given me: $$\begin{bmatrix}1\\-1\\1\end{bmatrix}$$

$6I-A$: $$\begin{bmatrix} 1\\1\\0\end{bmatrix}, \begin{bmatrix} 1\\0\\1\end{bmatrix}$$

I am not sure how to continue with this question.

    I'm assuming these vectors you found are the eigenvectors, so you now have a basis of eigenvectors. Do you know an algorithm to turn a basis into an orthonormal basis? Have you heard of the Gram-Schmidt Process? – Christian Mar 15 '17 at 16:44
  • Yes, they are the corresponding eigenvectors. I am not too sure of the Gram-Schmidt process. – Solange Ogaz Mar 15 '17 at 16:50

1 Answer


If a (real) matrix is symmetric (more generally, if a complex matrix is normal), then eigenvectors corresponding to distinct eigenvalues are orthogonal (for a good explanation, see this post here).

Note: you should double-check your computation of the eigenvectors, because your second eigenvector for eigenvalue $6$ is not orthogonal to the eigenvector for eigenvalue $3$, and is in fact not an eigenvector at all. It should be:

$$\begin{bmatrix} -1 \\0 \\1\end{bmatrix}$$
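As a quick sanity check (a sketch I am adding here, not part of the original exchange), one can multiply $A$ by both candidate vectors and compare:

```python
# Sanity check: [-1, 0, 1] is an eigenvector of A with eigenvalue 6,
# while the OP's [1, 0, 1] is not an eigenvector at all.
A = [[5, 1, -1],
     [1, 5, 1],
     [-1, 1, 5]]

def matvec(M, v):
    # Multiply a matrix (given as a list of rows) by a vector.
    return [sum(row[i] * v[i] for i in range(len(v))) for row in M]

print(matvec(A, [-1, 0, 1]))  # [-6, 0, 6] = 6 * [-1, 0, 1]
print(matvec(A, [1, 0, 1]))   # [4, 2, 4], not a scalar multiple of [1, 0, 1]
```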

The Gram-Schmidt process is an algorithm that turns a basis into an orthogonal basis; the details can be found here, or in any standard linear algebra textbook.
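For concreteness, here is a minimal sketch of the (modified) Gram-Schmidt process in plain Python; the function name and list-based representation are my own choices, not from the original answer:

```python
import math

def gram_schmidt(vectors):
    # Orthonormalize a list of linearly independent vectors
    # (modified Gram-Schmidt: subtract projections one at a time).
    basis = []
    for v in vectors:
        w = list(v)
        for b in basis:
            coeff = sum(wi * bi for wi, bi in zip(w, b))  # <w, b>; b has unit length
            w = [wi - coeff * bi for wi, bi in zip(w, b)]
        norm = math.sqrt(sum(wi * wi for wi in w))
        basis.append([wi / norm for wi in w])
    return basis

# Orthonormalize the basis of the eigenvalue-6 eigenspace:
b1, b2 = gram_schmidt([[1, 1, 0], [-1, 0, 1]])
print(b1, b2)  # b2 is proportional to (-1, 1, 2)
```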

So, running this algorithm on each of your bases will produce an orthogonal basis for each eigenspace, and since eigenvectors corresponding to distinct eigenvalues are orthogonal (your matrix is symmetric), concatenating the bases, and then normalizing each vector, will form an orthonormal basis of eigenvectors.

Since this case is small enough, we don't really even need Gram-Schmidt: we just need to find two orthogonal vectors in the span of the eigenvectors with eigenvalue $6$, and they will automatically be orthogonal to the one with eigenvalue $3$.

I set up $$\bigg(a\begin{bmatrix} 1\\1\\0\end{bmatrix} + b\begin{bmatrix}-1\\0\\1\end{bmatrix}\bigg) \cdot \bigg(c\begin{bmatrix} 1\\1\\0\end{bmatrix} + d\begin{bmatrix}-1\\0\\1\end{bmatrix}\bigg) =0$$

and pretty quickly found choices of $a, b, c, d$ that work.
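Filling in that arithmetic (my computation, with $u = (1,1,0)^T$ and $v = (-1,0,1)^T$): since $u\cdot u = v\cdot v = 2$ and $u\cdot v = -1$, the condition expands to

$$2ac - (ad + bc) + 2bd = 0.$$

Taking $a = 1,\, b = 0$ reduces this to $2c - d = 0$, so $c = 1,\, d = 2$ works, giving the second vector

$$u + 2v = \begin{bmatrix}-1\\1\\2\end{bmatrix},$$

which is orthogonal to $u$ and, since the matrix is symmetric, automatically orthogonal to the eigenvalue-$3$ eigenvector $(1,-1,1)^T$.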

Christian
  • Seems like the eigenvectors should be the columns of $V$

    $$ V = \begin{bmatrix} 1 & 1 & -1 \\ -1 & 1 & 1 \\ 1 & 0 & 2 \end{bmatrix}. $$

    – T L Davis Mar 16 '17 at 00:05
  • @TLDavis I'm not quite sure what you mean. Is $V$ your vector space? If so, then it is not a matrix. If you mean that these eigenvectors should be the elements of your basis, then, yes, the columns of the matrix that you found do indeed form an orthonormal basis of eigenvectors. – Christian Mar 16 '17 at 14:54
  • Yes, that’s what I meant. In particular, note the last column. I don't think $[-1,\, 0,\, 1]^T$ is a good eigenvector. – T L Davis Mar 17 '17 at 02:09
  • @TLDavis It is a perfectly good eigenvector (applying $A$ to it returns $-6e_1 + 6e_3$), but it isn't orthogonal to the others, if that's what you mean. I found that vector in my computation of the eigenspace, and my answer indicates that the Gram-Schmidt process (or brute force) should be applied to the basis of eigenvectors with eigenvalue $6$ ($-e_1 + e_3$, together with the OP's other one) to form an orthogonal basis. I wasn't intending to provide the basis, but a method the OP could use to find the answer. I gave that eigenvector to point out a mistake made by the OP, as $e_1 + e_3$ is not an eigenvector. – Christian Mar 17 '17 at 02:40
  • But I thought the eigenvectors of a real symmetric matrix were orthogonal. See the entry under 'Linked' questions for a proof. Seems like if you normalized the columns of $V$, then that would be the basis $B$. – T L Davis Mar 17 '17 at 02:55
  • @TLDavis Eigenvectors of a real symmetric matrix with distinct eigenvalues are orthogonal. And, for a real symmetric matrix, there exists an orthonormal basis of eigenvectors (this is the Spectral Theorem). It's easy to come up with examples of nonorthogonal eigenvectors with the same eigenvalue: take the identity matrix and any two nonorthogonal vectors, for instance. I'm confused about what you're getting at. Yes, as I said, the columns of your $V$ form an orthogonal basis of eigenvectors. (I mistakenly said orthonormal, which is not the case, but they are orthogonal.) – Christian Mar 17 '17 at 14:20
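To close the loop on the discussion above, here is a short check (my own sketch, not part of the thread) that normalizing the columns of $V$ does yield an orthonormal basis $B$ of eigenvectors:

```python
import math

A = [[5, 1, -1],
     [1, 5, 1],
     [-1, 1, 5]]
# Columns of V, paired with their eigenvalues.
cols = [([1, -1, 1], 3), ([1, 1, 0], 6), ([-1, 1, 2], 6)]

def matvec(M, v):
    return [sum(row[i] * v[i] for i in range(len(v))) for row in M]

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

B = []
for v, lam in cols:
    assert matvec(A, v) == [lam * x for x in v]  # each column is an eigenvector
    n = math.sqrt(dot(v, v))
    B.append([x / n for x in v])               # normalize it

# Pairwise orthonormality: B[i] . B[j] should be 1 if i == j, else 0.
for i in range(3):
    for j in range(3):
        expected = 1.0 if i == j else 0.0
        assert abs(dot(B[i], B[j]) - expected) < 1e-12
print("orthonormal eigenbasis:", B)
```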