
I was reading this post: eigenvalues by inspection

which is helpful for a symmetric 3 x 3 matrix, like this:

$$ \begin{bmatrix} 3&2&2\\ 2&3&2\\ 2&2&3 \end{bmatrix} $$
$\text{A)}$ The post said that you can establish the eigenvalues by first subtracting the identity matrix, which leaves three identical rows of 2's, so the result has nullity 2; from that I get that the first eigenvalue has multiplicity 2. What I don't know is how he established that $\lambda = 1$ in the first place. Is it simply the difference between the diagonal and the off-diagonal entries?
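
The step I do follow, written out:

$$ A - I = \begin{bmatrix} 3&2&2\\ 2&3&2\\ 2&2&3 \end{bmatrix} - \begin{bmatrix} 1&0&0\\ 0&1&0\\ 0&0&1 \end{bmatrix} = \begin{bmatrix} 2&2&2\\ 2&2&2\\ 2&2&2 \end{bmatrix}, $$
which has rank 1 and hence nullity 2, so the eigenspace for $\lambda = 1$ is two-dimensional.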

For the second $\lambda$, I understand that one.

$\text{B)}$ Now, I know this worked because the difference between the diagonal and the off-diagonal entries was exactly one, so subtracting the identity matrix made all the rows the same. But what if that is not the case, as in the following matrix?

$$ \begin{bmatrix} 7&1&1\\ 1&7&1\\ 1&1&7 \end{bmatrix} $$
Once again it was obvious to my tutor that this has multiplicity 2 for $\lambda = 6$. How was that established by inspection? The other $\lambda$ for this matrix is 9. So it seems that, with these kinds of matrices, you arrive at the eigenvalues either by subtracting the off-diagonal entry from the diagonal entry or by summing a column. But how do you see the multiplicity without pencil and paper?
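
The pencil-and-paper check, for reference:

$$ A - 6I = \begin{bmatrix} 7&1&1\\ 1&7&1\\ 1&1&7 \end{bmatrix} - \begin{bmatrix} 6&0&0\\ 0&6&0\\ 0&0&6 \end{bmatrix} = \begin{bmatrix} 1&1&1\\ 1&1&1\\ 1&1&1 \end{bmatrix}, $$
which has rank 1 and nullity 2, so $\lambda = 6$ has multiplicity 2, and the column sum $7+1+1 = 9$ gives the remaining eigenvalue.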

$\text{C)}$ I was playing around the other night (with some 2x2 matrices, that is), and it seems that if you have a matrix like this

$$ \begin{bmatrix} a&b\\ b&a \end{bmatrix} $$
then you have $\lambda_1 = a-b$ and $\lambda_2 = a + b$. Once again you could look at this as subtracting the off-diagonal from the diagonal for $\lambda_1$, and summing the column for $\lambda_2$.
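
For the record, the characteristic polynomial confirms this pattern in the 2 x 2 case:

$$ \det \begin{bmatrix} a-\lambda & b\\ b & a-\lambda \end{bmatrix} = (a-\lambda)^2 - b^2 = 0 \quad\Rightarrow\quad a-\lambda = \pm b \quad\Rightarrow\quad \lambda = a \mp b. $$
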
An example would be:

$$ \begin{bmatrix} 2&1\\ 1&2 \end{bmatrix} $$
which has eigenvalues 1 and 3, right?
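
Checking against the pattern above:

$$ \lambda_1 = a - b = 2 - 1 = 1, \qquad \lambda_2 = a + b = 2 + 1 = 3, $$
which also matches the trace: $1 + 3 = 2 + 2$.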

Does anybody have any insight into expedient eigenvalue/eigenvector determination for such matrices? Thanks.

Bucephalus
  • You can read up on the Gershgorin circle theorem if you are curious about where eigenvalues lie https://en.wikipedia.org/wiki/Gershgorin_circle_theorem Also this inspection is a special case of the Singular Value Decomposition (SVD) – mathreadler Oct 04 '17 at 06:30
  • This answer might help. The key is that you break the matrix up into the sum $aI+b\mathbf 1$. In your second example, $a=6$ and $b=1$. – amd Oct 04 '17 at 06:42
  • I think I'm following now, @amd: in the first example $a = 1$, so they used $1\cdot I$, but in the second example I use $6\cdot I$. – Bucephalus Oct 04 '17 at 06:47
  • So I use whatever $aI$ I need to reduce it to three identical rows, which gives the multiplicity of 2. I think that is what you are saying. That's very helpful. @amd – Bucephalus Oct 04 '17 at 06:48
  • By George, I think you’ve got it! ;) A useful fact here is that if $\lambda$ is an eigenvalue of $A$, then $\lambda+\mu$ is an eigenvalue of $A+\mu I$ and vice-versa. – amd Oct 04 '17 at 06:49
  • Wow, thanks @amd, that trick with the trace(A) is a screamer. I'm sure the technique you outlined in that post will have positive ramifications in my exam tomorrow. If you can post that link as an answer I can close the question. Thanks. – Bucephalus Oct 04 '17 at 06:57
  • H. H. Rugh’s answer generalizes this decomposition. – amd Oct 04 '17 at 19:34

1 Answer


Consider an $n$ by $n$ matrix of the form $A = a\,{\bf 1}_n + b\,(v v^t)$, with ${\bf 1}_n$ the $n \times n$ identity matrix and $v$ any $n$-vector.

Then $Av = av + b\,v\,(v^t v) = \lambda v$, with eigenvalue $\lambda = a + b\,(v^t v)$ of multiplicity 1. When $z$ is orthogonal to $v$, then $Az = az$. The dimension of the orthogonal complement of $v$ is $n-1$, so you have $n-1$ eigenvalues equal to $a$.

In the case $n=3$ and $v^t = [1,1,1]$ you have $v^t v = 3$, so the eigenvalues are $a+3b$ with multiplicity 1 and $a$ with multiplicity 2. The diagonal elements equal $a+b$ and the off-diagonal elements equal $b$.
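
For instance, the second matrix in the question fits this form with $a = 6$, $b = 1$ and $v^t = [1,1,1]$:

$$ \begin{bmatrix} 7&1&1\\ 1&7&1\\ 1&1&7 \end{bmatrix} = 6\,{\bf 1}_3 + \begin{bmatrix} 1\\ 1\\ 1 \end{bmatrix} \begin{bmatrix} 1&1&1 \end{bmatrix}, $$
so the eigenvalues are $a + 3b = 9$ with multiplicity 1 (eigenvector $v$) and $a = 6$ with multiplicity 2.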

H. H. Rugh