
[Image: the quoted solution; its blue bolded line makes the eigenvalue claim discussed below.]

In the blue bolded line, the author claims that because $A = 5\,\mathrm{eye}(4) - \mathrm{ones}(4)$, the eigenvalues of $A$ are the eigenvalues of $5\,\mathrm{eye}(4)$ minus the eigenvalues of $\mathrm{ones}(4)$.

$Ax = \lambda_A x$

$\bigl(5\,\mathrm{eye}(4) - \mathrm{ones}(4)\bigr) x = \lambda_A x$

$\bigl(\lambda_{5\,\mathrm{eye}(4)} - \lambda_{\mathrm{ones}(4)}\bigr) x = \lambda_A x$

The jump between the previous two equalities is only possible if the eigenvectors corresponding to the eigenvalues of $5\,\mathrm{eye}(4)$ and $\mathrm{ones}(4)$ are the same, right? So why didn't the person who wrote this solution make that argument? I do not think it is possible in general to add the eigenvalues of summands to get the eigenvalues of their sum, as the solution author does.
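For concreteness, here is a quick numerical check (my addition; a NumPy sketch mirroring MATLAB's `eye`/`ones`) that the claimed eigenvalues do come out right for this particular $A$:

```python
import numpy as np

# A = 5*eye(4) - ones(4), the matrix from the quoted solution
A = 5 * np.eye(4) - np.ones((4, 4))

# ones(4) is symmetric with eigenvalues {4, 0, 0, 0},
# so the claim predicts eigenvalues {1, 5, 5, 5} for A
print(np.linalg.eigvalsh(A))                          # [1. 5. 5. 5.]
print(5 - np.linalg.eigvalsh(np.ones((4, 4)))[::-1])  # [1. 5. 5. 5.]
```

So the numbers work out here; the question is why.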

user3180
    Yes, in general you cannot simply do that. But because $5 \text{eye}(4)$ is a multiple of the identity, it is ok to make this step in this case. – angryavian Aug 24 '20 at 06:28
  • @angryavian But let's go further: why is adding a scaled identity ok? My claim is that, in general, you can add the eigenvalues of matrices $A$ and $B$ to get eigenvalues of $A+B$ if there is an intersection between the eigenspaces for each $\lambda_{A,i}$ and $\lambda_{B,i}$; the intersecting eigenspace is then the eigenspace corresponding to $\lambda_{A,i} + \lambda_{B,i}$. – user3180 Aug 24 '20 at 06:35
  • It is clear that if $x$ is an eigenvector of matrix A corresponding to $\lambda_A$ and matrix B corresponding to $\lambda_B$, it will also be an eigenvector for matrix A+B corresponding to $\lambda_A + \lambda_B$. However, I'm not able to prove that the eigenvectors formed this way produce an eigenbasis for A+B. – user3180 Aug 24 '20 at 06:48

1 Answer


For any matrix $J$, if $Jx=\lambda x$ then $(J+\alpha I)x=(\lambda+\alpha)x$.
Conversely, if $(J+\alpha I)x=\mu x$ then $Jx=(\mu-\alpha)x$.

So the eigenvalues of $J+\alpha I$ are precisely $\lambda_J+\alpha$.
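To illustrate the shift property numerically (a sketch I'm adding, not part of the original answer), using a symmetric matrix so the eigenvalues are real and come out in a fixed order:

```python
import numpy as np

rng = np.random.default_rng(0)
J = rng.standard_normal((4, 4))
J = J + J.T          # symmetric => real eigenvalues, sorted by eigvalsh
alpha = 2.5

# eig(J + alpha*I) = eig(J) + alpha, eigenvector by eigenvector
print(np.allclose(np.linalg.eigvalsh(J + alpha * np.eye(4)),
                  np.linalg.eigvalsh(J) + alpha))     # True
```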

The general theorem in this direction is the spectral mapping theorem: The eigenvalues of $p(J)$ are $p(\lambda_J)$ for any polynomial $p$ (or even any analytic function defined on a neighborhood of the eigenvalues).
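As a concrete instance (my own sketch, with the hypothetical choice $p(t) = t^2 - 3t + 2$): the eigenvalues of $p(J)$, computed directly, match $p$ applied to the eigenvalues of $J$.

```python
import numpy as np

rng = np.random.default_rng(1)
J = rng.standard_normal((4, 4))
J = J + J.T                              # symmetric => real spectrum

pJ = J @ J - 3 * J + 2 * np.eye(4)       # p(J) via matrix products
lam = np.linalg.eigvalsh(J)

# spectral mapping: eig(p(J)) = { p(lambda) : lambda in eig(J) }
print(np.allclose(np.sort(np.linalg.eigvalsh(pJ)),
                  np.sort(lam**2 - 3 * lam + 2)))     # True
```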


Edit: The result extends to any matrix $B$ which shares exactly the same eigenvectors as $J$. That is, $(J+B)x=(\lambda_J+\lambda_B)x$.

However, if that is the case, then let $P$ be the matrix of eigenvectors, so $P^{-1}JP=D_J$ and $P^{-1}BP=D_B$. If the $n$ pairs $(\lambda_J,\lambda_B)$ satisfy some polynomial relation $\lambda_B = p(\lambda_J)$, that is, $D_B=p(D_J)$ (and when the $\lambda_J$ are distinct such a $p$ always exists, by Lagrange interpolation), then $B=Pp(D_J)P^{-1}=p(PD_JP^{-1})=p(J)$. So this case is part of the spectral mapping theorem.
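Here is a sketch of that reduction (my addition; `mat_poly` is a hypothetical helper, just Horner's rule with matrix products): build $J$ and $B$ with a common eigenvector matrix $P$, interpolate a polynomial through the pairs $(\lambda_J,\lambda_B)$, and check that it recovers $B$ from $J$.

```python
import numpy as np

def mat_poly(coeffs, M):
    """Horner's rule with matrix products; coeffs in descending powers."""
    out = np.zeros_like(M)
    for c in coeffs:
        out = out @ M + c * np.eye(M.shape[0])
    return out

rng = np.random.default_rng(2)
P = rng.standard_normal((4, 4))          # generic, hence invertible (a.s.)
lam_J = np.array([1.0, 2.0, 3.0, 4.0])   # distinct eigenvalues of J
lam_B = np.array([5.0, -1.0, 2.0, 7.0])  # B's eigenvalues on the same eigenvectors

J = P @ np.diag(lam_J) @ np.linalg.inv(P)
B = P @ np.diag(lam_B) @ np.linalg.inv(P)

# degree-3 interpolation: p(lam_J) = lam_B, hence D_B = p(D_J) and B = p(J)
p = np.polyfit(lam_J, lam_B, deg=3)
print(np.allclose(B, mat_poly(p, J)))    # True
```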

The most general case is when $BJ=JB$ and both are diagonalizable. See Simultaneous diagonalizability and matrix polynomial.
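A last sketch (again my addition) of the commuting case: when $J$ has distinct eigenvalues, a matrix $B$ commuting with it is diagonalized by the same eigenvector matrix, so the eigenvalues of $J+B$ are the sums along that common basis.

```python
import numpy as np

rng = np.random.default_rng(3)
P = rng.standard_normal((4, 4))
J = P @ np.diag([1.0, 2.0, 3.0, 4.0]) @ np.linalg.inv(P)
B = P @ np.diag([5.0, -1.0, 2.0, 7.0]) @ np.linalg.inv(P)
print(np.allclose(J @ B, B @ J))                     # True: they commute

# The eigenvector matrix of J also diagonalizes B...
w, V = np.linalg.eig(J)
M = np.linalg.inv(V) @ B @ V
print(np.allclose(M, np.diag(np.diag(M))))           # True

# ...so eig(J + B) is the entrywise sum of the paired eigenvalues
print(np.allclose(np.sort(np.linalg.eigvals(J + B).real),
                  np.sort(w.real + np.diag(M).real)))  # True
```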

Chrystomath
  • Can you explain why this holds for $A+B$ when $B$ is diagonal, but not when $B$ is not diagonal? Why can't we make the same argument as in your first line with $B$ an arbitrary matrix? – user3180 Aug 24 '20 at 07:16
  • @user3180 Because $J$ and $B$ need not share any eigenvectors. $(J+B)x=\lambda x+ Bx$, and there may not be any relation between $Bx$ and $x$. If they do share an eigenvector, then you can say something about that eigenvector only. – Chrystomath Aug 24 '20 at 07:20
  • So, why limit B to the class of diagonal matrices? Any B that shares eigenvectors with A can be added. It just so happens that diagonal matrices always have eigenspaces that span all of $R^n$, but any matrix B with eigenspaces that contain the eigenspaces of A can allow summing of the eigenvalues. – user3180 Aug 24 '20 at 07:24
  • @user3180 You're right, but that case is already covered by the spectral mapping theorem. See the edit. – Chrystomath Aug 24 '20 at 07:54
  • Wait, it's not that $B$ must share exactly the same eigenvectors as $J$, right? In the initial post, where $B$ is diagonal, $B$ has an eigenbasis spanning $R^n$ while $J$ (the X = H... above the pagebreak) does not span $R^n$, so the possible eigenvectors are not exactly the same. It's that the eigenspaces of $B$ contain the eigenspaces of $J$. – user3180 Aug 24 '20 at 08:18
  • Also, one issue I am having: although it is easy to show that the individual eigenvalues and eigenvectors of $B$ and $J$, given our containment criterion, produce eigenvalues and eigenvectors for $B+J$, how do we show that the eigenvalues and eigenvectors formed this way produce all eigenvalues and eigenvectors of $B+J$? – user3180 Aug 24 '20 at 08:22
  • (i) "$J$ does not span $R^n$": it's not $J$ that needs to span but its eigenvectors, and these are in fact $H$, the same as the eigenvectors of $A$. Moreover, if the eigenspaces of $B$ contain those of $J$, then they must be equal (by counting dimensions), so they would agree on exactly the same eigenvectors. – Chrystomath Aug 24 '20 at 08:30
  • (ii) "How do we show... all eigenvalues of $B+J$": that's the converse part of the answer. – Chrystomath Aug 24 '20 at 08:31