
For a matrix $A \in \mathbb{C}^{n \times n}$, does multiplication by a unitary matrix $U$ change the eigenvalues of $A$? So for:

$$Ax = \lambda x \qquad \mathrm{and} \qquad AUy = \mu y $$

does $\lambda = \mu$ for some $x,y \in \mathbb{C}^n$?

I know the above is true for doing left and right multiplication by $U$:

$$ UAU^*y = \mu y \\ AU^*y = \mu U^* y \\ Az = \mu z \\ \therefore \mu = \lambda$$

(defining $z = U^* y$)
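The two-sided case can be checked numerically. Below is a quick numpy sketch (my own example: a random complex $A$ and a random unitary $U$ taken as the Q factor of a QR decomposition) confirming that $UAU^*$ has the same spectrum as $A$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# A random complex matrix and a random unitary U (Q factor of a QR decomposition).
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
U, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))

# U A U* is a similarity transform, so it should have the same eigenvalues as A.
eig_A = np.sort_complex(np.linalg.eigvals(A))
eig_UAUs = np.sort_complex(np.linalg.eigvals(U @ A @ U.conj().T))

print(np.allclose(eig_A, eig_UAUs))  # expect True (up to rounding)
```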

Since unitary matrices are simply rotations, it intuitively makes sense to me that $\mu$ and $\lambda$ should be identical, and only the eigenvectors should differ. The statement is true for singular values (see here), but I'm having trouble proving it for eigenvalues (if it even is true).

Edit

After a quick example in Python, I understand that the above is not true. So instead: where is my thought process going wrong with regard to how unitary matrices/rotations affect eigenvalues?
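The Python experiment isn't shown above; a minimal numpy counterexample along the same lines (my own choice of $A$ and $U$) would be:

```python
import numpy as np

# A diagonal matrix and a 90-degree rotation (orthogonal, hence unitary).
A = np.diag([2.0, 3.0])
U = np.array([[0.0, -1.0],
              [1.0,  0.0]])

print(np.linalg.eigvals(A))      # eigenvalues 2 and 3
print(np.linalg.eigvals(A @ U))  # the purely imaginary pair +/- i*sqrt(6)
```

Here $AU = \begin{pmatrix} 0 & -2 \\ 3 & 0 \end{pmatrix}$ has characteristic polynomial $\lambda^2 + 6$, so its eigenvalues are $\pm i\sqrt 6$: not the eigenvalues of $A$ at all.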

Edit 2

What I was really going for, but did not state correctly, was that:

$$AX = X\Lambda \qquad \mathrm{and} \qquad AUY = YM$$

where $\Lambda$ and $M$ are the diagonal eigenvalue matrices, such that every diagonal entry of $\Lambda$ also appears on the diagonal of $M$.

  • Are you assuming no relationship between $x$ and $y$? Take $U$ to be the identity matrix (which is unitary): then it looks like what you're asking is whether $Ax=\lambda x$ and $Ay=\mu y$ imply $\lambda = \mu$, which is basically asking whether any two eigenvalues of $A$ are equal, which is of course not true. – lisyarus Feb 11 '21 at 19:23
  • I'm mainly focused on the eigenvalues here. I'd imagine there would be a relationship between $x$ and $y$ in your example (in that case, $y = Ix$). – James Wright Feb 11 '21 at 19:25
  • Also, note my edit. I now know that my initial assertion was wrong, so I'm more trying to figure out where my understanding is wrong. Unless I'm missing something important in your example using $U=I$. – James Wright Feb 11 '21 at 19:26
  • You are stating some fact (which you already know to be false) and asking where does your thought process go wrong when deriving this fact; I'm saying that the fact is wrong by a much simpler argument, namely that it implies that any matrix has at most 1 eigenvalue. This has nothing to do with rotations or eigenvalues per se, it's just that in your current formulation $x$ and $y$ are two unrelated vectors and thus there's hardly anything useful you could say about them in general. – lisyarus Feb 11 '21 at 19:32
  • I guess I'm hoping for an explanation in terms of the effect that the unitary matrix has on $A$ such that the above statement is false (or that the above statement implies there is at most 1 eigenvalue). Something in terms of $U$ changing the range/null space. – James Wright Feb 11 '21 at 19:41
  • In other words, I made a statement based on intuition, that I later found out (and you gave a simple example proving) was wrong. Now I'm trying to figure out where my intuition was wrong. – James Wright Feb 11 '21 at 19:45
  • I strongly believe we're having a huge misunderstanding here. As I've said, this has nothing to do with $U$ or how it acts on anything. Let's start simpler: do you understand why $Ax=\lambda x$ together with $Ay = \mu y$ does not imply $\lambda = \mu$? – lisyarus Feb 11 '21 at 19:46
  • I'm not sure I understand the question. But I know that $\mu = \lambda$ iff $\lambda$ has multiplicity greater than 1 or $x = y$. – James Wright Feb 11 '21 at 19:50
  • Consider $A=\begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix}$. We have $Ax=\lambda x$ with $x=\begin{pmatrix}1\\0\end{pmatrix}$ and $\lambda=2$, while $Ay=\mu y$ with $y=\begin{pmatrix}0\\1\end{pmatrix}$ and $\mu=3$. Yet, $\lambda \neq \mu$, that is, $2 \neq 3$. – lisyarus Feb 11 '21 at 19:55
  • I think my understanding/intuition might be better served in trying to understand the differences between eigenvalues vs singular values. ie. Why the statement is true for singular values and not eigenvalues. – James Wright Feb 11 '21 at 20:01
  • It is not true that any singular value of a matrix is equal to any singular value of the rotated matrix (which is your statement with eigenvalues replaced by singular values). What is true is that there is a bijective correspondence between the singular values of a matrix and the singular values of the rotated matrix. – lisyarus Feb 11 '21 at 20:09
  • Ok, maybe that's the misunderstanding you were talking about before. I'm referring to a bijective correspondence between the eigenvalues of a matrix and those of its rotated matrix, so that $A$ has the same spectrum as $AU$. – James Wright Feb 11 '21 at 20:14
  • I've updated the question with what (I think) is more along the lines of what I was thinking. – James Wright Feb 11 '21 at 20:26
  • @JamesWright The absolute value of the product of all the eigenvalues of $A$ and the absolute value of the product of all the eigenvalues of $AU$ will be equal. This can be proven easily by noting that (1) the determinant is equal to the product of the eigenvalues of a matrix, (2) $\text{det}(AU) = \text{det}(A)\text{det}(U)$, and (3) $|\text{det}(U)| = 1$. – Jagerber48 Dec 07 '21 at 14:46
  • For intuition building I've been seeking an algebraic proof of the facts I mentioned in the last comment without relying on the "complicated" alternating multilinear formulas for $\text{det}(A)$, but have not yet had success. See https://math.stackexchange.com/questions/4325195/proof-that-textdetab-textdeta-textdetb-without-explicit-expres?noredirect=1#comment9017897_4325195 – Jagerber48 Dec 07 '21 at 14:47
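The determinant fact from the comments is easy to verify numerically; a small numpy sketch (my own random example) is:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
U, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # real orthogonal, a special case of unitary

# |det| = |product of eigenvalues|, and |det(U)| = 1,
# so the two eigenvalue products agree in modulus even though the spectra differ.
lhs = abs(np.prod(np.linalg.eigvals(A)))
rhs = abs(np.prod(np.linalg.eigvals(A @ U)))
print(np.isclose(lhs, rhs))  # expect True
```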

1 Answer


Your intuition is somewhat correct, but you are applying it incorrectly. When you speak of a rotation, what does it apply to? To vectors, right?

So, if $A$ is the matrix of a linear map, we expect that if we rotate the axes, then the matrix changes, but the underlying map is still the same, so the eigenvalues should be the same. Let's check this.

Let $f$ be a linear map from an $n$-dimensional vector space $E$ into itself, and $\mathcal B$ a basis of $E$. Let $A$ be the matrix of $f$ in that basis. Now let $\mathcal B'$ be another basis, with change-of-basis matrix $P$, which is invertible. That is, if $v$ is a vector in $E$ with coordinates $X\in\Bbb R^n$ in the basis $\mathcal B$, then its coordinates $X'$ in the basis $\mathcal B'$ satisfy (note the order):

$$X=PX'$$

Now, in the new basis, $f$ has another matrix representation $A'$, that is given by

$$A'=P^{-1}AP$$

How does this impact eigenvectors and eigenvalues? Let $u$ be an eigenvector of $A$ for the eigenvalue $\lambda$. Then $Au=\lambda u$ and, since $A=PA'P^{-1}$,

$$PA'P^{-1}u=\lambda u$$

Hence

$$A'P^{-1}u=P^{-1}\lambda u=\lambda P^{-1}u$$

That is, $\lambda$ is also an eigenvalue of $A'$, for the eigenvector $u'=P^{-1}u$, which we can write as $u=Pu'$. For the eigenvector $u'$, we simply apply the change of basis: it's the same vector in $E$; only its representation in a basis changes.

That is, the eigenvectors and eigenvalues of a linear map don't depend on the basis.

Note that for this we don't require $P$ to be unitary.
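This basis-independence can also be confirmed numerically; here is a short numpy sketch (my own example, with a generic invertible $P$ that is deliberately not unitary):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
P = rng.standard_normal((4, 4))  # generic, hence invertible; not unitary

A_prime = np.linalg.inv(P) @ A @ P  # A' = P^{-1} A P

# The same map in a different basis: the spectrum is unchanged.
same = np.allclose(np.sort_complex(np.linalg.eigvals(A)),
                   np.sort_complex(np.linalg.eigvals(A_prime)))
print(same)  # expect True (up to rounding)
```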

  • It is my understanding that the matrix associated with a linear map is with respect to two bases. I'm guessing from what you're saying, it's not possible to apply the change of basis to only one of these bases? – James Wright Feb 11 '21 at 22:14
  • @JamesWright You are right. Here I consider an endomorphism, i.e. a linear map from $E$ to $E$. It's required to even consider eigenvalues. I changed the answer to make it more explicit. – Jean-Claude Arbaut Feb 11 '21 at 22:15
  • Is it also correct to say that the matrix associated with a linear operator $f: E \rightarrow E$ must be with respect to one basis of $E$? Ie. the pair of bases maybe $(\mathcal{B}, \mathcal{B})$ (in the case of $A$) or $(\mathcal{B}', \mathcal{B}')$ (in the case of $A'$), but not $(\mathcal{B}, \mathcal{B}')$? – James Wright Feb 11 '21 at 22:23
  • @JamesWright Eigenvalues and eigenvectors are defined for $f$, and don't rely on a basis (here I proved that they are independent of the basis chosen). It's not impossible to use two different bases, but then the relation $Au=\lambda u$ won't hold, and everything will be more complicated. The relation $f(v)=\lambda v$ will have to be written $Au_{\mathcal B}=\lambda u_{\mathcal B'}$, then use $P$ if you want to at least have the same representation of $v$. I see no reason to do that, and I think it's not a very good idea. – Jean-Claude Arbaut Feb 11 '21 at 22:30
  • Aha! Despite it not being a good idea, I think that example did clear things up a good bit for me. Thanks! – James Wright Feb 11 '21 at 22:41