8

What are the eigenvalues of the linear operator on the vector space $M_n(\mathbb R)$ given by $$ f(X) = AXA^T $$ or by $$ f(X) = AXA^{-1}, $$ when the eigenvalues of $A$ are $ \lambda_1, \lambda_2, \ldots, \lambda_n $?

I suspect that in the first case they are $\{\lambda_i \cdot \lambda_j \ | \ i,j \in \{1,2, \ldots, n\}\} $ and in the second $\{\lambda_i/\lambda_j \ |\ i,j \in \{1,2, \ldots, n\}\} $, but I can't prove it.

More generally, what are the eigenvalues of $$ f(X) = AXB $$ when we know the eigenvalues of $A$ and $B$?
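For what it's worth, a quick numerical check supports both guesses. This is just my own rough sketch using NumPy (the helper `operator_matrix` is something I made up to build the matrix of $f$ in the basis of standard matrices $E_{ij}$, and I assume a random $A$, which is generically invertible with distinct eigenvalues):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))   # generically invertible

def operator_matrix(f, n):
    # Matrix of the linear map f on M_n(R), in column-stacked (vec)
    # coordinates with respect to the standard basis matrices E_ij.
    cols = []
    for j in range(n):
        for i in range(n):
            E = np.zeros((n, n))
            E[i, j] = 1.0
            cols.append(f(E).flatten(order='F'))
    return np.column_stack(cols)

lam = np.linalg.eigvals(A)
F1 = operator_matrix(lambda X: A @ X @ A.T, n)
F2 = operator_matrix(lambda X: A @ X @ np.linalg.inv(A), n)

products  = np.sort_complex([li * lj for li in lam for lj in lam])
quotients = np.sort_complex([li / lj for li in lam for lj in lam])

print(np.allclose(np.sort_complex(np.linalg.eigvals(F1)), products))   # expect True
print(np.allclose(np.sort_complex(np.linalg.eigvals(F2)), quotients))  # expect True
```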

nonuser
  • 90,026
J. Doe
  • 158
  • I would start by trying some $2 \times 2$ cases. Make sure you try some non-diagonalisable matrices too, since they can be tricky. This might help better formulate a conjecture about the general case, and illuminate ways of proving it. – Theo Bendit Jan 10 '18 at 23:57

2 Answers

6

The key observation is that you can transform your expressions by using the Kronecker product (sometimes called the tensor product) in the following way:

$$\operatorname{vec}(f(X))=\operatorname{vec}(AXA^T)=((A^T)^T \otimes A) \operatorname{vec}(X)=(A \otimes A)\operatorname{vec}(X).$$

You will find all the details at https://en.wikipedia.org/wiki/Kronecker_product; see in particular the sections "Matrix equations" and "Abstract properties: Spectrum".
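If it helps, here is a minimal numerical check of that identity (my own sketch, assuming NumPy; note that the column-stacking $\operatorname{vec}$ corresponds to `flatten(order='F')`, not NumPy's default row-major flattening):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))
X = rng.standard_normal((n, n))

vec = lambda M: M.flatten(order='F')   # column-stacking vec

# vec(A X A^T) == (A ⊗ A) vec(X)
print(np.allclose(vec(A @ X @ A.T), np.kron(A, A) @ vec(X)))   # expect True
```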

The same remark applies more generally to:

$$\operatorname{vec}(f(X))=\operatorname{vec}(AXB)=(B^T \otimes A) \operatorname{vec}(X),$$

where $B^T \otimes A$ is an $n^2 \times n^2$ matrix and $\operatorname{vec}(X)$ is the $n^2 \times 1$ column vector obtained by "piling up" (stacking) the $n$ columns of $X$.
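The spectrum statement can be checked the same way: the eigenvalues of $B^T \otimes A$ are all the products $\lambda_i \mu_j$ of eigenvalues of $A$ and $B$. Again just a rough NumPy check of mine, assuming generic random matrices so the sorted comparison is safe:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

lam = np.linalg.eigvals(A)
mu = np.linalg.eigvals(B)          # B and B^T have the same eigenvalues

eigs_op  = np.sort_complex(np.linalg.eigvals(np.kron(B.T, A)))
products = np.sort_complex([l * m for l in lam for m in mu])
print(np.allclose(eigs_op, products))   # expect True
```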

Jean Marie
  • 81,803
  • 1
    If one is interested in the eigenvectors of the map also, observe that if $Ax=\lambda x,$ $B^{T}y=\gamma y,$ then $X=xy^{T}$ satisfies $AXB=\lambda\gamma X.$ – RideTheWavelet Jan 11 '18 at 00:25
  • Can I ask why the eigenvalues of the Kronecker product are the products of the eigenvalues of the matrices? – J. Doe Jan 11 '18 at 00:31
  • 1
    See theorem 13.12 of this document (http://www.siam.org/books/textbooks/OT91sample.pdf). – Jean Marie Jan 11 '18 at 07:04
6

Partial answer. I'll use the technique of this answer to address your question, subject to an additional assumption.

Theorem. If $A$ and $B^T$ are $n\times n$ matrices with $n$ linearly independent eigenvectors $\def\vec#1{{\bf#1}}\vec v_1,\ldots,\vec v_n$ and $\vec w_1,\ldots,\vec w_n$ respectively and eigenvalues $\lambda_i$ and $\mu_j$ respectively, then the transformation $$f:M_n\to M_n\ ,\quad f(X)=AXB$$ has eigenvectors $$X_{ij}=\vec v_i\vec w_j^T$$ with eigenvalues $\lambda_i\mu_j$.

Proof. We have $$f(X_{ij})=A\vec v_i\vec w_j^T B=A\vec v_i(B^T\vec w_j)^T =\lambda_i\mu_jX_{ij}\ .$$ The $X_{ij}$ are linearly independent (see above link), so in particular none of them is zero and they are all eigenvectors. Finally, we now have $n^2$ linearly independent eigenvectors in a space of dimension $n^2$, so there are no more to be found.
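For concreteness, here is a small numerical illustration of the theorem (my own sketch, assuming NumPy and that the randomly generated $A$ and $B^T$ are diagonalisable, which is the generic case):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

lam, V = np.linalg.eig(A)     # columns of V: eigenvectors v_i of A
mu, W = np.linalg.eig(B.T)    # columns of W: eigenvectors w_j of B^T

ok = True
for i in range(n):
    for j in range(n):
        X = np.outer(V[:, i], W[:, j])                # X_ij = v_i w_j^T
        ok &= np.allclose(A @ X @ B, lam[i] * mu[j] * X)
print(ok)   # expect True
```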

David
  • 82,662