
Given a matrix $\textbf{A} \in \mathbb{C}^{n\times n}$ with eigenvalues $\lambda_1, ..., \lambda_n$ and eigenspaces $$ E_{\lambda_j} = \{\textbf{v} \in \mathbb{C}^n \mid \textbf{Av} = \lambda_j \textbf{v}\}, \quad j = 1,...,n, $$ is there an $\underline{easy}$ way to construct a matrix $\textbf{B} \in \mathbb{C}^{n\times n}$ with the same eigenspaces as $\textbf{A}$, but with different eigenvalues? Say, for example, you wanted the dominant $k$ eigenvalues of $\textbf{B}$ to match the dominant $k$ eigenvalues of $\textbf{A}$, but with opposite sign. Would there be a $\underline{fast}$ way to accomplish this, without having to compute the eigenvalues of $\textbf{A}$ ahead of time?

One possibly relevant fact that I know is that two diagonalizable matrices commute if and only if they are simultaneously diagonalizable, i.e., share a common eigenbasis.

I am looking at this problem from a numerical analysis perspective, so methods that require me to compute all the eigenvectors ahead of time would not be useful for my application. Answers providing relevant theorems or useful observations are appreciated, as are partial answers and ideas.

A. B. Marnie
  • $\textbf{A}^2$ will have the same eigenvectors as $\textbf{A}$; more generally, for any polynomial $p$, the matrix $p(\textbf{A})$ will have the same eigenvectors. From here you can go further by replacing $p$ with more general functions, which, however, requires more advanced techniques. – user8268 Mar 25 '17 at 19:05
  • This would be a good method if we know a lot about the spectrum of $\textbf{A}$. It is also useful for forming a positive/negative semidefinite matrix $\textbf{B}$ with the same eigenbasis as $\textbf{A}$, since $\textbf{B} = \textbf{A}^2$ would have nonnegative eigenvalues. Thanks for the comment. – A. B. Marnie Mar 25 '17 at 19:29
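As a quick numerical sanity check of user8268's comment (a sketch in NumPy; the random symmetric test matrix is my choice, used only because symmetry guarantees diagonalizability), $p(\textbf{A})$ with $p(x) = x^2$ keeps the eigenvectors of $\textbf{A}$ while squaring its eigenvalues:

```python
import numpy as np

# Random symmetric test matrix (symmetric => diagonalizable with real spectrum).
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
A = A + A.T

eigvals, eigvecs = np.linalg.eigh(A)

# B = p(A) with p(x) = x^2: same eigenvectors, eigenvalues lambda^2.
B = A @ A
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(B @ v, lam**2 * v)
```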

2 Answers


I am implicitly assuming that $A$ is diagonalizable, although we can use generalized eigenvectors via the Jordan canonical form as well.

Let $P$ be the matrix which diagonalizes $A$, namely $$A = PD_AP^{-1},$$ where the diagonal matrix $D_A$ displays the eigenvalues of $A$ in some desired order. Note that $P$ is precisely the matrix formed with the corresponding eigenvectors of $A$ as the columns.

If you want to change the spectrum while retaining the eigenspace structure, all you need to do is change the diagonal matrix to display the eigenvalues you desire. Namely, for any diagonal matrix $D$, the matrix $$B = PDP^{-1}$$ will have the same eigenspace structure (if you break degeneracy, i.e., reduce the algebraic multiplicity of the eigenvalues in the process, then you are free to choose which subspaces of the original eigenspaces become the new eigenspaces by choosing the columns of $P$ appropriately), but with eigenvalues displayed on $D$.
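A minimal NumPy sketch of this construction (the symmetric test matrix and the replacement spectrum are my choices; symmetry is used only so that `eigh` gives a diagonalization):

```python
import numpy as np

# Random symmetric test matrix, so A = P @ diag(eigvals) @ P^{-1} with P orthogonal.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
A = A + A.T

eigvals, P = np.linalg.eigh(A)

# Replace the spectrum with any diagonal D while keeping the eigenvectors.
new_eigvals = np.array([5.0, -2.0, 1.0])
B = P @ np.diag(new_eigvals) @ np.linalg.inv(P)

# B has the chosen eigenvalues on the same eigenspaces as A.
for mu, v in zip(new_eigvals, P.T):
    assert np.allclose(B @ v, mu * v)
```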

In particular, a very general thing we can define is the functional calculus of a diagonalizable matrix $A$. Namely, given some function $f$, how should we reasonably define $f(A)$? One suitable way of defining $f(A)$ is via $$f(A) = Pf(D_A)P^{-1},$$ where $f(D_A)$ is the diagonal matrix obtained from $D_A$ by applying $f$ entrywise. In this way, we can have arbitrary functional dependence of the eigenvalues of $A$, while maintaining the general eigenspace structure.
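The functional calculus can be sketched in a few lines of NumPy (the helper name `matrix_function` and the symmetric test matrix are my choices; for a symmetric matrix $P$ is orthogonal, so $P^{-1} = P^T$):

```python
import numpy as np

def matrix_function(A, f):
    """f(A) = P f(D) P^{-1} for a symmetric A, with f applied entrywise to D."""
    eigvals, P = np.linalg.eigh(A)
    return P @ np.diag(f(eigvals)) @ P.T  # P orthogonal, so P^{-1} = P.T

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
A = A + A.T

# Sanity check against the polynomial f(x) = x^2 + 1, which must equal A @ A + I.
F = matrix_function(A, lambda x: x**2 + 1)
assert np.allclose(F, A @ A + np.eye(4))
```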

EuYu
  • This is a good answer, but I probably didn't stress the fast part of my question enough. I am looking at this problem from a numerical analysis perspective, where I would like to precondition $\textbf{A}$ to make finding its eigenvalues easier. In other words, methods that require me to compute the eigenvalues/eigenvectors of $\textbf{A}$ ahead of time are not useful. I upvoted your answer anyways since I probably wasn't clear enough in my question (and since your method obviously works). – A. B. Marnie Mar 25 '17 at 19:23

Suppose that we are given a normal matrix $\rm A$. Suppose that we know that $(\lambda, \rm v)$ is an eigenpair of $\rm A$, with $\rm v$ normalized so that $\rm v^* v = 1$, and that we would like to build a matrix $\rm B$ that has the same eigenpairs as matrix $\rm A$ except that the eigenvalue corresponding to eigenvector $\rm v$ is $\mu$. Hence,

$$\rm B := A - \lambda v v^* + \mu v v^* = A + (\mu - \lambda) v v^*$$
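A quick NumPy check of this rank-one update (the random symmetric test matrix and target value $\mu = 7$ are my choices; `eigh` is used only to set up the experiment, and it returns unit-norm eigenvectors, as the formula requires):

```python
import numpy as np

# Random symmetric (hence normal) test matrix.
rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))
A = A + A.T

# In practice (lam, v) would come from, e.g., a power iteration;
# here we take one eigenpair from eigh just to verify the identity.
eigvals, eigvecs = np.linalg.eigh(A)
lam, v = eigvals[0], eigvecs[:, 0]

mu = 7.0
B = A + (mu - lam) * np.outer(v, v)

# v's eigenvalue is now mu; the remaining eigenpairs are untouched.
assert np.allclose(B @ v, mu * v)
for lam_j, v_j in zip(eigvals[1:], eigvecs[:, 1:].T):
    assert np.allclose(B @ v_j, lam_j * v_j)
```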

For instance, suppose that we would like to reverse the sign of the eigenvalue corresponding to eigenvector $\rm v$, i.e., $\mu = - \lambda$. Hence,

$$\rm B = A - 2 \lambda v v^* = \left( I - 2 v v^* \right) A$$

where $\rm \left( I - 2 v v^* \right)$ is a Householder matrix. Thus, reversing the sign of a single eigenvalue can be done without computing all the eigenvalues and eigenvectors, by left-multiplying the matrix by the appropriate Householder matrix.
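A sketch of the Householder trick in NumPy (the random symmetric test matrix is my choice; only the single eigenpair $(\lambda, \rm v)$ is used in the update itself, the full `eigh` call just sets up the test):

```python
import numpy as np

# Random symmetric (hence normal) test matrix.
rng = np.random.default_rng(4)
A = rng.standard_normal((5, 5))
A = A + A.T

# One known unit-norm eigenpair (in practice from, e.g., a power iteration).
eigvals, eigvecs = np.linalg.eigh(A)
lam, v = eigvals[0], eigvecs[:, 0]

H = np.eye(5) - 2.0 * np.outer(v, v)  # Householder reflector
B = H @ A

# Same as the rank-one update B = A - 2*lam*v v^*, and v's eigenvalue flips sign.
assert np.allclose(B, A - 2.0 * lam * np.outer(v, v))
assert np.allclose(B @ v, -lam * v)
```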

  • Related: http://math.stackexchange.com/a/1805122/339790 – Rodrigo de Azevedo Mar 25 '17 at 19:39
  • In general, I guess having $A$ be normal should be sufficient? – EuYu Mar 25 '17 at 19:47
  • @EuYu Good question. I think that is correct because $$\rm \left( I - 2 v v^* \right) A = A - 2 v v^* A = A - 2 v (A^* v)^* = A - 2 v ( \lambda^* v )^* = A - 2 \lambda v v^*$$ Do you agree? – Rodrigo de Azevedo Mar 25 '17 at 20:06
  • Yes. I think it's necessary and sufficient for $A$ to be unitarily diagonalizable for this procedure to work. So if and only if $A$ is normal. – EuYu Mar 25 '17 at 20:42