3

I have a matrix $M$ and a vector $x$. I would like to compute the vector $y= M^n x$.

I found the eigenvalues $\lambda_k$ of $M$ and the left and right eigenvectors, properly normalized. If $M$ were symmetric, I would now project $x$ onto each eigenvector, multiply by $\lambda_k^n$, and then recombine. That would give me $y$.
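
For reference, here is the symmetric-case procedure I have in mind, as a small illustrative NumPy sketch (the matrix and vector are made-up examples):

```python
import numpy as np

# Illustrative sketch of the symmetric case: project x on each eigenvector,
# scale by lambda_k^n, recombine.  M and x are made-up examples.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
M = A + A.T                       # a symmetric example matrix
x = rng.standard_normal(4)
n = 5

lam, V = np.linalg.eigh(M)        # columns of V: orthonormal eigenvectors
c = V.T @ x                       # project x on each eigenvector
y = V @ (lam ** n * c)            # scale by lambda_k^n and recombine

assert np.allclose(y, np.linalg.matrix_power(M, n) @ x)
```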

But $M$ is not symmetric. Is it of any use that I have both the left and right eigenvectors? Would you know how to compute $y$?

EDIT (in answer to Florian below): the matrix whose rows are the right eigenvectors of $M$ is too hard to invert, so my question is whether one can bypass the inversion by using the fact that both sets of eigenvectors are known.

Jennifer
  • 879
  • 1
  • 9
  • 17
  • It is an advantage to have the right eigenvectors. Since $M$ is not symmetric, there is some doubt a priori whether a basis of right eigenvectors exists. It would be a lot of work to get them, but in principle one could express $M$ as a linear combination of them, and powers of $M$ times $x$ by inspection. – hardmath Dec 25 '16 at 22:40
  • I have the eigenvectors, left and right, each with multiplicity $1$. To make sure I understand, could you express what you just said as a formula, I mean "expressing $M$ as a linear combination of them"? – Jennifer Dec 25 '16 at 22:41
  • Can you calculate $y$ numerically? – kotomord Dec 25 '16 at 22:43
  • Of course, but since I've obtained a closed form for the left and right eigenvectors, I'm wondering whether I can use this fact to obtain a closed form for $y$. – Jennifer Dec 25 '16 at 22:44
  • How is $n$ related to the size of the matrix? – abel Dec 26 '16 at 02:14
  • There is no relation. – Jennifer Dec 26 '16 at 08:31

2 Answers

2

If you have a complete basis of right eigenvectors for $M$, say $\{v_1,v_2,\ldots,v_\ell\}$ such that $Mv_k = \lambda_k v_k$, then finding $M^n x$ would be easy if we knew the linear combination $x = \sum_{k=1}^\ell c_k v_k$:

$$ M^n x = \sum_{k=1}^\ell c_k \lambda_k^n v_k $$

The OP asks in an edit whether also knowing the left eigenvectors will help us to get that linear combination. In the present Question the OP stipulated in the Comments that these were complete bases and that the multiplicity of each eigenvalue is $1$.

The fact is that left and right eigenvectors of distinct eigenvalues are orthogonal. This sort of thing has been shown in previous Questions here, but often the discussion is confounded by additional matrix properties (like normality) that are unnecessary. So let's give the one-line proof.

Suppose $v$ is a right eigenvector with $Mv = \lambda v$ and $u^T$ is a left eigenvector with $u^T M = \mu u^T$. If $\lambda \neq \mu$, then:

$$ \lambda (u^T v) = u^T M v = \mu (u^T v) $$

by applying $M$ to either the left or the right eigenvector. Hence $(\lambda - \mu)(u^T v) = 0$, and since the first factor is nonzero, $u^T v$ must be zero.

Therefore if the left eigenvectors $\{u_1,u_2,\ldots,u_\ell\}$ are ordered so that $u_j^T M = \lambda_j u_j^T$ to match the order of right eigenvectors, then taking "inner products" gives the coefficients:

$$ u_j^T x = \sum_{k=1}^\ell u_j^T (c_k v_k) = c_j (u_j^T v_j) $$

because all the omitted terms in the summation are zero. Thus:

$$ c_j = \frac{u_j^T x}{u_j^T v_j} $$

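To make the recipe concrete, here is a minimal NumPy sketch (not part of the derivation above; the matrix $M$, vector $x$ and exponent $n$ are made-up examples, and the eigenvalues are assumed simple). Since `numpy.linalg.eig` does not return matched left/right pairs, the left eigenvectors are taken from $M^T$ and re-ordered to match the eigenvalues of $M$:

```python
import numpy as np

# Sketch of the recipe: c_j = (u_j^T x) / (u_j^T v_j), then M^n x = sum_j c_j lam_j^n v_j.
# Assumptions: M is diagonalizable with simple eigenvalues; M, x, n are made-up examples.
rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
x = rng.standard_normal(5)
n = 7

lam, V = np.linalg.eig(M)                    # right eigenvectors v_k = columns of V
mu, W = np.linalg.eig(M.T)                   # eigenvectors of M^T are left eigenvectors of M
order = [int(np.argmin(np.abs(mu - l))) for l in lam]
U = W[:, order]                              # u_k = U[:, k] now pairs with lam[k]

c = (U.T @ x) / np.sum(U * V, axis=0)        # coefficients c_j = (u_j^T x) / (u_j^T v_j)
y = V @ (lam ** n * c)                       # recombine: M^n x

assert np.allclose(y, np.linalg.matrix_power(M, n) @ x)
```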

Added:

Since this post was motivated to an extent by concern that the inverse of the matrix $B$ whose columns are (the known basis of) right eigenvectors might be difficult to compute, it's worth noting what the existence of the basis of left eigenvectors says about forming that inverse.

Let $A$ be the $\ell \times \ell$ matrix whose rows are the left eigenvectors, in the same eigenvalue order as the right eigenvectors (the columns of $B$). In terms of the notation used above:

$$ A = \begin{pmatrix} u_1^T \\ u_2^T \\ \vdots \\ u_\ell^T \end{pmatrix} \; , \; B = \begin{pmatrix} v_1 & v_2 & \cdots & v_\ell \end{pmatrix} $$

As shown above, $AB$ is a diagonal matrix $D$ with entries $D_{jj} = u_j^T v_j$, which explains the comment of Robert Israel. Since $A$ and $B$ are invertible (being of rank $\ell$), $D$ is also invertible. Thus:

$$ B^{-1} = D^{-1} A$$

The computation of $B^{-1}$ is thus tractable once the left eigenvalue/eigenvector pairs are known. From this we can conclude that $M$ is diagonalized by the similarity transformation $B^{-1} M B$.
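
A quick numerical check of these identities (again an illustrative sketch with a made-up matrix; as in the sketch above, the left eigenvectors come from $M^T$ and are matched to the right eigenvectors by nearest eigenvalue):

```python
import numpy as np

# Check: A B is diagonal, B^{-1} = D^{-1} A, and B^{-1} M B is diagonal.
# M is a made-up example with simple eigenvalues.
rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))

lam, B = np.linalg.eig(M)                                   # columns of B: right eigenvectors
mu, W = np.linalg.eig(M.T)
A = W[:, [int(np.argmin(np.abs(mu - l))) for l in lam]].T   # rows of A: matched left eigenvectors

D = A @ B
assert np.allclose(D, np.diag(np.diag(D)))                  # off-diagonal entries vanish
B_inv = np.diag(1.0 / np.diag(D)) @ A                       # B^{-1} = D^{-1} A
assert np.allclose(B_inv @ B, np.eye(4))
assert np.allclose(B_inv @ M @ B, np.diag(lam))             # the similarity diagonalizes M
```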

hardmath
  • 37,015
1

Is $M$ diagonalizable? If so, you can write $M$ as $M=P \cdot \Lambda \cdot P^{-1}$, where $P$ is the matrix whose columns are the eigenvectors of $M$ and $\Lambda$ is the diagonal matrix of the eigenvalues $\lambda_k$. In this case, we have $M^n = P \cdot \Lambda^n \cdot P^{-1}$. Based on this result you can do what you intended: "project" $x$ by computing $P^{-1} \cdot x$, multiply by $\lambda_k^n$, and then recombine using $P$.

This is possible if $M$ is diagonalizable, which is true if and only if, for each eigenvalue of $M$, its geometric and algebraic multiplicities agree.
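
For completeness, a small NumPy sketch of this procedure (with a made-up matrix, vector and exponent; note that it sidesteps the explicit inverse with a linear solve, which does not address the closed-form difficulty raised in the comments below):

```python
import numpy as np

# Sketch of the diagonalization route M^n x = P Lambda^n P^{-1} x,
# assuming M is diagonalizable.  M, x and n are made-up examples.
rng = np.random.default_rng(3)
M = rng.standard_normal((4, 4))
x = rng.standard_normal(4)
n = 6

lam, P = np.linalg.eig(M)            # M = P diag(lam) P^{-1}
c = np.linalg.solve(P, x)            # "project": c = P^{-1} x, via a solve instead of an explicit inverse
y = P @ (lam ** n * c)               # scale by lambda_k^n and recombine using P

assert np.allclose(y, np.linalg.matrix_power(M, n) @ x)
```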

Florian
  • 2,610
  • 1
    I think that $M$ is diagonalizable, because all its eigenvalues have multiplicity $1$. The problem is precisely that I cannot compute $P^{-1}$, because the matrix $P$ is very complicated. That's why I was trying to find an alternative procedure, maybe by employing the right eigenvectors as well. – Jennifer Dec 25 '16 at 22:38
  • Okay, in this case my answer doesn't help. I wasn't aware of the problem in computing $P^{-1}$ from your question. – Florian Dec 25 '16 at 22:41
  • Are what you call left eigenvectors the row vectors satisfying $x\cdot M = \lambda \cdot x$? In this case I'd almost claim that they should be rows of $P^{-1}$ so if you concatenate them row-wise you should have $P^{-1}$ without any need for matrix inverse. If not, please explain what exactly you mean by left eigenvectors. – Florian Dec 25 '16 at 22:48
  • Yes, that's what I mean by left eigenvectors. And I also thought I could do that, but I asked on MSE, and John Hughes seems to have found a counterexample: math.stackexchange.com/questions/2071964 – Jennifer Dec 25 '16 at 22:59
  • This is odd: the left eigenvectors of $M$ are the right eigenvectors of $M^T$. Therefore if $M=P \cdot \Lambda \cdot P^{-1}$ then $M^T = P^{-T} \cdot \Lambda \cdot P^T$ which is an EVD. If the eigenvalues are distinct, the EVD is unique (up to scaling of eigenvectors), which proves my point that left eigenvectors are rows of $P^{-1}$. But I see your other question is still being debated, this probably fits better over there. – Florian Dec 25 '16 at 23:04
  • I really wonder what's wrong with your proof. At http://math.stackexchange.com/questions/2071964/left-right-eigenvectors/2071993?noredirect=1#comment4254867_2071993, more counterexamples have been found. I'm pasting your proof there anyway, with your permission. – Jennifer Dec 25 '16 at 23:23
  • I posted it there. – Florian Dec 26 '16 at 22:11