
Consider some $n \times n$ matrix $\mathbb{M}$ which is diagonalizable with $n$ distinct eigenvalues $\{ \lambda_{j} \}_{j=1}^{n}$. Suppose that the matrix is not symmetric, and consider the corresponding right eigenvectors $\{ \mathbf{r}_{j} \}_{j=1}^{n}$ and left eigenvectors $\{ \boldsymbol{\ell}_{j} \}_{j=1}^{n}$ which satisfy the relations $$ \mathbb{M} \mathbf{r}_{j} = \lambda_{j} \mathbf{r}_{j} \ ,\\ \boldsymbol{\ell}_{j}^{T} \mathbb{M} = \boldsymbol{\ell}_{j}^{T} \lambda_{j} \ . $$ If you construct the matrices $$ \mathbb{R} := \left[ \begin{matrix} \mathbf{r}_{1} & \cdots & \mathbf{r}_{n} \end{matrix} \right] \quad \quad \mathrm{and} \quad \quad \mathbb{L} := \left[ \begin{matrix} \boldsymbol{\ell}_{1}^{T} \\ \vdots \\ \boldsymbol{\ell}_{n}^{T} \end{matrix} \right] \ , $$ then this question seems to suggest that $\mathbb{R}$ and $\mathbb{L}$ are inverses of each other.

How do you prove the statement $\mathbb{L} \mathbb{R} = \mathbb{I}$? The claim seems to be true, but I am unable to make any progress in proving it.
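A quick numerical check seems consistent with the claim, at least up to scaling. (This is just a sketch assuming numpy; the random test matrix and the step that matches the eigenvalue ordering between the two `eig` calls are my own illustration.)

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))   # generic non-symmetric matrix

# Right eigenvectors: columns of R satisfy M @ R[:, j] = lam[j] * R[:, j].
lam, R = np.linalg.eig(M)

# Left eigenvectors are right eigenvectors of M.T; match their ordering
# to that of lam before stacking them as the rows of L.
lam_l, V = np.linalg.eig(M.T)
order = np.argsort(lam_l)[np.argsort(np.argsort(lam))]
L = V[:, order].T

print(np.allclose(L @ M, np.diag(lam) @ L))  # True: rows of L are left eigenvectors
print(np.round(L @ R, 3))  # diagonal, but not the identity without rescaling
```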

  • They won't always be inverses of each other, but we can choose $r_j,\ell_j$ such that they are. – Ben Grossmann Mar 09 '20 at 18:07
  • Is it by freedom of scaling the eigenvectors? Otherwise, $\mathbb{L} \mathbb{R}$ is just a diagonal matrix in general? I am noticing that you seem to need to pick them such that $\boldsymbol{\ell}_{j}^{T} \mathbf{r}_{k} = \delta_{jk}$ – QuantumEyedea Mar 09 '20 at 18:09

1 Answer


Note that if $\lambda_i \neq \lambda_j$, then we have $$ \ell_i^T \Bbb M r_j = \ell_i^T(\lambda_j r_j) = \lambda_j (\ell_i^T r_j)\\ \ell_i^T \Bbb M r_j = (\ell_i^T \Bbb M) r_j = \lambda_i (\ell_i^T r_j). $$ That is, we have $\lambda_j (\ell_i^T r_j) = \lambda_i (\ell_i^T r_j)$, and hence $\ell_i^T r_j = 0$, necessarily.

So, if all eigenvalues are distinct, then $\Bbb L \Bbb R$ will necessarily be diagonal. Moreover, its diagonal entries $\ell_i^T r_i$ are nonzero: $\Bbb L$ and $\Bbb R$ are both invertible, so their product is an invertible diagonal matrix. If we scale the eigenvectors appropriately, say $\ell_i \mapsto \ell_i/(\ell_i^T r_i)$, then we can indeed have $\Bbb L \Bbb R = \Bbb I$, i.e. $\Bbb L = \Bbb R^{-1}$.
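Here is a minimal numerical sketch of that rescaling (assuming numpy; the random test matrix and the eigenvalue-matching step are illustrative choices, not part of the argument):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
lam, R = np.linalg.eig(M)                 # right eigenvectors as columns

lam_l, V = np.linalg.eig(M.T)             # left eigenvectors, up to ordering
order = np.argsort(lam_l)[np.argsort(np.argsort(lam))]
L = V[:, order].T                         # rows now match the ordering of lam

# L @ R is diagonal with nonzero entries ell_i^T r_i; dividing each row
# of L by its diagonal entry enforces ell_i^T r_i = 1.
d = np.diag(L @ R)
L = L / d[:, None]

print(np.allclose(L @ R, np.eye(4)))      # True
```

Note that computing $\Bbb R^{-1}$ directly, as in the claim below, sidesteps both the ordering and the scaling issues.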

The case of repeated eigenvalues is a bit trickier, but a similar statement can be made.


Claim: If $\Bbb R$ is such that every column of $\Bbb R$ is a right eigenvector of $\Bbb M$, then every row of $\Bbb R^{-1}$ will be a left eigenvector.

Proof: Let $\Bbb D$ denote the diagonal matrix with diagonal entries $\lambda_1,\dots,\lambda_n$. Note that we have $\Bbb M \Bbb R = \Bbb R \Bbb D$. Multiplying this equation on the left and on the right by $\Bbb R^{-1}$, we find that $\Bbb R^{-1} \Bbb M = \Bbb D \Bbb R^{-1}$.

If $e_i$ denotes the $i$th standard basis vector (the $i$th column of the identity matrix), then the $i$th row of $\Bbb R^{-1}$ is given by $e_i^T \Bbb R^{-1}$, and we have $$ (e_i^T\Bbb R^{-1}) \Bbb M = e_i^T (\Bbb R^{-1}\Bbb M) = e_i^T (\Bbb D \Bbb R^{-1}) = (e_i^T \Bbb D) \Bbb R^{-1} = \lambda_i e_i^T \Bbb R^{-1}, $$ so the $i$th row of $\Bbb R^{-1}$ is indeed a left eigenvector with eigenvalue $\lambda_i$. $\blacksquare$
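As a sanity check, here is a numerical sketch of the claim (again assuming numpy, with an arbitrary test matrix):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
lam, R = np.linalg.eig(M)          # columns of R are right eigenvectors

Linv = np.linalg.inv(R)            # candidate left eigenvectors, one per row

# Row i of R^{-1} satisfies (row i) @ M = lam[i] * (row i), i.e. R^{-1} M = D R^{-1}.
print(np.allclose(Linv @ M, np.diag(lam) @ Linv))  # True

# And by construction this choice satisfies L R = I exactly (up to roundoff).
print(np.allclose(Linv @ R, np.eye(4)))            # True
```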

Ben Grossmann