53

I know that in $A\textbf{x}=\lambda \textbf{x}$, $\textbf{x}$ is the right eigenvector, while in $\textbf{y}A =\lambda \textbf{y}$, $\textbf{y}$ is the left eigenvector.

But what is the significance of left and right eigenvectors? How do they differ from each other geometrically?

cconsta1
  • 103

4 Answers

41

The (right) eigenvectors for $A$ correspond to lines through the origin that are sent to themselves (or $\{0\}$) under the action $x\mapsto Ax$. The action $y\mapsto yA$ for row vectors corresponds to an action of $A$ on hyperplanes: each row vector $y$ defines a hyperplane $H$ given by $H=\{\text{column vectors }x: yx=0\}$. The action $y\mapsto yA$ sends the hyperplane $H$ defined by $y$ to a hyperplane $H'$ given by $H'=\{x: Ax\in H\}$. (This is because $(yA)x=0$ iff $y(Ax)=0$.) A left eigenvector for $A$, then, corresponds to a hyperplane fixed by this action.
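As a quick sanity check, here is a minimal numpy sketch with an arbitrarily chosen $2\times 2$ matrix: the line spanned by a right eigenvector is sent to itself, and for a left eigenvector $y$ the hyperplane $H=\{x: yx=0\}$ is fixed, since $Ax\in H$ whenever $x\in H$.

```python
import numpy as np

# Arbitrary example matrix (chosen only for illustration).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# Right eigenvectors: columns of V, so A @ V[:, i] = w[i] * V[:, i].
w, V = np.linalg.eig(A)
v = V[:, 0]
print(np.allclose(A @ v, w[0] * v))        # True: the line through v is sent to itself

# Left eigenvectors: rows y with y @ A = lambda * y, i.e. eigenvectors of A.T.
_, U = np.linalg.eig(A.T)
y = U[:, 0]                                 # one left eigenvector of A (as a row)
x = np.array([-y[1], y[0]])                 # a vector in H = {x : y @ x = 0}
print(np.isclose(y @ (A @ x), 0.0))         # True: A x stays in H, so H is fixed
```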

Owen Biesel
  • 2,505
  • "The action $y\mapsto yA$ for row vectors corresponds to an action of $A$ on hyperplanes ..." or rather to an action of $A^{-1}$ on hyperplanes? Note that $H'$ is the pre-image of $H$ under $A$, not the image. Could you clarify that? – paperskilltrees Aug 02 '22 at 19:15
17

The set of left eigenvectors and right eigenvectors together form what is known as a Dual Basis and Basis pair.

http://en.wikipedia.org/wiki/Dual_basis

In simpler terms, if you arrange the right eigenvectors as the columns of a matrix $B$ and the (suitably scaled) left eigenvectors as the rows of a matrix $C$, then $BC = I$; in other words, $B$ is the inverse of $C$. This requires $A$ to be diagonalisable and the eigenvectors to be normalised consistently.
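A minimal numpy sketch of this pairing, assuming a diagonalisable matrix (the example matrix is arbitrary; the rows of $B^{-1}$ are used as the left eigenvectors, which fixes the scaling discussed in the comments below):

```python
import numpy as np

# Arbitrary diagonalisable example (distinct eigenvalues 5 and 2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

w, B = np.linalg.eig(A)        # columns of B: right eigenvectors
C = np.linalg.inv(B)           # rows of C: left eigenvectors, scaled to match B

print(np.allclose(C @ A, np.diag(w) @ C))   # True: each row of C satisfies y A = lambda y
print(np.allclose(B @ C, np.eye(2)))        # True: BC = I, the dual-basis relation
```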

centaur
  • 371
  • I don't understand this. How is $BC=I$? I am not seeing this for many matrices. – Midhun Kathanaruparambil Aug 30 '16 at 02:01
  • Are you sure that's right, $BC=I$? I can't find a counterexample but I can't see how to prove it either. – Jennifer Dec 25 '16 at 22:26
  • Here is an example: the columns of $P$ and the rows of $P^{-1}$ when we diagonalize a matrix. https://en.wikipedia.org/wiki/Diagonalizable_matrix#Diagonalization – Aditya P Jul 12 '18 at 04:33
  • If you multiply one of the right eigenvectors by a scalar other than zero or one, you get a different eigenvector. This will change the matrix $B$, but not $C$. So $BC=I$ is impossible (unless you have some way of normalizing your eigenvectors). – Gerry Myerson Mar 19 '21 at 12:13
  • @GerryMyerson Remember that eigenvectors (both right and left) can always be scaled by a non-zero scalar, which is why the bi-orthogonality condition $i\neq j\implies\boldsymbol{b}_{i}^{T}\boldsymbol{c}_{j}=0$ means $\boldsymbol{b}_{i}^{T}\boldsymbol{c}_{j}=\delta_{ij}$ without loss of generality. – paperskilltrees Aug 01 '22 at 11:51
  • Of course, there is no guarantee that $n$ linearly independent (right) eigenvectors exist (or, equivalently, $n$ left eigenvectors). Thus, this answer only applies to diagonalisable matrices. – paperskilltrees Aug 01 '22 at 12:27
10

Geometrically, the matrix $A$ is an origin- and line-preserving transformation (${\bf v}\mapsto A\cdot{\bf v}$). The right eigenvectors are eigenvectors of this transformation, while the left eigenvectors are eigenvectors of $A^T$, which can be a geometrically quite different transformation.

However, the eigenvalues and the dimensions of their corresponding eigenspaces must stay the same.
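A concrete illustration (arbitrary example matrix): $A$ and $A^T$ have the same eigenvalues, but their eigenvectors, i.e. the right and left eigenvectors of $A$, generally point in different directions.

```python
import numpy as np

# Arbitrary non-symmetric example.
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])

w_right, V = np.linalg.eig(A)      # right eigenvectors of A
w_left,  U = np.linalg.eig(A.T)    # eigenvectors of A.T = left eigenvectors of A

print(np.allclose(np.sort(w_right), np.sort(w_left)))   # True: same eigenvalues
print(V)   # for lambda = 1 this contains the direction (1, 0)
print(U)   # for lambda = 1 this contains the direction (1, -1)/sqrt(2): different
```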

M Turgeon
  • 10,419
Berci
  • 90,745
9

Using $A$ as a linear transformation on the right or on the left produces (in general) two completely different transformations of the vector space.

These two transformations have their own eigenvectors, which may have nothing to do with each other.

The geometric significance of eigenvectors is: they lie in subspaces which are stretched by $A$, but not tilted at all.
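"Stretched but not tilted" can be made concrete with an arbitrary $2\times 2$ example: for an eigenvector $v$, the image $Av$ is parallel to $v$, so the parallelogram spanned by $v$ and $Av$ has zero area.

```python
import numpy as np

# Arbitrary example matrix.
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

def tilt(v):
    """Signed area of the parallelogram spanned by v and A v (zero iff A only stretches v)."""
    Av = A @ v
    return v[0] * Av[1] - v[1] * Av[0]

_, V = np.linalg.eig(A)
print(np.isclose(tilt(V[:, 0]), 0.0))   # True: an eigenvector is stretched, not tilted
print(tilt(np.array([1.0, 1.0])))       # -2.0: a generic vector gets tilted
```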

rschwieb
  • 153,510