Usually we look for the eigenvectors of a matrix $M$ as the vectors that span
a subspace invariant under left multiplication by the matrix: $M\vec x= \lambda \vec x$.
If we consider instead the transposed problem $\mathbf x M=\lambda \mathbf x$, where $\mathbf x$ is a row vector, we see that the eigenvalues
are the same, but in general the "eigenrows" are not the transposes of the eigenvectors.
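That the two problems share the same eigenvalues follows by transposing the row equation and comparing characteristic polynomials:

$$
\mathbf x M = \lambda \mathbf x
\iff
M^{\mathsf T}\mathbf x^{\mathsf T} = \lambda\,\mathbf x^{\mathsf T},
\qquad
\det(M^{\mathsf T}-\lambda I)=\det(M-\lambda I),
$$

so the "eigenrows" of $M$ are exactly the transposed (right) eigenvectors of $M^{\mathsf T}$, and $M^{\mathsf T}$ has the same spectrum as $M$.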
E.g., for the matrix
$$
\begin{bmatrix}
0&1\\2&1
\end{bmatrix}
$$
we find, for the eigenvalue $\lambda=2$,
$$
\begin{bmatrix}
0&1\\2&1
\end{bmatrix}
\begin{bmatrix}
1\\2
\end{bmatrix}
=
2\begin{bmatrix}
1\\2
\end{bmatrix}
$$
$$
\begin{bmatrix}
1&1
\end{bmatrix}
\begin{bmatrix}
0&1\\2&1
\end{bmatrix}
=
2\begin{bmatrix}
1&1
\end{bmatrix}\;.
$$
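The example above can be verified numerically. The sketch below (using NumPy; the matrix `M` is the one from the question) computes the right eigenvectors of $M$ and, since $\mathbf x M = \lambda\mathbf x$ is equivalent to $M^{\mathsf T}\mathbf x^{\mathsf T} = \lambda\mathbf x^{\mathsf T}$, obtains the "eigenrows" as the right eigenvectors of $M^{\mathsf T}$:

```python
import numpy as np

M = np.array([[0.0, 1.0],
              [2.0, 1.0]])

# Right eigenvectors: M v = lambda v
eigvals_right, V = np.linalg.eig(M)

# Left eigenvectors ("eigenrows"): x M = lambda x.
# Transposing gives M^T x^T = lambda x^T, so they are the
# right eigenvectors of M^T.
eigvals_left, W = np.linalg.eig(M.T)

# Both problems have the same spectrum: {2, -1}
print(np.sort(eigvals_right))   # [-1.  2.]
print(np.sort(eigvals_left))    # [-1.  2.]

# Check x M = lambda x for the eigenrow belonging to lambda = 2
i = int(np.argmax(eigvals_left))        # index of lambda = 2
x = W[:, i]                             # proportional to (1, 1)
print(np.allclose(x @ M, eigvals_left[i] * x))  # True
```

Note that `np.linalg.eig` normalizes eigenvectors to unit length, so the computed eigenrow is a scalar multiple of $(1,\ 1)$ rather than that row itself.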
So my questions are: Is there a relation between the "right" and "left" eigenspaces of the
same matrix? And is there a reason why "eigenrows" are not studied as much as eigenvectors?