
I was told in lecture the other day that an eigenvalue represents the multiple of the volume of a vector space. For example, if I pour a jar of liquid from jar A into jar B, and express that in terms of some linear mapping T, then without calculating, T must have an eigenvalue of $1$ since its volume did not change.


Can someone elaborate more on this? Particularly:

1) Under what conditions is this true?

2) What are some important direct implications of this I should know?

3) How does the eigenvector correspond to this idea?

4) How does this hold when there are multiple eigenvalues?

Thank you in advance.

B.Li

1 Answer


OK, this is not at all right; it does not even make sense as stated. Let me speculate a little about what the "correct" statement was supposed to be.

First of all, in the $n$-dimensional real vector space $\mathbb{R}^n$, the "volume" of the parallelepiped constructed from vectors $v_1, v_2, \cdots, v_n$ is calculated as $$ \mathrm{Vol}(v_1, v_2, \cdots, v_n) = |\det(v_1, \cdots, v_n)| $$ where by $\det (v_1, \cdots, v_n)$ I mean the determinant of the matrix whose first column is $v_1$, whose second column is $v_2$, etc., in some basis. Why is this the case? Define $$ \begin{aligned} v_1'&=v_1\\ v'_2&=v_2-\frac{v_2\cdot v'_1}{|v'_1|^2} v'_1\\ v'_3&=v_3-\frac{v_3\cdot v'_1}{|v'_1|^2} v'_1-\frac{v_3\cdot v'_2}{|v'_2|^2} v'_2\\ \vdots\\ v_n'&= v_n - \sum_{j=1}^{n-1}\frac{v_n\cdot v'_j}{|v_j'|^2} v'_j \end{aligned} $$ The main idea is that $v_2'$ is the component of $v_2$ perpendicular to $v_1$, then $v_3'$ is the component of $v_3$ perpendicular to both $v_1$ and $v_2$, etc. (I'll leave the details to you). The volume $\mathrm{Vol}(v_1, v_2, \cdots, v_n)$ is just the product of the lengths $|v'_1||v_2'|\cdots |v'_n|$. Since these vectors are mutually perpendicular, one finds that $$ \mathrm{Vol}(v_1, v_2, \cdots, v_n)=|\det (v'_1, \cdots, v'_n)|=|\det (v_1, \cdots, v_n)| $$ The last equality holds because adding a multiple of one column to another does not change the determinant.
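To make this concrete, here is a small numerical sketch (using NumPy, with arbitrarily chosen vectors) checking that the product of the lengths of the orthogonalized vectors equals $|\det|$:

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.standard_normal((3, 3))   # columns v_1, v_2, v_3, chosen arbitrarily
v1, v2, v3 = V[:, 0], V[:, 1], V[:, 2]

# Gram-Schmidt: subtract from each vector its components along the previous ones
u1 = v1
u2 = v2 - (v2 @ u1) / (u1 @ u1) * u1
u3 = v3 - (v3 @ u1) / (u1 @ u1) * u1 - (v3 @ u2) / (u2 @ u2) * u2

volume = np.linalg.norm(u1) * np.linalg.norm(u2) * np.linalg.norm(u3)
print(volume, abs(np.linalg.det(V)))  # the two numbers agree
```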

OK, now consider a linear transformation $T:\mathbb{R}^n\to \mathbb{R}^n$. We would like to know how the volume $\mathrm{Vol}(v_1, v_2, \cdots, v_n)$ changes under this linear transformation. Note that $$ |\det(Tv_1, \cdots, Tv_n)|=|\det T|\,|\det(v_1, \cdots, v_n)| $$ In other words, such a transformation multiplies the volume by $|\det T|$. Now what does this have to do with eigenvalues? The eigenvalues of the transformation $T$ are the roots of its characteristic polynomial $p_T(x)$, which is of degree $n$. As a result, there are $n$ complex roots, possibly with repetition: $\lambda_1, \cdots, \lambda_n$. Now if $\lambda_i$ is a complex root of $p_T(x)$, then since this polynomial has real coefficients, $\lambda_i^*$ is also a root. Therefore, although not all of the roots need be real, the product $$ V=\prod_{j=1}^n \lambda_j $$ is always real. At the same time, $\det T = \prod_{j=1}^n \lambda_j$ (a standard fact). So...
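A quick numerical sketch of this (with an arbitrary real matrix I picked for illustration): the eigenvalues may be complex, but they come in conjugate pairs, so their product is real and equals $\det T$:

```python
import numpy as np

T = np.array([[0., -2., 1.],
              [1.,  0., 0.],
              [0.,  3., 1.]])   # an arbitrary real matrix

eigvals = np.linalg.eigvals(T)
print(eigvals)           # one real eigenvalue plus a complex-conjugate pair
print(np.prod(eigvals))  # their product is real (up to roundoff)
print(np.linalg.det(T))  # ... and equals det T
```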

The absolute value of the product of the eigenvalues is the amount by which the volume of any parallelepiped is scaled under $T$. You can understand this as the volume of the whole vector space being scaled by $|\det T|$.

It is important to take note of the repetition: for example, the matrix $\mathrm{diag}(2,2)$ scales the volume by four, not two.
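A one-line check of this (sketch):

```python
import numpy as np
print(np.linalg.det(np.diag([2., 2.])))  # 4.0: area scales by 4, not 2
```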

No single eigenvalue carries information about the volume; rather, their product does. In general, if $v$ is an eigenvector of $T$ with a real eigenvalue $\lambda$, then you can understand $T$ as stretching the vector space in the direction of $v$ by a factor of $\lambda$ (if $\lambda$ is negative, it scales by $|\lambda|$ followed by an inversion).
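As a small illustration (with an arbitrarily chosen symmetric matrix, so that all eigenvalues are real), applying $T$ to an eigenvector just rescales it by $\lambda$:

```python
import numpy as np

T = np.array([[2., 1.],
              [1., 2.]])              # arbitrary symmetric matrix: real eigenvalues
lams, vecs = np.linalg.eigh(T)        # eigenvectors are the columns of vecs

for lam, v in zip(lams, vecs.T):
    print(np.allclose(T @ v, lam * v))  # True: T stretches v by the factor lam
```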

However, not all transformations admit real eigenvalues. For example, a rotation by $\theta\neq n\pi$, $$R_\theta=\begin{pmatrix} \cos\theta & \sin\theta\\ -\sin\theta & \cos \theta \end{pmatrix}$$ in two dimensions has no real eigenvectors, since its eigenvalues are complex (they are $\lambda_\pm = e^{\pm i\theta}$). So it does not make sense to think of $R_\theta$ as stretching in any direction, or of $\lambda_\pm$ as scale factors along any direction. Nonetheless, the volume of the vector space does not scale under any rotation, and this is still reflected in the fact that $\det R_\theta = \lambda_+\lambda_-=1$.
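You can verify this numerically (a sketch for, say, $\theta = \pi/3$; any $\theta \neq n\pi$ works):

```python
import numpy as np

theta = np.pi / 3
R = np.array([[ np.cos(theta), np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])

eigvals = np.linalg.eigvals(R)
print(eigvals)                # e^{+i theta}, e^{-i theta}: a conjugate pair on the unit circle
print(np.prod(eigvals).real)  # 1.0: rotations preserve volume
print(np.linalg.det(R))       # 1.0 as well
```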

In your jar example, all you get out of knowing that the volume is fixed is that the absolute value of the product of the eigenvalues is equal to one. For example, I can choose my matrix as $$ M=\begin{pmatrix} 8 & 0 & 0\\ 0 & -1/2 & 0\\ 0 & 0 & 1/4 \end{pmatrix} $$ This transformation does not change the volume, since $|\det M| = |8 \cdot (-1/2) \cdot (1/4)| = 1$, but none of its eigenvalues is one.
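Checking the claim for this particular $M$ (sketch):

```python
import numpy as np

M = np.diag([8., -0.5, 0.25])
print(np.linalg.det(M))       # -1.0
print(abs(np.linalg.det(M)))  # 1.0: volume is preserved
print(np.linalg.eigvals(M))   # [8, -0.5, 0.25] -- none of them is 1
```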

Hamed
  • I assume the order of the vectors $(v_1, v_2, ..., v_n)$ doesn't matter? – B.Li Nov 05 '17 at 22:16
  • No, it doesn't. Changing the order might change the sign of the determinant, but that's what the absolute value is for. – Hamed Nov 06 '17 at 23:47