Given $\sigma=\left[ \begin{array}{ccc} 0 & 0 & \alpha\beta^2 \\ 0 & 0 & -\alpha\beta \\ \alpha\beta^2 & -\alpha\beta & 0 \end{array} \right]$, I am interested in the eigenvalues and eigenvectors of this stress tensor. The eigenvalues are $\lambda_1=0$, $\lambda_2=\alpha\beta\sqrt{1+\beta^2}$, $\lambda_3=-\alpha\beta\sqrt{1+\beta^2}$, but I need to construct a system of mutually orthogonal vectors from the eigenvectors. I know that if a repeated eigenvalue gives eigenvectors $\vec{x_1}, \vec{x_1}, \vec{x_2}$, then $\vec{x_1}, \vec{x_2}, \vec{x_1}\times \vec{x_2}$ form a set of mutually orthogonal vectors; and if all three coincide as $\vec{x_1}, \vec{x_1}, \vec{x_1}$, then $\vec{x_1}, \vec{x_2}, \vec{x_3}$ form such a set, where $\vec{x_2}, \vec{x_3}$ are any two orthogonal vectors lying in the plane perpendicular to $\vec{x_1}$. But how do I find the eigenvectors and a set of mutually orthogonal vectors in the present case?
2 Answers
Since $\sigma$ is symmetric, eigenvectors corresponding to distinct eigenvalues are automatically orthogonal, so you have little choice in the matter beyond scaling.
Assuming that $\alpha, \beta \neq 0$, a solution to $\sigma x = 0$ can be found by inspection: for example, $x=(\alpha \beta, \alpha \beta^2,0)^T \in \ker \sigma$, i.e. an eigenvector corresponding to the zero eigenvalue.
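Not part of the original answer, but the claim is easy to check numerically. A minimal NumPy sketch, using arbitrarily chosen sample values $\alpha=2$, $\beta=3$ (any nonzero values work):

```python
import numpy as np

# Arbitrary nonzero sample values for illustration only.
alpha, beta = 2.0, 3.0

sigma = np.array([
    [0.0,             0.0,          alpha * beta**2],
    [0.0,             0.0,         -alpha * beta],
    [alpha * beta**2, -alpha * beta, 0.0],
])

# The kernel vector found by inspection: sigma @ x should be the zero vector.
x = np.array([alpha * beta, alpha * beta**2, 0.0])
print(np.allclose(sigma @ x, 0))  # True

# eigh is the right routine for symmetric matrices; it returns
# real eigenvalues and an orthonormal set of eigenvectors.
eigvals, eigvecs = np.linalg.eigh(sigma)
print(np.sort(eigvals))  # 0 together with +/- alpha*beta*sqrt(1+beta^2)
```

Because `eigh` already returns orthonormal eigenvectors for a symmetric matrix, its columns directly give the mutually orthogonal set the question asks for.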

- How to solve for the other two non-zero eigenvalues? – user_1_1_1 Nov 02 '14 at 19:00
- Grind :-). Note that you can factor $\alpha$ out of everything, so you can reduce the notational clutter if you are careful. – copper.hat Nov 02 '14 at 19:01
- Can you refer to a proof of symmetric matrices having mutually orthogonal eigenvectors? – user_1_1_1 Nov 02 '14 at 19:01
- Google is your friend, or http://math.stackexchange.com/a/82471/27978. – copper.hat Nov 02 '14 at 19:03
- Your reply was very insightful. – user_1_1_1 Nov 02 '14 at 19:19
- @GoCodes: Glad to be able to help. – copper.hat Nov 02 '14 at 20:04
Assuming $\alpha\beta\ne 0$ we have:
$$\left( \begin{array}{ccc} 0 & 0 & \alpha\beta^2 \\ 0 & 0 & -\alpha\beta \\ \alpha\beta^2 & -\alpha\beta & 0 \end{array} \right)\left( \begin{array}{c} \frac{1}{\beta} \\ 1 \\ 0 \end{array} \right) =0 \left( \begin{array}{c} \frac{1}{\beta} \\ 1 \\ 0 \end{array} \right)$$
$$\left( \begin{array}{ccc} 0 & 0 & \alpha\beta^2 \\ 0 & 0 & -\alpha\beta \\ \alpha\beta^2 & -\alpha\beta & 0 \end{array} \right)\left( \begin{array}{c} \frac{-\beta}{\sqrt{1+\beta^2}} \\ \frac{1}{\sqrt{1+\beta^2}} \\ 1 \end{array} \right) =-\alpha \beta \sqrt{1+\beta^2} \left( \begin{array}{c} \frac{-\beta}{\sqrt{1+\beta^2}} \\ \frac{1}{\sqrt{1+\beta^2}} \\ 1 \end{array} \right)$$
$$\left( \begin{array}{ccc} 0 & 0 & \alpha\beta^2 \\ 0 & 0 & -\alpha\beta \\ \alpha\beta^2 & -\alpha\beta & 0 \end{array} \right)\left( \begin{array}{c} \frac{\beta}{\sqrt{1+\beta^2}} \\ \frac{-1}{\sqrt{1+\beta^2}} \\ 1 \end{array} \right) =\alpha \beta \sqrt{1+\beta^2} \left( \begin{array}{c} \frac{\beta}{\sqrt{1+\beta^2}} \\ \frac{-1}{\sqrt{1+\beta^2}} \\ 1 \end{array} \right)$$
If $\alpha\beta=0$ the matrix is the zero matrix. In such a case any vector is an eigenvector with corresponding eigenvalue zero.
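As a quick sanity check (not part of the original answer), the three displayed eigenvectors can be verified numerically to satisfy their eigenvalue equations and to be mutually orthogonal, again with arbitrary sample values $\alpha=2$, $\beta=3$:

```python
import numpy as np

alpha, beta = 2.0, 3.0          # arbitrary nonzero sample values
s = np.sqrt(1 + beta**2)

sigma = np.array([
    [0.0,             0.0,          alpha * beta**2],
    [0.0,             0.0,         -alpha * beta],
    [alpha * beta**2, -alpha * beta, 0.0],
])

v1 = np.array([1 / beta, 1.0, 0.0])     # eigenvalue 0
v2 = np.array([-beta / s, 1 / s, 1.0])  # eigenvalue -alpha*beta*s
v3 = np.array([beta / s, -1 / s, 1.0])  # eigenvalue +alpha*beta*s

# The three eigenvalue equations from the answer:
assert np.allclose(sigma @ v1, 0)
assert np.allclose(sigma @ v2, -alpha * beta * s * v2)
assert np.allclose(sigma @ v3, alpha * beta * s * v3)

# Mutual orthogonality: the Gram matrix G @ G.T is diagonal,
# as expected for eigenvectors of a symmetric matrix with
# distinct eigenvalues.
G = np.array([v1, v2, v3])
print(np.round(G @ G.T, 10))
```

Normalizing each $v_i$ by its length then yields an orthonormal triad, which is exactly the principal-axis system sought in the question.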
