
The following system is given:

$$ \begin{aligned} \dot{x} &= y + z \\ \dot{y} &= x + z \\ \dot{z} &= x + y \end{aligned} $$

The first thing I did was to compute the eigenvalues. I found that $-1$ is a double eigenvalue and $2$ is a simple one, so

$$ \lambda_{1,2} = -1,\ \ \lambda_3 = 2 $$
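
They are the roots of the characteristic polynomial:

$$ \det\begin{pmatrix} -\lambda & 1 & 1 \\ 1 & -\lambda & 1 \\ 1 & 1 & -\lambda \end{pmatrix} = -\lambda^3 + 3\lambda + 2 = -(\lambda+1)^2(\lambda-2) = 0 $$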

In the earlier exercises, the idea was to look for solutions of the form $ \begin{pmatrix} x \\ y \\ z \end{pmatrix} = e^{\lambda t} \underline{u} $, so I tried the following:

$$ \begin{pmatrix} 0-\lambda & 1 & 1 \\ 1 & 0-\lambda & 1 \\ 1 & 1 & 0-\lambda \end{pmatrix} $$

Is this step right? I tried to follow the same scheme as in the earlier exercises, but in the line $ \dot{x} = y + z $ there is no $x$, only one $y$ and one $z$.

Inserting $ \lambda_1 = -1 $, I get

$$ (A-\lambda_1 E)\,\underline{u} = \underline{0} \;\Rightarrow\; \begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{pmatrix} \underline{u} = \underline{0} $$

which means that $ u_1 + u_2 + u_3 = 0 $ (each of the three rows gives the same equation). This is the point where I don't know how to go on. One solution is the trivial one, $ \underline{u} = \underline{0} $. Can I use this solution?

I think that I have to use something like $$ \begin{pmatrix} x \\ y \\ z \end{pmatrix} = C_1 \begin{pmatrix} u_1 e^{\lambda_1 t} \\ u_2 e^{\lambda_1 t} \\ u_3 e^{\lambda_1 t} \end{pmatrix} + C_2 \begin{pmatrix} ... \end{pmatrix} + C_3 \begin{pmatrix} ... \end{pmatrix} $$

in the case $ \lambda_1 = -1 $, but how do I get my $ \underline{u} $ here exactly?

Harry Peter
  • 7,819
Drudge
  • 241
  • Do you know what eigenvectors are? – Git Gud Mar 07 '14 at 18:58
  • I know that eigenvectors are the vectors $ \underline{u} $ which solve $ (A- \lambda E)\,\underline{u} = \underline{0} $ for a given eigenvalue $ \lambda $. – Drudge Mar 07 '14 at 19:01

2 Answers


The equations can be written as $\dot{p} = Ap$, with $p \in \mathbb{R}^3$ and $A = \begin{bmatrix} 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 0 \end{bmatrix}$.

Note that $A= v v^T -I$, where $v=(1,1,1)^T$, so it has one eigenvalue at 2 corresponding to the eigenvector $v$, and two at -1 corresponding to the eigenspace $\{v\}^\bot$. (Note that $A$ is symmetric hence has an orthonormal basis of eigenvectors.)
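
To see this, note that $vv^Tv = (v^Tv)\,v = 3v$, so $Av = 3v - v = 2v$, while for any $w \perp v$ we have $vv^Tw = (v^Tw)\,v = 0$, so $Aw = -w$.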

Hence if you write $p = \alpha v + w \in \mathbb{R}^3$, where $w \bot v$, you will have $Ap = \alpha 2 v -w$, and, in general, $A^k p = \alpha 2^k v + (-1)^k w$, from which we see that $e^{At} p = \alpha e^{2t} v+e^{-t} w$.

The projection of a point $p \in \mathbb{R}^3$ onto $v$ is straightforward to compute.
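
Explicitly, if $p = (p_1, p_2, p_3)^T$, then $$\alpha = \frac{v^Tp}{v^Tv} = \frac{p_1+p_2+p_3}{3}, \qquad w = p - \alpha v.$$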

The point here is that you don't need to explicitly find eigenvectors for the eigenspace $\{v\}^\bot$.
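
If you want to sanity-check this closed form numerically, here is a minimal sketch (assuming `numpy` and `scipy` are available; the sample point and time below are arbitrary):

```python
import numpy as np
from scipy.linalg import expm

# coefficient matrix of the system and the vector v = (1,1,1)^T
A = np.array([[0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 0.]])
v = np.ones(3)

p = np.array([2.0, -1.0, 0.5])   # arbitrary initial point
alpha = (v @ p) / (v @ v)        # coefficient of the projection onto v
w = p - alpha * v                # component orthogonal to v

t = 0.7
closed_form = alpha * np.exp(2 * t) * v + np.exp(-t) * w
print(np.allclose(expm(A * t) @ p, closed_form))  # expect: True
```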

copper.hat
  • 172,524

Your differential equation is of the form $\vec u'=A\vec u$, where $\vec u=\begin{bmatrix} x\\ y\\ z\end{bmatrix}$ and $A=\begin{bmatrix} 0 & 1 & 1\\ 1& 0 & 1\\ 1 & 1 & 0\end{bmatrix}$.

If there is a choice of eigenvectors of $A$ that forms a basis of $\mathbb R^{3\color{grey}{\times 1}}$, say $v_1, v_2, v_3$ with corresponding eigenvalues $\lambda _1, \lambda _2, \lambda _3$, then a basis of solutions of $\vec u'=A\vec u$ is $\left(t\mapsto e^{\lambda _1 t}v_1,\ t\mapsto e^{\lambda _2 t}v_2,\ t\mapsto e^{\lambda _3 t}v_3\right)$.
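
Indeed, each of these maps is a solution: $\frac{\mathrm d}{\mathrm dt}\left(e^{\lambda _i t}v_i\right)=\lambda _i e^{\lambda _i t}v_i=e^{\lambda _i t}Av_i=A\left(e^{\lambda _i t}v_i\right)$.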

I know beforehand that the hypothesis of the if above is verified ($A$ is symmetric, so it has a basis of eigenvectors), so this is just a linear algebra problem now.

You found $\lambda _1=-1, \lambda _2=-1$ and $\lambda _3=2$.

Now you need to find a basis of the eigenspace of each eigenvalue.

$\bbox[5px,border:2px solid #0000FF]{\text{Eigenvalue }-1}$

You wish to find a basis of the vector space $\left\{\begin{bmatrix} a\\ b\\ c\end{bmatrix}\in \mathbb R^{3\times 1}\colon (A+I)\begin{bmatrix} a\\ b\\ c\end{bmatrix}=0_{3\times 1}\right\}$.

The condition $(A+I)\begin{bmatrix} a\\ b\\ c\end{bmatrix}=0_{3\times 1}$ is equivalent to $a+b+c=0$, i.e. $a=-b-c$, so the set above equals $\left\{\begin{bmatrix} -b-c\\ b\\ c\end{bmatrix}\colon b,c\in \mathbb R\right\}=\left\{b\begin{bmatrix} -1\\ 1\\ 0\end{bmatrix}+c\begin{bmatrix} -1\\ 0\\ 1\end{bmatrix}\colon b,c\in \mathbb R\right\}$. We can now immediately identify the basis $\left\langle \begin{bmatrix} -1\\ 1\\ 0\end{bmatrix},\begin{bmatrix} -1\\ 0\\ 1\end{bmatrix} \right\rangle$, so you can take $v_1=\begin{bmatrix} -1\\ 1\\ 0\end{bmatrix}$ and $v_2=\begin{bmatrix} -1\\ 0\\ 1\end{bmatrix}$.
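
As a quick check, $Av_1=\begin{bmatrix} 1\\ -1\\ 0\end{bmatrix}=-v_1$ and $Av_2=\begin{bmatrix} 1\\ 0\\ -1\end{bmatrix}=-v_2$, so both are indeed eigenvectors for the eigenvalue $-1$.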

$\bbox[5px,border:2px solid #0000FF]{\text{Eigenvalue }2}$

Do it yourself.

Git Gud
  • 31,356
  • Thank you for your answer. I tried it on my own: I looked for a solution of $ (A-2I) \begin{bmatrix} a \\ b \\ c \end{bmatrix} = 0 $, hence $ \begin{pmatrix} -2 & 1 & 1 \\ 1 & -2 & 1 \\ 1 & 1 & -2 \end{pmatrix} \begin{bmatrix} a \\ b \\ c \end{bmatrix} = 0 $. This means that $ -2a + b + c = 0 $, $ a - 2b + c = 0 $ and $ a + b - 2c = 0 $, which means that $ a = b = c $. Is it therefore right that $ v_3 = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} $? – Drudge Mar 07 '14 at 20:01
  • @Drudge That's correct. More quickly and generally, if the rows of a matrix always add up to the same number (in this case $2$), then that number is an eigenvalue of the matrix associated to the vector in which all the entries are $1$. – Git Gud Mar 07 '14 at 20:05
  • That's a great trick I did not know yet. So is $2$ a double eigenvalue because the rows $\underline{\text{and}}$ the columns add up to two? – Drudge Mar 07 '14 at 20:09
  • @Drudge No, unfortunately. However if the columns always add up to the same number, then that number is an eigenvalue of the matrix, but you can't tell what the eigenvector looks like. – Git Gud Mar 07 '14 at 20:30
  • That's a pity. However, your numeric example really helped me understand this problem! Thank you a lot! – Drudge Mar 07 '14 at 20:52
  • Sorry, one last problem: a sub-exercise was to find an orthonormalized basis for the largest eigenvalue. Is that the basis I wrote down for $ \lambda = 2 $, or do I have to use the Gram-Schmidt process for this problem? – Drudge Mar 07 '14 at 20:59
  • An orthonormized basis is one in which, among other things, all the vectors have norm equal $1$. Your basis has only one vector, so it suffices that that vector has norm $1$. Yours doesn't have norm $1$. To find a basis that suits what's asked just divide what you got by its norm. You can also use G-S if you want. – Git Gud Mar 07 '14 at 21:14