
$$X'=\begin{bmatrix} 1 & 0 & 1\\ 0 & 1 & 0\\ 0 & 0 & 1\\\end{bmatrix}X$$

This system is easy enough to solve without using the method of eigenvectors, but I thought it would be good practice for me. However, I ran into some difficulties.

This system has the single eigenvalue $\lambda=1$ (with algebraic multiplicity $3$).

It has two linearly independent eigenvectors: $v_1=(1,0,0)$ and $v_2=(0,1,0)$.

When I tried to find a generalized eigenvector for this system, I first tried $$(A-I)w=v_2,$$ where $w$ denotes my generalized eigenvector, but this led to the equation $$\begin{bmatrix} 0 & 0 & 1\\0 & 0& 0\\ 0& 0& 0\\ \end{bmatrix}w= \begin{bmatrix}0\\1\\0 \end{bmatrix},$$ which I was unable to solve. I then moved on to solve $$(A-I)w=v_1:$$ $$\begin{bmatrix} 0 & 0 & 1\\0 & 0& 0\\ 0& 0& 0\\ \end{bmatrix}w= \begin{bmatrix}1\\0\\0 \end{bmatrix}$$

My initial thought was to have $$w=\begin{bmatrix}1\\1\\1 \end{bmatrix}$$

but this did not lead me to the correct solution using the approach outlined in Finding a solution basis.
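(Editor's note: as a numerical sanity check, here is a short NumPy sketch. The names `A`, `v1`, `v2`, `w` are just labels for the objects above. It confirms that $w=(1,1,1)^T$ does satisfy $(A-I)w=v_1$, and that the resulting chain solution $x_3(t)=(w+tv_1)e^t$ solves $x'=Ax$.)

```python
import numpy as np

A = np.array([[1., 0., 1.],
              [0., 1., 0.],
              [0., 0., 1.]])
v1 = np.array([1., 0., 0.])
w  = np.array([1., 1., 1.])   # candidate generalized eigenvector

# Check the chain condition (A - I) w = v1
assert np.allclose((A - np.eye(3)) @ w, v1)

# Third fundamental solution built from the chain: x3(t) = (w + t*v1) e^t
def x3(t):
    return (w + t * v1) * np.exp(t)

# Verify x3' = A x3 numerically at t = 0.7 with a central difference
t, h = 0.7, 1e-6
deriv = (x3(t + h) - x3(t - h)) / (2 * h)
assert np.allclose(deriv, A @ x3(t), atol=1e-5)
print("w = (1,1,1) gives a valid third solution")
```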

Did I choose my $w$ wrong? And also, in the link I provided, does it matter which vector I choose to be $v_1$ and $v_2$?

Bobert

2 Answers


Put another way, your matrix is already in the form $A = D + N$, where $D=I$ and $N$ is just that single 1 in the corner. The important thing is that the pieces commute: $DN = ND$. If $B,C$ commute, then $e^{C+B} = e^C e^B= e^B e^C$ and $e^{(C+B)t} = e^{Ct} e^{Bt}= e^{Bt} e^{Ct}$.

Well, $e^{Dt}$ is fairly quick. Then you find $e^{Nt}$ by writing out the Taylor series; the series is finite since $N$ is nilpotent. $$ e^{Nt} = I + Nt + N^2 \frac{t^2}{2} + N^3 \frac{t^3}{6} + \cdots $$ Here $N^2 = 0$, so the series stops after the linear term: $e^{Nt} = I + Nt$.
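(Editor's note: for this particular $A$, that recipe can be checked numerically; a sketch with NumPy/SciPy, where `N` is the nilpotent piece described above.)

```python
import numpy as np
from scipy.linalg import expm

N = np.zeros((3, 3))
N[0, 2] = 1.0            # the single 1 in the corner; N @ N == 0
t = 1.3

# Since N^2 = 0, the exponential series stops: e^{Nt} = I + N t
eNt = np.eye(3) + N * t
# D = I commutes with N, and e^{It} = e^t I, so e^{At} = e^t (I + N t)
eAt = np.exp(t) * eNt

A = np.eye(3) + N
assert np.allclose(eAt, expm(A * t))
print("e^{At} = e^t (I + N t) matches scipy's expm")
```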

Will Jagy
  • I am having difficulty seeing how calculating this Taylor Series is relevant to the problem. We have learned this in class, but I am not sure if I am quite grasping the concept. – Bobert Oct 08 '19 at 00:36
  • @Bobert let $Y$ be the square matrix with $Y(0) = I$ and $Y' = A Y,$ using your $A.$ Then we get $Y = e^{At}.$ I guess you meant $X$ to be a column vector. In that case, $X = e^{At} X(0) $ – Will Jagy Oct 08 '19 at 00:50
  • Sorry, I am still a bit confused. What are Y(0) and X(0)? – Bobert Oct 08 '19 at 01:00

Your vector $w=[1,1,1]^T$ is also correct; the basis is not unique. If you determine the constants from the initial conditions, both eigenbases give the same result.

It is only when you introduce further simplifying assumptions, such as orthogonality (as far as that is attainable, i.e. the largest possible angles between basis vectors), that you get a more constrained choice like $w=e_3$.

And yes, you could also have chosen another basis $v_1,v_2$ for the eigenspace. However, permuting the first and second coordinates, or the second and third, directly establishes a Jordan normal form without further linear transformations, so it is simplest to just stay with the canonical basis.
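(Editor's note: a quick numerical illustration of the first point, as a NumPy sketch; `solution` is a hypothetical helper that fits the constants of the general solution $x(t)=c_1v_1e^t+c_2v_2e^t+c_3(w+tv_1)e^t$ to $x(0)=x_0$. Both choices of $w$ reproduce the same trajectory.)

```python
import numpy as np

v1 = np.array([1., 0., 0.])
v2 = np.array([0., 1., 0.])

def solution(w, x0, t):
    # Fit the constants c to the initial condition: [v1 v2 w] c = x0
    M = np.column_stack([v1, v2, w])
    c = np.linalg.solve(M, x0)
    return np.exp(t) * (c[0] * v1 + c[1] * v2 + c[2] * (w + t * v1))

x0 = np.array([2., -1., 3.])
t = 0.8
# Two different generalized eigenvectors, same trajectory
xa = solution(np.array([1., 1., 1.]), x0, t)   # w = (1,1,1)
xb = solution(np.array([0., 0., 1.]), x0, t)   # w = e3
assert np.allclose(xa, xb)
print("both bases give the same solution")
```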

Lutz Lehmann