
I have a question about differential equations of the form:

$\textbf{x}'(t)=A\textbf{x}(t)$, where $\textbf{x}$ is an $n$-dimensional vector and $A$ is an $n\times n$ real matrix. I have learned to solve this if $A$ is diagonalizable, with $n$ independent eigenvectors. Then I get that the solution is:

$\textbf{x}(t)=C_1\textbf{v}_1e^{\lambda_1t}+\dots+C_n\textbf{v}_ne^{\lambda_nt}$.
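For concreteness, here is a quick numerical check of this formula (a sketch assuming NumPy/SciPy; the particular matrix $A$ below is just an illustrative choice):

```python
import numpy as np
from scipy.integrate import solve_ivp

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])   # eigenvalues -1 and -2, so A is diagonalizable
x0 = np.array([1.0, 0.0])

lam, V = np.linalg.eig(A)      # columns of V are the eigenvectors v_i
C = np.linalg.solve(V, x0)     # choose C_i so that sum_i C_i v_i = x(0)

def x_formula(t):
    # x(t) = C_1 v_1 e^{lambda_1 t} + ... + C_n v_n e^{lambda_n t}
    return (V * np.exp(lam * t)) @ C

sol = solve_ivp(lambda t, x: A @ x, (0.0, 1.0), x0, rtol=1e-10, atol=1e-12)
print(x_formula(1.0))          # closed-form value at t = 1
print(sol.y[:, -1])            # numerical solution; the two should agree
```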

  1. Does this hold even if the eigenvectors and eigenvalues are complex, as long as the vectors are linearly independent and we have $n$ of them?

  2. If the matrix is not diagonalizable, is it possible to find an analytical solution, or do you have to use numerical methods?

user119615
  • Do you know about Jordan Normal Form? – Git Gud Apr 07 '14 at 22:59
  • @GitGud No I only know LU factorisation, diagonalisation, orthogonal diagonalisation and the singular value decomposition. Is it so that with Jordan Normal Form, you can solve all the differential equations? – user119615 Apr 07 '14 at 23:04
  • In theory you don't even need JNF. It just helps with computations. – Git Gud Apr 07 '14 at 23:05

1 Answer


Given $n\in \mathbb N$, $A\in \mathbb R^{n\times n}$, a nontrivial interval $I$, $t_0\in I$, $y_0\in \mathbb R^n$, and a continuous function $b\colon I\to \mathbb R^n$, consider the initial value problem $$y'+Ay=b,\qquad y(t_0)=y_0.$$

Let $f\colon I\to\mathbb R^n$ be a differentiable function.

Fact: For all $t\in \mathbb R$, $e^{At}$ is invertible and $\left(e^{At}\right)^{-1}=e^{-At}$.
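Here is a quick numerical sanity check of this fact, a sketch assuming SciPy's `expm`; the matrix $A$ and the time $t$ below are arbitrary illustrative choices:

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
t = 0.7

# e^{At} times e^{-At} should be the identity matrix
print(np.allclose(expm(A * t) @ expm(-A * t), np.eye(2)))   # True
```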

There exists $C\in \mathbb R^n$ such that for all $t\in I$ the following holds: $$\begin{align} f'(t)+Af(t)=b(t)&\iff e^{At}(f'(t)+Af(t))=e^{At}b(t)\\ &\iff e^{At}f'(t)+e^{At}Af(t)=e^{At}b(t)\\ &\iff e^{At}f'(t)+Ae^{At}f(t)=e^{At}b(t)\\ &\iff \int \limits _{t_0}^t\left(e^{As}f'(s)+Ae^{As}f(s)\right)\mathrm ds=\int \limits_{t_0}^te^{As}b(s)\mathrm ds+C\\ &\iff e^{At}f(t)=\int \limits_{t_0}^te^{As}b(s)\mathrm ds+C\\ &\iff f(t)=e^{-At}\int \limits_{t_0}^te^{As}b(s)\mathrm ds+e^{-At}C. \end{align}$$ (In the last two steps, note that $e^{As}f'(s)+Ae^{As}f(s)=\left(e^{As}f(s)\right)'$, because $A$ and $e^{As}$ commute.)

Taking into account $f(t_0)=y_0$, after some simple calculations it follows that $C=e^{At_0}y_0$.

This explicitly finds a solution and shows that it is unique.

All you need to do is compute matrix exponentials. No numerical methods are needed if you can find the antiderivative on the RHS, and that's certainly not a problem when $b$ is the null function.
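For concreteness, here is a minimal sketch of this recipe in Python. The matrix $A$, the forcing term $b$, and the use of SciPy's `expm` and `quad_vec` are my own illustrative choices, not part of the answer:

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import quad_vec, solve_ivp

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
t0, y0 = 0.0, np.array([1.0, 0.0])
b = lambda t: np.array([np.sin(t), 0.0])   # a continuous forcing term

def f(t):
    # f(t) = e^{-At} ( int_{t0}^{t} e^{As} b(s) ds + e^{At0} y0 )
    integral, _ = quad_vec(lambda s: expm(A * s) @ b(s), t0, t)
    return expm(-A * t) @ (integral + expm(A * t0) @ y0)

# Cross-check against direct numerical integration of y' = -A y + b(t),
# i.e. the IVP y' + A y = b, y(t0) = y0.
sol = solve_ivp(lambda t, y: -A @ y + b(t), (t0, 2.0), y0,
                rtol=1e-10, atol=1e-12)
print(f(2.0))
print(sol.y[:, -1])   # should match to high accuracy
```

When $b$ is the null function, the integral term vanishes and the solution reduces to $f(t)=e^{-A(t-t_0)}y_0$.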

As for question $1$, it is indeed the same if you find complex eigenpairs, but in this case, if you want real solutions, you need to take real and imaginary parts to get them.
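To illustrate, here is a small sketch (assuming NumPy; the matrix below, with eigenvalues $\pm i$, is my own example) showing that the real and imaginary parts of a complex eigensolution $\textbf{v}e^{\lambda t}$ are themselves real solutions of $\textbf{x}'=A\textbf{x}$:

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0, 0.0]])
lam, V = np.linalg.eig(A)           # eigenvalues are +i and -i (order may vary)

t = 0.9
z = V[:, 0] * np.exp(lam[0] * t)    # complex solution v e^{lambda t}
x1, x2 = z.real, z.imag             # candidate real solutions

# z'(t) = lambda v e^{lambda t} = A z(t); taking real and imaginary parts
# gives x1' = A x1 and x2' = A x2, i.e. both are real solutions.
dz = lam[0] * z
print(np.allclose(dz.real, A @ x1), np.allclose(dz.imag, A @ x2))   # True True
```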

Git Gud
  • Thank you very much! I must admit I did not understand all of that, but I will try to study this subject more so that I can solve all the equations. I just have one more simple question, if it is ok for you. When we solve ordinary differential equations, we have 3 cases: 2 real roots, 2 complex roots, or one double root. Is it the same in the case where we have this kind of matrix differential equation? I cannot quite see what the equivalent of a double root would be in this case? Again, thanks for your help! – user119615 Apr 07 '14 at 23:36
  • @user119615 You can ask me what you didn't understand. It might help to think of everything as functions $\mathbb R\to \mathbb R$ first; the proof is the same. The one I presented has some subtleties (for example, the fact that $A$ and $e^{At}$ commute), but if you understand the one-dimensional case, you should be able to deal with this one. – Git Gud Apr 07 '14 at 23:43
  • @user119615 Is it the same in the case where we have this kind of matrix differential equations? It's similar. Here what matters is not so much whether you have repeated eigenvalues, but rather whether the eigenspaces have full dimension; for example, whether an eigenvalue of multiplicity $3$ gives you three linearly independent eigenvectors. When this happens for all eigenvalues, the matrix is diagonalizable. If it isn't, you get something like in the one-dimensional case: multiplying by $t$, by $t^2$, and so on. This is where Jordan normal form comes into play. – Git Gud Apr 07 '14 at 23:45
  • What I did not understand was the $e^{At}$. From the link you put in, it seems that it is an infinite sum of matrices. In which course do we learn about these kinds of matrices? What I know about differentiating and integrating infinite series is about scalars, not matrices, and in those cases, in order to be allowed to integrate, we need uniform convergence. Did you learn about these things in a special course? Was it an analysis course or a linear algebra course? – user119615 Apr 08 '14 at 11:54
  • @user119615 Knowledge about this is standard when dealing with systems of ODEs. The definition is $e^M:=\sum \limits_{n=0}^\infty\left(\dfrac 1{n!}M^n \right)$. It works out the same as if $M$ were a real number. I personally learned it in a differential equations course and briefly talked about it in a matrix analysis course. – Git Gud Apr 08 '14 at 11:58
  • There are techniques to explicitly find $e^M$. For instance if $M=PDP^{-1}$ where $D=\text{diag}(\lambda _1, \ldots ,\lambda _n)$ is a diagonal matrix, one can prove that $e^M=Pe^DP^{-1}$ and $e^D=\text{diag}\left(e^{\lambda _1}, \ldots ,e^{\lambda _n}\right)$. – Git Gud Apr 08 '14 at 12:05
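Both of the last two comments can be checked numerically. Here is a minimal sketch assuming SciPy, with an arbitrary diagonalizable example matrix $M$:

```python
import numpy as np
from math import factorial
from scipy.linalg import expm

M = np.array([[1.0, 2.0],
              [0.0, 3.0]])    # diagonalizable: distinct eigenvalues 1 and 3

# Definition as a power series, truncated at n = 30:
series = sum(np.linalg.matrix_power(M, n) / factorial(n) for n in range(31))

# Via diagonalization M = P D P^{-1}, so e^M = P e^D P^{-1}:
lam, P = np.linalg.eig(M)
diag = P @ np.diag(np.exp(lam)) @ np.linalg.inv(P)

print(np.allclose(series, expm(M)), np.allclose(diag, expm(M)))   # True True
```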