
I've been having trouble solving this exercise: find the solution of $$ \dot{\mathbf{x}}(t) = A\mathbf{x}(t)+\mathbf{c}\,\delta(t-1)$$ where $A = \begin{bmatrix} 2 & 0 & 1 \\ 0 & 2 & 0 \\ 0 & 0 & 2 \end{bmatrix}$, $\mathbf{c} = (1,1,1)$, $\mathbf{x}(t=0) = (1,0,1)$.

Here are the steps I followed. For $t > 1$ the delta term vanishes and we are left with $\dot{\mathbf{x}}(t) = A\mathbf{x}(t)$, whose general solution is given by $$ \mathbf{x}(t) = e^{tA}\mathbf{x}(0).$$ Now I need to compute the matrix exponential. I noticed that $$A = 2\,Id_3 + B, \qquad B = \begin{bmatrix} 0 & 0 & 1 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix},$$ and it's easy to see that $[1]$: $$B^k = [0], \quad k \ge 2. $$ Now we have $$e^{tA} = \sum_{n=0}^{\infty}\frac{t^n}{n!}A^n = Id_3 + \sum_{n = 1}^{\infty} \frac{t^n}{n!}(2\,Id_3+B)^n.$$ Since $Id_3$ commutes with $B$, the binomial theorem gives $$ (2\,Id_3+B)^n = \sum_{k=0}^{n} \binom{n}{k} 2^{n-k} B^k = 2^n \sum_{k=0}^{n} \binom{n}{k} \left(\frac{B}{2}\right)^k. $$ Therefore we have $$e^{tA} = Id_3 + \sum_{n=1}^{\infty}\frac{(2t)^n}{n!} \sum_{k=0}^{n} \binom{n}{k} \left(\frac{B}{2}\right)^k. $$ I do recognize that I need to isolate the term $$\sum_{n=1}^{\infty}\frac{(2t)^n}{n!} = e^{2t}-1,$$ but I can't figure out how to compute the second sum, over the index $k$. The only thing I know is that for $k \ge 2$ all the terms of that sum vanish because of condition $[1]$. I don't know how to go on; if someone could help, it would be much appreciated.
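As a quick numerical sanity check (a NumPy sketch, not part of the exercise; `expm_series` is a hypothetical helper name), one can confirm claim $[1]$ and that the truncated exponential series has already converged after a modest number of terms:

```python
import numpy as np

A = np.array([[2.0, 0.0, 1.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 2.0]])
B = A - 2 * np.eye(3)

# Claim [1]: B is nilpotent, so B^k = 0 for every k >= 2
assert np.allclose(B @ B, np.zeros((3, 3)))

def expm_series(M, terms):
    # Truncated power series sum_{n < terms} M^n / n!
    result = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for n in range(1, terms):
        term = term @ M / n
        result = result + term
    return result

t = 0.5
E20 = expm_series(t * A, 20)
E60 = expm_series(t * A, 60)
# Adding 40 more terms changes nothing measurable: the series has converged
assert np.allclose(E20, E60)
```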

Gonçalo

1 Answer


First, regarding the matrix exponential: a simpler approach is to note that $2\operatorname{Id}_3$ and $B$ commute (i.e. $(2 \operatorname{Id}_3) B = B (2\operatorname{Id}_3)$), from which it follows that \begin{align} \exp[t(2\operatorname{Id}_3 + B)] &= \exp[2t\operatorname{Id}_3 + tB] = \exp(2t \operatorname{Id}_3)\exp(tB) \\ & = \left(e^{2t}\operatorname{Id}_3\right) \left(\operatorname{Id}_3 + tB\right) = e^{2t} \pmatrix{1&0&t\\0&1&0\\0&0&1}. \end{align}

Second, your attempt to account for the $\delta(t-1)$ term is incorrect. Here is one approach (the use of a generalized "integrating factor") that yields the correct answer: $$ \dot{\mathbf x} = A\mathbf x(t) + \mathbf c\,\delta(t-1)\\ \dot{\mathbf x} - A\mathbf x(t) = \mathbf c\,\delta(t-1)\\ e^{-At}\dot{\mathbf x} - A e^{-At}\mathbf x = e^{-At}\mathbf c\, \delta(t-1)\\ \frac {d}{dt}\left[e^{-At}\mathbf x \right] = e^{-At}\mathbf c\, \delta(t-1)\\ e^{-At}\mathbf x = \left[e^{-A\tau}\mathbf x\right]_{\tau = 0} + \int_0^t e^{-A\tau}\mathbf c\, \delta(\tau-1)\,d\tau\\ e^{-At}\mathbf x = \mathbf x(0) + \int_0^t e^{-A\tau}\mathbf c\, \delta(\tau-1)\,d\tau\\ \mathbf x(t) = e^{At}\mathbf x(0) + e^{At}\int_0^t e^{-A\tau}\mathbf c\, \delta(\tau-1)\,d\tau. $$ Now, with the "sifting property" of $\delta$, we find that for $t > 1$ the integral term is given by $$ e^{At}\int_0^t e^{-A\tau}\mathbf c\, \delta(\tau-1)\,d\tau = e^{At} e^{-A}\mathbf c = e^{A(t-1)} \mathbf c. $$
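The closed form $e^{tA} = e^{2t}(\operatorname{Id}_3 + tB)$ and the commuting-factorization step can both be checked numerically. A small NumPy sketch (the helper `expm_series` is just a truncated power series, not a library routine):

```python
import numpy as np

def expm_series(M, terms=60):
    # Truncated power series for the matrix exponential: sum_{n < terms} M^n / n!
    result = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for n in range(1, terms):
        term = term @ M / n
        result = result + term
    return result

A = np.array([[2.0, 0.0, 1.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 2.0]])
B = A - 2 * np.eye(3)
t = 1.3

# Closed form e^{tA} = e^{2t} (Id + tB)
closed_form = np.exp(2 * t) * np.array([[1.0, 0.0, t],
                                        [0.0, 1.0, 0.0],
                                        [0.0, 0.0, 1.0]])

# The series for e^{tA} matches the closed form ...
assert np.allclose(expm_series(t * A), closed_form)
# ... and e^{2t Id} e^{tB} = e^{tA}, since the two summands commute
assert np.allclose(expm_series(2 * t * np.eye(3)) @ expm_series(t * B), closed_form)
```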

Thus, the result can be expressed as follows: $$ \mathbf x(t) = \begin{cases} e^{At}\mathbf x(0) & t < 1,\\ e^{At} \mathbf x(0) + e^{A(t-1)} \mathbf c & t > 1. \end{cases} $$ How exactly you decide on the behavior at the trajectory's discontinuity (at $t = 1$) is a matter of competing conventions.
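One can also verify numerically that this piecewise formula behaves as expected: the jump across $t = 1$ equals the impulse weight $\mathbf c$, and away from $t = 1$ the trajectory satisfies $\dot{\mathbf x} = A\mathbf x$. A NumPy sketch (helper names `expA` and `x` are mine, not from the exercise):

```python
import numpy as np

A = np.array([[2.0, 0.0, 1.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 2.0]])
x0 = np.array([1.0, 0.0, 1.0])   # x(0)
c = np.array([1.0, 1.0, 1.0])

def expA(t):
    # Closed form e^{tA} = e^{2t}(Id + tB), valid because B^2 = 0
    return np.exp(2 * t) * np.array([[1.0, 0.0, t],
                                     [0.0, 1.0, 0.0],
                                     [0.0, 0.0, 1.0]])

def x(t):
    # The piecewise solution given above
    xt = expA(t) @ x0
    if t > 1:
        xt = xt + expA(t - 1) @ c
    return xt

# The jump across t = 1 should equal the impulse weight c
eps = 1e-9
jump = x(1 + eps) - x(1 - eps)
assert np.allclose(jump, c)

# Away from t = 1 the solution should satisfy x' = A x (central difference)
h = 1e-6
deriv = (x(2 + h) - x(2 - h)) / (2 * h)
assert np.allclose(deriv, A @ x(2))
```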

Ben Grossmann
  • Where did you use the fact that the two matrices commute? Furthermore, have you omitted some calculations to get $e^{2t}Id_3(Id_3+tB)$? Is it really correct to use the usual exponential property $e^{a+b} = e^a e^b$ as you did right before, or are we under some particular circumstances that make it possible? I obtained your final result by applying $f(A) = \sum_{k=0}^{\infty} a_kA^k$, therefore getting $$ e^{2tId_3} = Id_3\left(1+\sum_{n=1}^{\infty}\frac{(2t)^n}{n!}\right) = Id_3 e^{2t}, \qquad e^{tB}=...[1]...= Id_3 + tB. $$ I have the feeling that somehow you avoided making these calculations – Claudio Menchinelli Jun 23 '23 at 08:21
  • After your solution I tried completing the exercise by myself and obtained the same result as you by considering: $$U(t) = e^{tA} \Rightarrow \mathbf{x}(t) = U(t)\mathbf{x}(0) + \int_{0}^{t} U(t-t')\mathbf{c}\,\delta(t'-1)\,dt'. $$ Using the sifting property and considering the two cases $t>1, t < 1$, I got: $$ \mathbf{x}(t) = U(t)\mathbf{x}(0) + \mathbf{c}H(t-1)U(t-1),$$ where $H(x)$ denotes the Heaviside step function – Claudio Menchinelli Jun 23 '23 at 08:51
  • @ClaudioMenchinelli The general fact that I've used is that if $PQ = QP$, then $e^{P + Q} = e^Pe^Q$. The fact that $P$ and $Q$ commute are the "particular circumstances". Regarding the avoidance of computations, I did happen to know that $e^{t\operatorname{Id}} = e^t \operatorname{Id}$ and used that fact, but it's not difficult to verify, as you have found. – Ben Grossmann Jun 23 '23 at 14:59
  • @Claudio It's not clear how we should interpret $\mathbf c \mathbf U(t-1)$, since that order is not a conformable matrix product when $\mathbf c$ is taken to be a column-vector. However, if you switch that order to get $\mathbf U(t - 1) \mathbf c$, then your answer with the Heaviside function is indeed correct. – Ben Grossmann Jun 23 '23 at 15:01
  • Yes, I completely forgot about its vector nature and treated it as a constant. Thanks for making me notice it. Only one question: I still can't see why commutativity allows us to apply the properties of the exponential function on the real or complex fields to matrices. I don't know if this is something trivial I am just missing, or if it is something deeper I do not know – Claudio Menchinelli Jun 23 '23 at 20:28
  • Something deeper, pardon my hideous use of English – Claudio Menchinelli Jun 23 '23 at 20:34
  • @Claudio This is a well known fact about the matrix exponential, but it is not something I would call "trivial". Two proofs of this fact are presented as answers on this post. – Ben Grossmann Jun 23 '23 at 20:37
  • I will look them up later. In the meantime, I'd like to thank you for all the help and availability – Claudio Menchinelli Jun 23 '23 at 21:06