5

I need help in proving the following theorem:

If $M(t)$ is an $n \times n$ matrix of differentiable functions, then
$$ \frac{d}{dt}\left( \exp(M(t))\right) = \frac{d}{dt}M(t) \exp(M(t)) = \exp(M(t)) M'(t) $$ if and only if $M(t)$ and $\frac{d}{dt} M(t)$ commute.

Please note this is not homework: I am reading about the chain rule for matrix exponentials and came across this statement.

Jonny
  • 71

3 Answers

5

Clearly Avitus proves nothing at all!

The key is the derivative of $f(t)=M(t)^k$. Indeed, $f'(t)=\sum_{i=1}^k M^{i-1}M'M^{k-i}$. If $M'$ and $M$ commute, then $f'(t)=kM'M^{k-1}=kM^{k-1}M'$. Since $\exp$ is an entire function, we may differentiate the associated series term by term:
$$(\exp(M))'=\sum_{i=1}^{\infty}\left(\frac{M^i}{i!}\right)'=\sum_{i=1}^{\infty}\frac{M'M^{i-1}}{(i-1)!}=\sum_{i=1}^{\infty}\frac{M^{i-1}}{(i-1)!}\,M'=M'\exp(M)=\exp(M)M',$$
and we are done.
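For a quick numerical sanity check of this direction, here is a sketch assuming NumPy/SciPy are available; $M(t)=tA+t^2A^2$ is just one convenient family that commutes with its own derivative:

```python
# Sanity check of the forward direction, assuming numpy/scipy.
# M(t) = t*A + t^2*A^2 is a polynomial in a fixed matrix A, so
# M(t) commutes with M'(t) = A + 2t*A^2 for every t.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
A = 0.5 * rng.standard_normal((4, 4))

M  = lambda t: t * A + t**2 * (A @ A)        # M(t)
dM = lambda t: A + 2 * t * (A @ A)           # M'(t)

t, h = 0.7, 1e-6
numeric = (expm(M(t + h)) - expm(M(t - h))) / (2 * h)   # central difference
print(np.max(np.abs(numeric - dM(t) @ expm(M(t)))))     # tiny: matches M'(t) exp(M(t))
print(np.max(np.abs(numeric - expm(M(t)) @ dM(t))))     # tiny: matches exp(M(t)) M'(t)
```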

About the converse: assume that for every $t$, $(\exp(M))'=M'\exp(M)=\exp(M)M'$. Does this imply $M'M=MM'$? I don't know.

EDIT 1. Assume that whenever $\lambda,\mu$ are distinct eigenvalues of $M$ we have $\exp(\lambda)\neq\exp(\mu)$; then a matrix $B$ commutes with $M$ iff $B$ commutes with $\exp(M)$. Thus if $M$ is $2i\pi$-congruence free (that is, for every $\lambda,\mu\in\operatorname{spectrum}(M)$, $\lambda-\mu\notin 2i\pi\mathbb{Z}\setminus\{0\}$), then $M'M=MM'$. It remains to study the values of $t$ such that $M(t)$ is not $2i\pi$-congruence free. For instance, take $M(t)=\begin{pmatrix}t^2&-2\pi\\2\pi&t\end{pmatrix}$. Then $\exp(M(0))=I_2$ commutes with $M'(0)=\begin{pmatrix}0&0\\0&1\end{pmatrix}$, but $M'(0)$ and $M(0)$ do not commute. I think that, in general, there are not many such values of $t$ (here $t=0$); thus if we assume that $M'$ is continuous, then the result seems to be true. Moreover, note that $(\exp(M))'_{t=0}=\frac{1}{2}I_2$, so the equality $(\exp(M))'=\exp(M)M'$ is not satisfied when $t=0$.
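Here is a quick numerical check of this counterexample at $t=0$, as a sketch assuming NumPy/SciPy:

```python
# Check the t = 0 counterexample: M(t) = [[t^2, -2*pi], [2*pi, t]].
import numpy as np
from scipy.linalg import expm

M0  = np.array([[0.0, -2 * np.pi],
                [2 * np.pi, 0.0]])    # M(0)
dM0 = np.array([[0.0, 0.0],
                [0.0, 1.0]])          # M'(0), since M'(t) = [[2t, 0], [0, 1]]

E = expm(M0)
print(np.round(E, 10))                    # exp(M(0)) = I_2 (rotation by 2*pi)
print(np.max(np.abs(E @ dM0 - dM0 @ E)))  # ~0: exp(M(0)) commutes with M'(0)
print(M0 @ dM0 - dM0 @ M0)                # nonzero: M(0) and M'(0) don't commute
```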

EDIT 2. Let $\operatorname{ad}(M):\mathcal{M}_n\rightarrow\mathcal{M}_n$, $H\mapsto MH-HM$. Then the derivative of $\exp$ is $D\exp(M)(H)=\exp(M)\,\phi(M)(H)$, where $\phi(M)=\dfrac{1-\exp(-\operatorname{ad}(M))}{\operatorname{ad}(M)}$. Here $(\exp(M))'=\exp(M)\phi(M)(M')=\exp(M)M'$, which implies $\phi(M)(M')=M'$, that is, $M'$ lies in the eigenspace of $\phi(M)$ associated to the eigenvalue $1$ (assume $M'\neq 0$). Let $(\lambda_i)_i$ be the spectrum of $M$. Then the spectrum of $\operatorname{ad}(M)$ is $(\mu_{i,j})_{i,j}=(\lambda_i-\lambda_j)_{i,j}$ and the spectrum of $\phi(M)$ is $(\rho_{i,j})_{i,j}=\left(\dfrac{1-\exp(-\mu_{i,j})}{\mu_{i,j}}\right)_{i,j}$. Note that $\rho_{i,j}=1$ implies that either $\mu_{i,j}=0$ (which is the case when $i=j$) or $\mu_{i,j}\neq 0$ and $\exp(-\mu_{i,j})=1-\mu_{i,j}$, an equation that has solutions in $\mathbb{C}$.
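A rough numerical illustration of this formula, as a sketch assuming NumPy/SciPy: $\operatorname{ad}(M)$ is represented as an $n^2\times n^2$ matrix acting on row-major vectorized matrices, and $\phi$ is evaluated through its power series $\sum_{k\ge 0}(-x)^k/(k+1)!$.

```python
# Check d/dt exp(M(t)) = exp(M) * phi(ad(M))(M') for M(t) = A + t*B,
# where phi(x) = (1 - exp(-x))/x, assuming numpy/scipy.
import numpy as np
from math import factorial
from scipy.linalg import expm

rng = np.random.default_rng(1)
A, B = rng.standard_normal((2, 3, 3))       # M(t) = A + t*B, so M'(t) = B
M, dM = (lambda t: A + t * B), B

t, n = 0.4, 3
I = np.eye(n)
ad = np.kron(M(t), I) - np.kron(I, M(t).T)  # ad(M) acting on row-major vec(H)

phi = sum((-1) ** k * np.linalg.matrix_power(ad, k) / factorial(k + 1)
          for k in range(30))               # truncated series for phi(ad(M))

analytic = expm(M(t)) @ (phi @ dM.flatten()).reshape(n, n)
h = 1e-6
numeric = (expm(M(t + h)) - expm(M(t - h))) / (2 * h)
print(np.max(np.abs(analytic - numeric)))   # tiny: the formula checks out
```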

EDIT 3. OK, the solution is given in: W. Ma and B. Shekhtman, "Do the chain rules for matrix functions hold without commutativity?", Linear and Multilinear Algebra, Vol. 58, No. 1 (2010), pp. 79–87.

Since $A$ and $\exp(A)$ always commute, $A\exp(A)-\exp(A)A=0$ identically. Differentiating and using both assumed equalities $(\exp(A))'=A'\exp(A)=\exp(A)A'$ gives $$(A\exp(A)-\exp(A)A)'=0=A'\exp(A)+AA'\exp(A)-\exp(A)A'A-\exp(A)A'=[A,A']\exp(A),$$ and since $\exp(A)$ is invertible, this implies $[A,A']=0$.

Moreover, $(\exp(A))'=A'\exp(A)$ alone does not imply $[A,A']=0$. Take $A(t)=\begin{pmatrix}c&F(t)\\0&0\end{pmatrix}$, where $F$ is a non-constant differentiable function and $c\neq 0$ satisfies $\exp(c)=1+c$. Note that $A'\exp(A)\neq\exp(A)A'$. This counterexample lives in $M_2(\mathbb{C})$; from it one can deduce a counterexample in $M_2(\mathbb{R})$.
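Here is a numerical illustration, as a sketch assuming NumPy/SciPy; $F(t)=t$ is just one concrete (hypothetical) choice, and a nonzero root $c$ of $\exp(c)=1+c$ is found by a Newton iteration from an ad-hoc complex starting guess:

```python
# Counterexample to "(exp A)' = A' exp(A) implies [A, A'] = 0", assuming
# numpy/scipy, with the hypothetical choice F(t) = t.
import numpy as np
from scipy.linalg import expm

c = 2.0 + 7.0j                          # ad-hoc starting guess
for _ in range(50):                     # Newton's method for exp(c) - 1 - c = 0
    c -= (np.exp(c) - 1 - c) / (np.exp(c) - 1)
print(c, abs(np.exp(c) - 1 - c))        # a nonzero root, tiny residual

A  = lambda t: np.array([[c, t], [0, 0]], dtype=complex)  # A(t) with F(t) = t
dA = np.array([[0, 1], [0, 0]], dtype=complex)            # A'(t)

t, h = 0.3, 1e-6
numeric = (expm(A(t + h)) - expm(A(t - h))) / (2 * h)
print(np.max(np.abs(numeric - dA @ expm(A(t)))))          # ~0: (exp A)' = A' exp(A)
print(np.max(np.abs(dA @ expm(A(t)) - expm(A(t)) @ dA)))  # nonzero
print(A(t) @ dA - dA @ A(t))                              # nonzero: [A, A'] != 0
```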

0

To close the loop that loup blanc started, let us suppose that the derivative identity is satisfied and look at loup's term-by-term derivative of the series. You can slowly sneak each $M'$ out to the right of the series by replacing $M'M$ with $MM' + [M', M]$. Eventually you end up with a bunch of commutator terms, and since these end up looking like $[M, M']^n$ for arbitrary $n$, the only way these terms can all vanish for arbitrary $M$ is if $[M, M'] = 0$.
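As a sanity check on the pushing-through step, the expansion quoted in the comments below can be verified symbolically; this is a sketch assuming SymPy, with the noncommutative symbol `Mp` standing for $M'$:

```python
# Verify M'M^3 = M^3 M' + M^2 [M',M] + M [M',M] M + [M',M] M^2
# with noncommutative symbols, assuming sympy.
import sympy as sp

M, Mp = sp.symbols("M Mp", commutative=False)   # Mp stands for M'
C = Mp * M - M * Mp                             # the commutator [M', M]

lhs = Mp * M**3
rhs = M**3 * Mp + M**2 * C + M * C * M + C * M**2
print(sp.expand(lhs - rhs))                     # 0: the identity holds
```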

webb
  • 134
  • you obtain, for instance, $M'M^3=M^3M'+M^2[M',M]+M[M',M]M+[M',M]M^2$. What do you do with that? –  Nov 08 '13 at 20:26
  • Well, you have assumed that the derivative is equal to the exponential times $M'$, so now you have $\exp(M) M'$ plus terms proportional to $[M', M]$, and because you've assumed that the exponential commutes with $M'$, you are left to conclude that the terms proportional to $[M', M]$ have to vanish for arbitrary $M$. – webb Nov 08 '13 at 22:45
  • read my edit above. You will find a counterexample (for $t=0$) that proves that your reasoning is false. –  Nov 08 '13 at 22:57
  • Hi webb. More precisely, I think that you want to prove that $M'\exp(M)=\exp(M)M'+\alpha[M',M]$. Where is the proof? –  Nov 08 '13 at 23:52
  • When I said $\propto [M, M']$ I did not mean a proportionality constant, I meant a series that contains first powers of the commutator in them. The converse statement, going backwards, is that you have assumed that $M' e^M = e^M M' = (e^M)'$, and so what I have said is compute $(e^M)'$ in general, and then show what is required for it to equal $M' e^M = e^M M'$ which is that the commutators vanish. – webb Nov 09 '13 at 23:17
  • OK webb, I think that the best we can do is stop the discussion. –  Nov 10 '13 at 10:30
-1

To start with, I show the first equality.

Using the definition of the derivative,

$$\frac{d\exp(M(t))}{dt}=\lim_{h\rightarrow 0}\frac{\exp(M(t+h))-\exp(M(t))}{h}=\lim_{h\rightarrow 0}\frac{\exp(M(t+h)-M(t)+M(t))-\exp(M(t))}{h}=\\ \left(\lim_{h\rightarrow 0}\frac{\exp(M(t+h)-M(t))-1}{h}\right)\exp(M(t));$$

Using the definitions of the derivative and of the exponential function,

$$M(t+h)-M(t)=\frac{dM}{dt}h+O(h^2) $$ $$\exp(M(t+h)-M(t))=1+\frac{dM}{dt}h+O(h^2)$$

we arrive at

$$\frac{d\exp(M(t))}{dt}=\left(\lim_{h\rightarrow 0}\frac{\exp(M(t+h)-M(t))-1}{h}\right) \exp(M(t))=\\ \left(\lim_{h\rightarrow 0}\frac{dM}{dt}+O(h)\right) \exp(M(t))=\frac{dM}{dt}\exp(M(t)). $$

Avitus
  • 14,018
  • Going from your second line to your third line, you're implicitly using $\exp(A(t) + B(t)) = \exp(A(t))\exp(B(t))$ which is not true in general; in particular, it holds if (and only if?) $A(t)$ and $B(t)$ commute. – BaronVT Nov 06 '13 at 23:20
  • this is a good point: should I specify it a bit more? – Avitus Nov 07 '13 at 06:52
  • @BaronVT $\exp(A(t)+B(t))=\exp(A(t))\exp(B(t))$ is not true in general; in particular, it holds if $A(t)$ and $B(t)$ commute. The converse is false. –  Nov 08 '13 at 18:49