
Let $X_0, A \in \mathbf R^{n \times n}$ be symmetric positive semidefinite matrices, and consider the equation $$ \dot X = - X A X, \qquad X(0) = X_0. $$ Does this equation admit an explicit solution (possibly in terms of the eigenvalues and eigenvectors of $A$ or $X_0$)? If not, is there a standard way of approaching it, in order, for example, to deduce convergence rates to equilibrium?

Edit: diagonalizing $A$ as $V D V^T$ and introducing $Y = V^TXV$, we can write the following equation for $Y$: $$ \dot Y = - YDY, \qquad Y(0) = Y_0:= V^T X_0V.$$
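A quick numerical sanity check of this change of variables (a sketch with randomly generated PSD matrices; all variable names here are my own):

```python
import numpy as np

# Hypothetical small example: random symmetric PSD matrices A and X0.
rng = np.random.default_rng(0)
n = 4
M = rng.standard_normal((n, n)); A = M @ M.T    # symmetric PSD
M = rng.standard_normal((n, n)); X0 = M @ M.T   # symmetric PSD

# Diagonalize A = V D V^T (V orthogonal) and set Y0 = V^T X0 V.
D_vals, V = np.linalg.eigh(A)
D = np.diag(D_vals)
Y0 = V.T @ X0 @ V

# The change of variables maps one vector field onto the other:
# V^T (X0 A X0) V = (V^T X0 V)(V^T A V)(V^T X0 V) = Y0 D Y0.
lhs = -V.T @ (X0 @ A @ X0) @ V
rhs = -Y0 @ D @ Y0
assert np.allclose(lhs, rhs)
```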

  • Is it something like $X=(At+C)^{-1}$, where $\dot{x}=\frac{dx}{dt}$? – snulty Mar 10 '19 at 12:10
  • Maybe that's only the case when $X_0$ is invertible – snulty Mar 10 '19 at 12:20
  • Yes, I was thinking about this too, but it's already a good partial solution. Thank you very much! – Roberto Rastapopoulos Mar 10 '19 at 12:22
  • I was just thinking about the case where $X=f(At)$, and $f$ can be expanded in a power series, so that $A$ commutes with $X$, and then you can just solve the ode $\dot{X}=-AX^2$ which is separable. – snulty Mar 10 '19 at 12:27
  • actually now that I think about it, that might not work when $X_0$ and $A$ don't commute either. – snulty Mar 13 '19 at 17:47
  • Good point. For now I think your first comment covers my needs, but I'll revisit the question when I'll have more time. Thank you very much anyway! (and feel free to post your comment as an answer). – Roberto Rastapopoulos Mar 14 '19 at 18:16

1 Answer


There was a silly typo in my original comment, but after a discussion with a friend I think there is a way to arrive at an answer in some cases, including the one from the original comment.

Given $\dot{X}=-XAX$, we can try to write the solution $X$ as a power series about $t=0$. Starting from $X(0)=X_0$,

$$\dot{X}(0)=-X_0AX_0=(-1)^1 (X_0A)^1X_0$$ $$\ddot{X}(0)=+2X_0AX_0AX_0=(-1)^2\cdot 2\cdot (X_0A)^2X_0$$ $$\dddot{X}(0)=(-1)^3\cdot 6\cdot (X_0A)^3X_0$$ $$X^{(n)}(0)=(-1)^n\cdot n!\cdot (X_0A)^nX_0 $$

So, setting aside questions of convergence and invertibility for the moment, it seems that

$$X=\sum_{n=0}^{\infty}(-1)^n (X_0A)^nX_0t^n=(\mathbb{1}+X_0At)^{-1}X_0$$
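As a sanity check, here is a short numerical sketch (names, step size, and tolerance are my own choices) comparing this candidate closed form against a direct Runge–Kutta integration of $\dot X = -XAX$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
M = rng.standard_normal((n, n)); A = M @ M.T    # symmetric PSD
M = rng.standard_normal((n, n)); X0 = M @ M.T   # symmetric PSD
I = np.eye(n)

def closed_form(t):
    # Candidate solution X(t) = (I + X0 A t)^{-1} X0.
    return np.linalg.solve(I + X0 @ A * t, X0)

def f(X):
    # Right-hand side of the ODE dX/dt = -X A X.
    return -X @ A @ X

# Classical RK4 integration up to t = 1.
X, t, h = X0.copy(), 0.0, 1e-3
while t < 1.0 - 1e-12:
    k1 = f(X)
    k2 = f(X + 0.5 * h * k1)
    k3 = f(X + 0.5 * h * k2)
    k4 = f(X + h * k3)
    X = X + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    t += h

assert np.allclose(X, closed_form(1.0), atol=1e-6)
```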

You could also choose to factor out each of the derivatives as $X^{(n)}(0)=(-1)^n\cdot n!\cdot X_0 (AX_0)^n$ so that

$$X=\sum_{n=0}^{\infty}(-1)^n X_0(AX_0)^nt^n=X_0(\mathbb{1}+AX_0t)^{-1}$$
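The two factorizations agree, which a quick check illustrates (the identity $X_0(\mathbb{1}+AX_0t) = (\mathbb{1}+X_0At)X_0$ is behind it; random PSD test matrices of my choosing):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
M = rng.standard_normal((n, n)); A = M @ M.T    # symmetric PSD
M = rng.standard_normal((n, n)); X0 = M @ M.T   # symmetric PSD
I = np.eye(n)

t = 0.7
left = np.linalg.solve(I + X0 @ A * t, X0)      # (I + X0 A t)^{-1} X0
right = X0 @ np.linalg.inv(I + A @ X0 * t)      # X0 (I + A X0 t)^{-1}
assert np.allclose(left, right)
```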

This answer by Robert Israel points out something to be wary of for non-Hermitian positive semidefinite matrices, but from the last part of that answer it looks like $AX_0$ and $X_0A$ should have nonnegative eigenvalues, so the series should at least converge for $0\leq t < \rho(AX_0)^{-1}$, where $\rho(AX_0)$ is its spectral radius. I'm not very familiar with these things, so maybe someone can correct me on this.
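On the eigenvalue point: when $X_0$ is positive definite, $AX_0$ is similar to $X_0^{1/2} A X_0^{1/2}$, which is PSD, so its eigenvalues are real and nonnegative. A small numerical illustration (random matrices; naming is mine):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5
M = rng.standard_normal((n, n)); A = M @ M.T    # symmetric PSD
M = rng.standard_normal((n, n)); X0 = M @ M.T   # symmetric PSD

# A @ X0 is not symmetric in general, but it is similar to the PSD
# matrix X0^{1/2} A X0^{1/2}, so its spectrum is real and nonnegative.
eig = np.linalg.eigvals(A @ X0)
assert np.allclose(eig.imag, 0, atol=1e-8)
assert (eig.real > -1e-8).all()
```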

Although, by using $\frac{d}{dt}U^{-1}=-U^{-1}\frac{dU}{dt}U^{-1}$ with $U = \mathbb{1}+X_0At$, one can differentiate directly and check that $X(t)=(\mathbb{1}+X_0At)^{-1}X_0$, or equivalently $X(t)=X_0(\mathbb{1}+AX_0t)^{-1}$, solves the equation in general, not just within the radius of convergence of the series.
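This differentiation can also be checked numerically with a central finite difference (a sketch; step size, tolerance, and names are my own):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 3
M = rng.standard_normal((n, n)); A = M @ M.T    # symmetric PSD
M = rng.standard_normal((n, n)); X0 = M @ M.T   # symmetric PSD
I = np.eye(n)

def X(t):
    # Candidate solution X(t) = (I + X0 A t)^{-1} X0.
    return np.linalg.solve(I + X0 @ A * t, X0)

# Central-difference derivative at t = 0.5 versus the right-hand side -X A X.
t, h = 0.5, 1e-6
dX = (X(t + h) - X(t - h)) / (2 * h)
assert np.allclose(dX, -X(t) @ A @ X(t), atol=1e-5)
```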

snulty
  • If $X_0$ is invertible (and positive definite), then $(1 + X_0 A t) = X_0(X_0^{-1} + At)$. Since $X_0^{-1} + At$ is always positive definite, this implies that $1 + X_0 A t$ is invertible with inverse $(X_0^{-1} + At)^{-1} X_0^{-1}$, right? – Roberto Rastapopoulos Mar 15 '19 at 11:48
  • Yeah, I think so, so it should be consistent with the first comment as well. I thought I had said the constant matrix $C$ should be $X_0$ when it's actually $X_0^{-1}$, but it seems I didn't originally. – snulty Mar 15 '19 at 12:53
  • In fact, I was pointing out that the condition $0 \leq t \leq \rho(AX_0)^{-1}$ seems unnecessary? (As you say in your last paragraph). – Roberto Rastapopoulos Mar 15 '19 at 14:28
    @RobertoRastapopoulos Yeah, I put it in for convergence of the series solution, but if you take the answer from summing the series and just differentiate it, it still seems to work. That was all. – snulty Mar 15 '19 at 15:05