
This question comes from an exam in my functional analysis class.

Suppose $X$ is a Banach space, and $T \in B(X,X)$ is a bounded linear operator on $X$. For any non-negative integer $n$, let $$S_n=\sum_{k=0}^n \frac{1}{k!} T^k$$ where $T^k$ is the composition of $T$ with itself $k$ times and $T^0=I$.

We can show that for any integer $k>0$, $\Vert T^k \Vert \le \Vert T \Vert ^k$. Then we can show that $S_n \in B(X,X)$ and there is some $S \in B(X,X)$ such that $S_n \to S$. We write $S = e^T$ for this operator.
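
(As a side remark, not part of the exam: in finite dimensions this convergence is easy to observe numerically. Below is a minimal sketch, using an arbitrary test matrix of my own choosing and comparing the partial sums against `scipy.linalg.expm`.)

```python
# Minimal sketch (my own illustration): the partial sums S_n of the exponential
# series converge to the matrix exponential in finite dimensions.
import numpy as np
from scipy.linalg import expm

T = np.random.default_rng(0).standard_normal((4, 4))  # arbitrary test matrix

S_n = np.zeros_like(T)
term = np.eye(4)                   # T^0 / 0! = I
for k in range(40):
    S_n = S_n + term
    term = term @ T / (k + 1)      # next term: T^(k+1) / (k+1)!

print(np.linalg.norm(S_n - expm(T)))   # tiny: the partial sums converge to e^T
```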

Finally, we were asked to show that $e^T$ is invertible and that its inverse is $e^{-T}$. My thought is to prove the following claim first: if $A,B \in B(X,X)$ and $AB = BA$, then $e^A e^B = e^{A+B}$. If the claim is true, it follows that $e^T e^{-T} = e^{-T} e^T = I$.

The claim can be proved provided that the product of the two series can be computed as a Cauchy product:

$$e^Ae^B=\sum_{i=0}^{\infty}\frac{A^i}{i!}\sum_{j=0}^{\infty}\frac{B^j}{j!}=\sum_{k=0}^{\infty}\sum_{l=0}^{k}\frac{A^lB^{k-l}}{l!(k-l)!}$$ $$=\sum_{k=0}^{\infty}\frac{1}{k!}\sum_{l=0}^{k}\frac{k!}{l!(k-l)!}A^lB^{k-l}= \sum_{k=0}^{\infty}\frac{1}{k!}(A+B)^k= e^{A+B}$$

But why can the product series be summed in the Cauchy way? I know that for real-number series, by Cauchy's theorem, if $\sum_{n=1}^{\infty} a_n$ and $\sum_{n=1}^{\infty} b_n$ converge absolutely to $A$ and $B$, respectively, then we can sum the products $a_i b_j$ in any order and the resulting series converges to $AB$. Does this proposition still hold for commuting operators? (Any proof ideas or references would be greatly appreciated.)
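
For concreteness, here is a small numerical illustration of the identity I want, and of its failure without commutativity; the matrices are of my own choosing and this is of course not a proof:

```python
# Numerical illustration (not a proof): e^A e^B = e^{A+B} holds for a
# commuting pair of matrices and fails for a non-commuting pair.
import numpy as np
from scipy.linalg import expm

A = np.diag([1.0, 2.0, 3.0])
B = A @ A                      # a polynomial in A, so AB = BA
print(np.linalg.norm(expm(A) @ expm(B) - expm(A + B)))   # essentially zero

C = np.array([[0.0, 1.0], [0.0, 0.0]])
D = np.array([[0.0, 0.0], [1.0, 0.0]])                   # CD != DC
print(np.linalg.norm(expm(C) @ expm(D) - expm(C + D)))   # clearly nonzero
```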

Jiaqi Li
    This is not true in general, because the operators $A,B$ may not commute. Use the idea for $A=T$ and $B=-T$, and for absolute convergence of an operator series $\sum_kA_k$, use the convergence of $\sum_k|A_k|$, then all the theorems you know are still valid (and shown much the same way as in real analysis). –  Oct 06 '17 at 06:03
  • @ProfessorVector Thank you for your quick reply. What if $A$ and $B$ do commute? I can't do it the same way as in real analysis, because in that proof we compare real numbers, but operators cannot be compared. – Jiaqi Li Oct 06 '17 at 06:08
  • As I said: use the convergence of the series of the norms, norms are real numbers, moreover, $|AB|\le|A| |B|$. –  Oct 06 '17 at 06:12
  • See the finite dimensional case: https://math.stackexchange.com/questions/370817/commuting-in-matrix-exponential. – Martín-Blas Pérez Pinilla Oct 06 '17 at 07:28
  • Note that $T$ and $T^{-1}$ always commute. – Math1000 Oct 06 '17 at 09:06

2 Answers


In any Banach algebra, the Cauchy product of two absolutely convergent series is absolutely convergent, and with the expected sum. That is, if $\sum_{j=0}^\infty \|a_j\|<\infty$ and $\sum_{k=0}^\infty \|b_k\|<\infty$, then defining $c_m = \sum_{j+k=m}a_jb_k$, we get an absolutely convergent series and $$ \sum_{m=0}^\infty c_m = \left(\sum_{j=0}^\infty a_j \right)\left(\sum_{k=0}^\infty b_k\right) \tag1 $$ The proof is literally the same as the proof for real/complex numbers. One doesn't even need $a_j$ and $b_k$ to commute to have (1), because they are always multiplied in the same order. But here is a proof anyway.

Proof: Let $A_n$, $B_n$, $C_n$ denote partial sums over indices $0,\dots,n$. Consider the difference $A_nB_n-C_n$. It consists of all terms $a_j b_k$ where $j,k\le n$ and $j+k>n$. By the triangle inequality, it suffices to prove that $$ \sum_{j,k\le n, \ j+k>n} \|a_j\| \,\|b_k\| \tag2 $$ is small when $n$ is large. Since either $j$ or $k$ has to be $>n/2$, we can estimate (2) from above by $$ \sum_{j\le n,\ n/2<k\le n} \|a_j\| \,\|b_k\| + \sum_{n/2<j\le n,\ k\le n} \|a_j\| \,\|b_k\| \tag 3$$ which can be rewritten as $$ \sum_{j\le n} \|a_j\| \sum_{n/2<k\le n} \|b_k\| + \sum_{k\le n}\|b_k\| \sum_{n/2<j\le n} \|a_j\| \tag 4$$ As $n\to \infty$, the first factor in each product stays bounded while the second factor goes to zero.
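
(A small numerical illustration of this estimate, added for concreteness with arbitrary test matrices: the quantity $\|A_nB_n-C_n\|$, i.e. the contribution of the terms with $j+k>n$, does go to zero as $n$ grows.)

```python
# Illustration of the estimate above: for the exponential series of two
# matrices, ||A_n B_n - C_n|| (the terms with j + k > n) tends to zero.
# The test matrices are arbitrary; this is only a sanity check of the bound.
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((3, 3))
Y = rng.standard_normal((3, 3))

def series_terms(M, n):
    """Return the terms a_k = M^k / k! for k = 0, ..., n."""
    out, t = [], np.eye(M.shape[0])
    for k in range(n + 1):
        out.append(t)
        t = t @ M / (k + 1)
    return out

for n in (5, 10, 20, 30):
    a = series_terms(X, n)
    b = series_terms(Y, n)
    A_n, B_n = sum(a), sum(b)
    C_n = sum(a[j] @ b[k] for j in range(n + 1) for k in range(n + 1 - j))
    print(n, np.linalg.norm(A_n @ B_n - C_n))   # decreases towards zero
```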

  • Thank you! This proof is very clear, and it is sufficient for the question in my exam. I still wonder if the result for Cauchy product can be extended to any product $a_i b_j$ provided absolute continuity (as in real numbers). I think it might not be true, because Cauchy product is somehow special. – Jiaqi Li Oct 06 '17 at 21:06
  • "absolute continuity"? I do not understand what result do you want to prove. –  Oct 06 '17 at 22:23
  • Excuse me. I meant to say "absolute convergence". Say $\sum a_n$ and $\sum b_n$ are absolutely convergent. Can we deduce that summing $a_i b_j$ in any order will produce the same result? – Jiaqi Li Oct 06 '17 at 22:25
  • Yes. It's the same computation for operators as for numbers. Given $\epsilon>0$, there exists a finite set of terms $a_ib_j$ such that the rest of them have absolute value sum $<\epsilon$. So no matter how you order them, eventually the partial sum will contain that finite set, and the remainder has effect $<\epsilon$. –  Oct 06 '17 at 23:38
  • Ah, I see. Thank you very much. The notion of "Banach algebra" is really helpful (I didn't learn it in my functional analysis class). – Jiaqi Li Oct 08 '17 at 02:42
  • I don't understand this answer. Would you please explain it with more detail? – Mina Nov 05 '21 at 16:56

Technically, the post itself only asks about showing that the inverse of $e^{S}$ is $e^{-S}$. First, you can check that the power series $$e^{tS}=\sum\limits_{n=0}^\infty\frac{t^n}{n!}S^n$$ makes sense for all $t\in\mathbb{R}, S\in\mathcal{L}(E).$ Differentiating in $t$ gives $$\frac{d}{dt}e^{tS}=Se^{tS}=e^{tS}S.$$ Next, differentiate $e^{(s+t)S}e^{-tS}$ in $t$. You'll get that it's zero, so it's constant in $t$, and evaluation at $t=0$ gives $$e^{(s+t)S}e^{-tS}=e^{sS}.$$ Evaluate this at $s=0$ to get that $e^{tS}e^{-tS}=I$, which provides the desired inversion property (take $t=1$).
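
(A numerical sanity check of the two facts used above, with an arbitrary matrix of my own choosing; this is only an illustration, not part of the argument.)

```python
# Sanity check (illustration only, arbitrary matrix S): the derivative
# d/dt e^{tS} = S e^{tS}, and the inversion property e^{tS} e^{-tS} = I.
import numpy as np
from scipy.linalg import expm

S = np.random.default_rng(2).standard_normal((4, 4))
t, h = 0.7, 1e-6

# central finite-difference approximation of d/dt e^{tS}
deriv = (expm((t + h) * S) - expm((t - h) * S)) / (2 * h)
print(np.linalg.norm(deriv - S @ expm(t * S)))                 # small
print(np.linalg.norm(expm(t * S) @ expm(-t * S) - np.eye(4)))  # ~ machine eps
```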

Next, we'll answer the question in your title. Compute that $$\frac{d}{dt}(e^{t(S+T)}e^{-tT}e^{-tS})=e^{t(S+T)}Se^{-tT}e^{-tS}-e^{t(S+T)}e^{-tT}Se^{-tS}.$$ We claim this equals zero. Indeed, the fact that $ST=TS$ implies that $$e^{-tT}S=\sum\limits_{n=0}^\infty \frac{(-t)^n}{n!}T^nS=S\sum\limits_{n=0}^\infty \frac{(-t)^n}{n!}T^n=Se^{-tT},$$ so the two terms on the right-hand side above cancel and the derivative is indeed zero.

Hence, $$e^{t(S+T)}e^{-tT}e^{-tS}$$ is constant in $t$. Evaluating at $t=0$ gives that $$e^{t(S+T)}e^{-tT}e^{-tS}=I.$$ Next, multiply on the right by $e^{tS}$, then by $e^{tT}$, and use the inversion property that we established earlier. This shows that $$e^{t(S+T)}=e^{tS}e^{tT}.$$ Evaluating at $t=1$ gives the result that you wanted.

This latter argument can be done directly using the holomorphic functional calculus, as well.
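
(For what it's worth, the mechanism of the argument, namely that $F(t)=e^{t(S+T)}e^{-tT}e^{-tS}$ has zero derivative and equals $I$, can also be checked numerically for a commuting pair; the matrices below are my own choice, with $T$ a polynomial in $S$.)

```python
# Numerical check (illustration only): for commuting S and T, the map
# F(t) = e^{t(S+T)} e^{-tT} e^{-tS} has zero derivative and equals I.
import numpy as np
from scipy.linalg import expm

S = np.diag([0.5, -1.0, 1.5])
T = 3.0 * S + S @ S            # a polynomial in S, hence ST = TS

def F(t):
    return expm(t * (S + T)) @ expm(-t * T) @ expm(-t * S)

t, h = 0.9, 1e-6
print(np.linalg.norm((F(t + h) - F(t - h)) / (2 * h)))   # ~0: F is constant
print(np.linalg.norm(F(1.0) - np.eye(3)))                # ~0: F(t) = I
```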

cmk