6

How to prove that $e^{A \oplus B} = e^A \otimes e^B$? Here $A$ and $B$ are $n\times n$ and $m \times m$ matrices, $\otimes$ is the Kronecker product and $\oplus$ is the Kronecker sum: $$ A \oplus B = A\otimes I_m + I_n\otimes B, $$ where $I_m$ and $I_n$ are the identity matrices of size $m\times m$ and $n\times n$, respectively.

EDIT: Actually, the page http://mathworld.wolfram.com/KroneckerSum.html tells us this property is true.

http://digitalcommons.unf.edu/cgi/viewcontent.cgi?article=1025&context=etd

Appliqué
  • 8,576
  • 1
    What's your definition of $\mathrm{e}^A$? (Power series are common, as are other methods.) – Eric Towers Mar 12 '14 at 05:32
  • I know $n\times n$ matrices form a ring under certain conditions. – Adam Staples Mar 12 '14 at 05:32
  • 6
    Do $A$ and $B$ commute? More to the point, do you know how to prove this when $A,B$ are numbers? I don't mean "it is a law of exponents that..." The exponential is defined by a power series. Using that definition, do you know how to prove the equality? – Andrés E. Caicedo Mar 12 '14 at 05:36
  • 2
    If $[A,B]\ne0$ then an upgrade is needed: the BCH formula. – anon Mar 12 '14 at 05:37
  • Yeah, it's simple using numbers. I've just never seen matrices in an exponent like this before. We are talking about these operations living in a different Hilbert space than the regular sums and addition laws. I'm not sure what steps I am allowed to take and then what kind of justification I should provide. – Adam Staples Mar 12 '14 at 05:41
  • Well, since $A$ and $B$ are $n$ by $n$ matrices and they form a ring, I know multiplication doesn't necessarily commute but addition does, since all rings are abelian groups under addition. This is actually on my friend's second-semester quantum mechanics homework. The math they do in these physics courses is quite different from what we math majors do in our math courses. – Adam Staples Mar 12 '14 at 05:44
  • @EricTowers It doesn't specify on the HW sheet. – Adam Staples Mar 12 '14 at 05:45
  • Like for example I wouldn't know how to justify that $e^{AI+IB} = e^{AI}e^{IB}$. – Adam Staples Mar 12 '14 at 05:50
  • Have there been any lectures, other handouts, or sections of the text defining the exponential function applied to matrices? Alternatively, does the class have a prerequisite in which this definition would appear? – Eric Towers Mar 12 '14 at 05:54
  • http://mathworld.wolfram.com/MatrixExponential.html http://mathworld.wolfram.com/KroneckerSum.html http://www.proofwiki.org/wiki/Properties_of_the_Matrix_Exponential http://faculty.uml.edu/dklain/exponential.pdf I think these sites have an explanation, hmm... – Adam Staples Mar 12 '14 at 05:57
  • 1
    If $AB \neq BA$ it should be emphasized that the equation in your title is false. Instead, $e^Ae^B = e^{A+B+\frac{1}{2}[A,B] + \cdots}$ as sea turtles was pointing you towards... this is just the start. The full BCH is an unending series of nested commutators. To know the commutators is to be able to multiply the exponentials. This process is called exponentiation of the algebra to obtain the group... – James S. Cook Mar 12 '14 at 06:03
  • What do you mean by [A,B]? – Adam Staples Mar 12 '14 at 06:05
  • 2
    @AdamStaples $[A,B]=AB-BA$. – Marc van Leeuwen Mar 12 '14 at 06:13
  • @MarcvanLeeuwen I fixed the question. http://mathworld.wolfram.com/KroneckerSum.html According to Wolfram, it is true for all $A,B$ combined with that Kronecker sum. I'm just unfamiliar with this Kronecker sum because the HW is unclear about it. Somehow the Kronecker sum makes this true. I'd agree that without this sum property it would only be true for commuting matrices. Maybe this Kronecker sum is always commutative, I don't know. – Adam Staples Mar 12 '14 at 06:31
  • 1
    These are not sum and products! The notation you were using has a different meaning. Thanks for clarifying. – Andrés E. Caicedo Mar 12 '14 at 06:32
  • 3
    But the question has not (yet) been fixed. You need to insert (and explain!) notably the symbol $\otimes$ (typed \otimes); anyone reading $AI+IB=A+B$ will say "yes, $AI=A$ and $IB=B$, big deal". Also note that $\oplus$ is not a commutative operation, at face value. – Marc van Leeuwen Mar 12 '14 at 07:23
  • @MarcvanLeeuwen Fixed. – Adam Staples Mar 12 '14 at 07:43

5 Answers

9

What is to be proved is the following: $$ e^{A \otimes I_b + I_a \otimes B} = e^A \otimes e^B, $$ where $I_a, A \in M_n$ and $I_b, B \in M_m$.

This is true because $$ A \otimes I_b \quad\text{and}\quad I_a \otimes B$$ commute, which can be shown using the so-called mixed-product property of the Kronecker product, $$ (A \otimes B)\cdot (C \otimes D) = (A\cdot C) \otimes (B\cdot D),$$ where $\cdot$ denotes the ordinary matrix product. Indeed, $(A \otimes I_b)\cdot(I_a \otimes B) = A \otimes B = (I_a \otimes B)\cdot(A \otimes I_b)$.

One can also show that for an arbitrary matrix function $f$, $$f(A\otimes I_b) = f(A)\otimes I_b~~~~\text{and}~~~ f(I_a \otimes B) = I_a \otimes f(B)~.$$ Together with the commutation established above, this gives $$ e^{A \otimes I_b + I_a \otimes B} = e^{A \otimes I_b}\, e^{I_a \otimes B} = (e^A \otimes I_b)(I_a \otimes e^B) = e^A \otimes e^B~.$$
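
As a quick numerical sanity check (not part of the proof, and assuming NumPy and SciPy are available; `np.kron` and `scipy.linalg.expm` do the work, the random test matrices are just an illustration):

```python
import numpy as np
from scipy.linalg import expm  # matrix exponential

rng = np.random.default_rng(0)
n, m = 3, 2
A = rng.standard_normal((n, n))
B = rng.standard_normal((m, m))

# The two Kronecker factors commute, by the mixed-product property.
X = np.kron(A, np.eye(m))   # A (x) I_b
Y = np.kron(np.eye(n), B)   # I_a (x) B
print(np.allclose(X @ Y, Y @ X))                             # True

# Hence e^(A (+) B) = e^A (x) e^B.
print(np.allclose(expm(X + Y), np.kron(expm(A), expm(B))))   # True
```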

Jack Schmidt
  • 55,589
Nana
  • 8,351
  • Then we can take into account everyone else's information? I see. To prove these results, would we just write out the full matrices and use the definition of the Kronecker product? – Adam Staples Mar 12 '14 at 07:37
  • The commutative property can be proved using the mixed-product property. – Nana Mar 12 '14 at 07:44
2

If $A$ and $B$ are commuting $n\times n$ matrices, then by the power series definition of the matrix exponential we have:

$$e^A=\sum_{k=0}^{\infty}\frac{A^k}{k!}$$

Therefore:

$$e^Ae^B=\sum_{k_1=0}^{\infty}\frac{A^{k_1}}{k_1!}\sum_{k_2=0}^{\infty}\frac{B^{k_2}}{k_2!}$$
$$\Rightarrow e^Ae^B=\sum_{k_1=0}^{\infty}\sum_{k_2=0}^{\infty}\frac{A^{k_1}}{k_1!}\frac{B^{k_2}}{k_2!}$$
$$\Rightarrow e^Ae^B=\sum_{k_1=0}^{\infty}\sum_{k_2=0}^{\infty}\frac{(k_1+k_2)!}{(k_1+k_2)!}\frac{A^{k_1}}{k_1!}\frac{B^{k_2}}{k_2!}$$

$$\Rightarrow e^Ae^B=\sum_{k_1=0}^{\infty}\sum_{k_2=0}^{\infty}\frac{1}{(k_1+k_2)!}\binom{k_1+k_2}{k_2}A^{k_1}B^{k_2}$$
Set $k=k_1+k_2$ and group the terms of each total degree $k$:
$$\Rightarrow e^Ae^B=\sum_{k=0}^{\infty}\frac{1}{k!}\sum_{k_2=0}^{k}\binom{k}{k_2}A^{k-k_2}B^{k_2}=\sum_{k=0}^{\infty}\frac{1}{k!}(A+B)^{k}=e^{A+B},$$
where the last step is the binomial theorem, which is valid because $AB=BA$.
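
A quick numerical illustration of both cases (a sketch assuming NumPy and SciPy; taking $B$ to be a polynomial in $A$ guarantees $AB=BA$):

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[1., 2.], [0., 3.]])
B = 2 * A + np.eye(2)      # a polynomial in A, so AB = BA

print(np.allclose(expm(A) @ expm(B), expm(A + B)))   # True: commuting case

# For a non-commuting pair the identity generally fails:
C = np.array([[0., 1.], [0., 0.]])
D = np.array([[0., 0.], [1., 0.]])
print(np.allclose(expm(C) @ expm(D), expm(C + D)))   # False
```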

Alt
  • 2,592
1

First and foremost, the result is not true as stated. It is only true if $A$ and $B$ commute, which is a very restrictive condition for matrices.

To handle the commutative case, one can first consider the formal power series case. In the ring $\Bbb Q[[X,Y]]$ of formal power series with rational coefficients in commuting indeterminates $X,Y$, one defines $\exp(X)$, $\exp(Y)$, and $\exp(X+Y)$ by the usual power series, and the identity $\exp(X)\exp(Y)=\exp(X+Y)$ is easily checked by comparing coefficients of an arbitrary monomial in $X,Y$: both series are equal to $\sum_{k,l\geq0}\binom{k+l}k\frac{X^kY^l}{(k+l)!}$.

Now if one restricts to formal power series with more than exponentially decreasing coefficients, substitution of a concrete value (for instance a matrix) for an indeterminate will give an absolutely convergent power series, whose limit assigns a well defined value to the substitution. If $M$ is your ring of matrices (which is also a topological $K$-vector space for $K=\Bbb R$ or $K=\Bbb C$), and $A,B\in M$ commute, then the substitutions $X:=A,Y:=B$ define, for the appropriate subring $R\subset\Bbb Q[[X,Y]]$, a continuous ring homomorphism $f:R\to M$, whose image lies in the commutative subring $K[A,B]$ of $M$ generated by $A,B$. This homomorphism then satisfies $f(\exp(S))=\exp(f(S))$ (by the definition of matrix exponentiation), so that applying $f$ to $\exp(X)\exp(Y)=\exp(X+Y)$ gives $\exp(A)\exp(B)=\exp(A+B)$.
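
The coefficient comparison can be sketched symbolically, e.g. with SymPy (the truncation order `N` is arbitrary; after truncating, only monomials of total degree below `N` are meaningful):

```python
import sympy as sp

x, y = sp.symbols('x y')   # commuting indeterminates
N = 6                      # truncation order

ex  = sum(x**k / sp.factorial(k) for k in range(N))
ey  = sum(y**k / sp.factorial(k) for k in range(N))
exy = sum((x + y)**k / sp.factorial(k) for k in range(N))

diff = sp.expand(ex * ey - exy)
# every monomial of total degree < N cancels, coefficient by coefficient
low_degree = [t for t in diff.as_ordered_terms()
              if sp.Poly(t, x, y).total_degree() < N]
print(low_degree)          # []
```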

  • 1
    Actually, the result is true as stated (in particular, $A$ and $B$ need not commute). The reasoning is that $A$ and $B$ are acting on different subspaces (it is a Kronecker sum, not a regular matrix sum). – Physics Enthusiast Apr 02 '18 at 14:53
  • @PhysicsEnthusiast: I think this question did not read the way it does now at the time I wrote this answer, since I don't address Kronecker sums and products in any way. I'll be happy to delete the answer. – Marc van Leeuwen Apr 03 '18 at 11:39
0

A way to proceed: if $A$ and $B$ commute and are diagonalizable, they are simultaneously diagonalizable (otherwise one must fall back on the Jordan decomposition). For diagonal matrices the formula is easy, because it reduces to the property of the exponential for real numbers.
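
A sketch of this approach in the diagonalizable case (assuming NumPy and SciPy; the pair is built to share an eigenbasis $P$, which forces $AB=BA$):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
P = rng.standard_normal((3, 3))    # shared eigenbasis (invertible almost surely)
Pinv = np.linalg.inv(P)
d1, d2 = rng.standard_normal(3), rng.standard_normal(3)

A = P @ np.diag(d1) @ Pinv         # simultaneously diagonalizable,
B = P @ np.diag(d2) @ Pinv         # hence A and B commute

# For diagonal matrices the identity reduces to the scalar e^(a+b) = e^a e^b.
lhs = expm(A + B)
rhs = P @ np.diag(np.exp(d1 + d2)) @ Pinv
print(np.allclose(lhs, rhs), np.allclose(lhs, expm(A) @ expm(B)))  # True True
```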

0

Mumble! Gripe! Once again I seem to have answered the pre-edited version of the question! Ah well, at least I can take consolation in the fact that I do not appear to be alone!

I won't attempt to prove the title assertion, because it is false. I will however give a simple counterexample:

Let

$N_1 = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix} \tag{1}$

and

$N_2 = \begin{bmatrix} 0 & 0 \\ -1 & 0 \end{bmatrix}; \tag{2}$

then we have

$N_1^2 = N_2^2 = 0, \tag{3}$

$N_1 N_2 = -\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}, \tag{4}$

and

$N_2 N_1 = -\begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}; \tag{5}$

note that

$N_1 N_2 \ne N_2 N_1. \tag{6}$

From (3) it follows that

$e^{N_1} = I + N_1 \tag{7}$

and

$e^{N_2} = I + N_2, \tag{8}$

so that

$e^{N_1} e^{N_2} = (I + N_1)(I + N_2) = I + N_1 + N_2 + N_1 N_2 = \begin{bmatrix} 0 & 1 \\ -1 & 1 \end{bmatrix}, \tag{9}$

as may be seen by a simple calculation using (1), (2), and (4). We also have the matrix $J$:

$J = N_1 + N_2 = \begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix}; \tag{10}$

we see that

$J^2 = -I. \tag{11}$

Examining $e^J$, we see that

$e^{(N_1 + N_2)} = e^J = \sum_0^\infty \dfrac{J^n}{n!} = I + J + \dfrac{1}{2}J^2 + . . . + \dfrac{1}{n!}J^n + . . . , \tag{12}$

and by virtue of (11) we see that, term-by-term, the power series for $e^J$ corresponds precisely to that for $e^i$, $i^2 = -1$ the ordinary complex number square root of $-1$. This implies that the classic formula $e^{i\theta} = \cos \theta + i \sin \theta$ applies to (12) so that, when $\theta = 1$, we obtain

$e^J = I \cos (1 \; \text{rad}) + J \sin (1 \; \text{rad}) = \begin{bmatrix} \cos (1 \; \text{rad}) & \sin (1 \; \text{rad}) \\ -\sin (1 \; \text{rad}) & \cos (1 \; \text{rad}) \end{bmatrix} \tag{13}$

wherein $1 \; \text{rad} = 1 \; \text{radian}$. We see from these computations that

$e^{(N_1 + N_2)} = e^J \ne e^{N_1}e^{N_2}. \tag{14}$
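
For what it's worth, the counterexample is easy to confirm numerically (a sketch assuming SciPy's `expm`):

```python
import numpy as np
from scipy.linalg import expm

N1 = np.array([[0., 1.], [0., 0.]])
N2 = np.array([[0., 0.], [-1., 0.]])

lhs = expm(N1) @ expm(N2)     # = (I + N1)(I + N2), since N1^2 = N2^2 = 0
rhs = expm(N1 + N2)           # = cos(1) I + sin(1) J, with J^2 = -I
print(lhs)                    # [[ 0, 1], [-1, 1]]
print(rhs)                    # [[cos 1, sin 1], [-sin 1, cos 1]]
print(np.allclose(lhs, rhs))  # False
```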

In the event that $AB = BA$, however, the title assertion binds, as may be seen by the following simple argument: let $X$ be the unique matrix solution to

$\dot X = (A + B)X, X(0) = I; \tag{15}$

it is easy to see that

$X(t) = e^{(A + B)t}; \tag{16}$

now setting

$Y(t) = e^{At}e^{Bt} \tag{17}$

we see that

$\dot Y = Ae^{At}e^{Bt} + e^{At}Be^{Bt} = Ae^{At}e^{Bt} + Be^{At}e^{Bt} = (A + B)e^{At}e^{Bt} = (A + B)Y(t), \tag{18}$

since $AB = BA$ allows us to write $e^{At}B = Be^{At}$, swapping $B$ with powers $A^k$ of $A$ on a term-by-term basis. Since $X(t)$ and $Y(t)$ satisfy the same ordinary differential equation with the same initial conditions, we have $X(t) = Y(t)$ for all $t$; taking $t = 1$ now establishes the title assertion that

$e^Ae^B = e^{A+B}. \tag{19}$
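
The ODE argument can also be checked numerically: integrate (15) and compare with (17) at $t = 1$ (a sketch assuming `scipy.integrate.solve_ivp`; the commuting pair below is arbitrary):

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.linalg import expm

A = np.array([[1., 2.], [0., 3.]])
B = 2 * A + np.eye(2)          # a polynomial in A, so AB = BA

def rhs(t, x):                 # vectorized form of Xdot = (A + B) X
    return ((A + B) @ x.reshape(2, 2)).ravel()

sol = solve_ivp(rhs, (0.0, 1.0), np.eye(2).ravel(), rtol=1e-10, atol=1e-12)
X1 = sol.y[:, -1].reshape(2, 2)                       # X(1)
print(np.allclose(X1, expm(A) @ expm(B), atol=1e-6))  # True: X(1) = Y(1)
```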

Hope this helps. Cheerio,

and as always,

Fiat Lux!!!

Robert Lewis
  • 71,180
  • That's fine, keep it. Nana's post may make all of your posts useful for this question. Plus, I'd like to learn more about matrix exponentiation; I've never heard of it before. – Adam Staples Mar 12 '14 at 07:39