
I am trying to prove that for positive-definite matrices $A$, $B$ and $C$,

$$\text{trace}(C\log(AB))=\text{trace}(C\log(A))+\text{trace}(C\log(B)).\quad(1)$$

While I don't know if it is a correct statement or not, I thought proving

$$\text{trace}(\log(AB))=\text{trace}(\log(A))+\text{trace}(\log(B))\quad(2)$$

can help me to prove $(1)$. But I couldn't solve $(2)$ as well. I saw this relationship on Wikipedia here with no proof, so I thought maybe the solution is very easy and I am missing something obvious.

Mah

  • Where's your attempt to answer the question? You have over 400 reputation points; you should know by now that we expect askers to include some context in their questions. Where is yours? – amWhy Oct 03 '17 at 16:52
  • I didn't know where to start, as the matrices do not commute. – Mah Oct 03 '17 at 16:54
  • @Mah this question is not well thought out: there is no background shown, no effort provided, etc. – Andres Mejia Oct 03 '17 at 16:54
  • @Mah that already says something. Does this mean you know a solution in the commutative case? What is it? Why does that make the problem more difficult/interesting? – Andres Mejia Oct 03 '17 at 16:55
  • @Mah usually on this site we will not do your homework for you, and asking questions with no background and no effort from the OP can be very disrespectful to the community. Given your high reputation this should already be clear. See this meta about asking questions here. To me, asking questions like that suggests you think we are here to provide answers and work for you. – R.W Oct 03 '17 at 17:02
  • I am sorry that I didn't provide more background. Actually I am trying to solve a harder problem, and I thought solving this one would help me solve the harder one. But I couldn't solve this one either. I saw this relationship on Wikipedia (https://en.wikipedia.org/wiki/Logarithm_of_a_matrix#Properties) with no proof, so I thought maybe the solution is very easy and I am missing something obvious. – Mah Oct 03 '17 at 17:07
  • Mah Why don't you include your last comment, and perhaps the original (harder) problem, by editing it into your question. – amWhy Oct 03 '17 at 17:15

2 Answers


If you prove the following,

Hint: the trace is linear on square matrices: for all square matrices $A$ and $B$ and any scalar $c$, $\text{tr}(A + B) = \text{tr}(A) + \text{tr}(B)$ and $\text{tr}(cA) = c\,\text{tr}(A)$,

then I think you can proceed from there. You should look up how the logarithm of a matrix is defined and use your matrix knowledge. See here and here, and try to prove the properties used; this should be a straightforward and instructive exercise.
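As a quick numerical sanity check of the hint (not part of the original argument), one can verify the linearity of the trace on random matrices with NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
c = 2.5

# Linearity of the trace: tr(A + B) = tr(A) + tr(B) and tr(cA) = c tr(A)
assert np.isclose(np.trace(A + B), np.trace(A) + np.trace(B))
assert np.isclose(np.trace(c * A), c * np.trace(A))
```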

EDIT: The more interesting case, raised in the comments, is when $A,B$ are square matrices that do not commute, so $AB \neq BA$; in that case $\log(AB) = \log(A)+\log(B)$ need not hold. But there is the following important proposition.

Proposition: Let $A$ be a square matrix with real (or complex) entries. Then it is true that $$\det(e^A) = e^{\text{tr}(A)}$$

Note that $e^A$ is always well-defined. Taking the logarithm of both sides (with the real logarithm, valid when $\text{tr}(A)$ is real, so no branch issues arise), we get

$$\text{tr}(A) = \log(\det(e^A)) \tag{1}$$
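The identity $\det(e^A) = e^{\text{tr}(A)}$, and hence $(1)$, can be checked numerically on a random real matrix using SciPy's matrix exponential (a sanity check, not a proof):

```python
import numpy as np
from scipy.linalg import expm  # matrix exponential

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))

# det(e^A) = e^{tr(A)}
assert np.isclose(np.linalg.det(expm(A)), np.exp(np.trace(A)))
# hence tr(A) = log(det(e^A)); for real A, det(e^A) > 0, so the real log applies
assert np.isclose(np.trace(A), np.log(np.linalg.det(expm(A))))
```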

Now suppose that $\log(AB)$, $\log(A)$ and $\log(B)$ are all well-defined; a sufficient condition for the power-series definition is $\Vert AB - I \Vert < 1$, $\Vert A - I \Vert < 1$ and $\Vert B - I \Vert < 1$, where $\Vert\cdot\Vert$ denotes the operator norm on the space of square matrices with real (or complex) entries:

$$\Vert A \Vert := \sup_{u \neq 0, u \in \mathbb{C^n}}\frac{\Vert Au \Vert_{\mathbb{C}}}{\Vert u \Vert_{\mathbb{C}}}$$

$$\Vert u \Vert_ {\mathbb{C}} := \sqrt{\vert u_1\vert ^2 + \dots +\vert u_n \vert^2}$$

Under these assumptions, the Taylor series defining the matrix logarithm converges.

We also have a lemma:

Lemma: Let $A$ be a square matrix with real (or complex) entries such that $\Vert e^A - I \Vert < 1$. Then $$\log(e^A)= A$$
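The lemma can be illustrated numerically: for a small random matrix $A$ the hypothesis $\Vert e^A - I \Vert < 1$ holds, and SciPy's `logm` recovers $A$ from $e^A$ (an illustration under that smallness assumption, not a proof):

```python
import numpy as np
from scipy.linalg import expm, logm

rng = np.random.default_rng(2)
A = 0.1 * rng.standard_normal((3, 3))  # small, so that ||e^A - I|| < 1

# check the hypothesis in the spectral (operator) norm
assert np.linalg.norm(expm(A) - np.eye(3), 2) < 1
# the lemma: log(e^A) = A
assert np.allclose(logm(expm(A)), A)
```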

With that we can just set $A \to \log(AB)$, and then

$$\text{tr}(\log(AB)) = \log(\det(AB)) = \log(\det(A)\det(B)) = \log(\det(A)) + \log(\det(B))$$
$$= \log(\det(e^{\log(A)})) + \log(\det(e^{\log(B)})) \stackrel{(1)}{=} \text{tr}(\log(A)) + \text{tr}(\log(B))$$
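For symmetric positive-definite $A$ and $B$, equality $(2)$ can be confirmed numerically (a sanity check with SciPy; the `spd` helper below is mine, not from the answer):

```python
import numpy as np
from scipy.linalg import logm

rng = np.random.default_rng(3)

def spd(n):
    """A random symmetric positive-definite n x n matrix."""
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A, B = spd(4), spd(4)
# AB has positive eigenvalues (it is similar to A^{1/2} B A^{1/2}),
# so its principal logarithm is well-defined
lhs = np.trace(logm(A @ B))
rhs = np.trace(logm(A)) + np.trace(logm(B))
assert np.isclose(np.real(lhs), np.real(rhs))
```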

Reference in Portuguese

R.W

Let $U=\mathbb{C}\setminus \{z\in\mathbb{R};z\leq 0\}$ and let $Z_n$ be the subset of $M_n(\mathbb{C})$ consisting of the matrices whose eigenvalues all lie in $U$ (that is, with no eigenvalues in $\mathbb{R}_{\leq 0}$). We consider the principal log, which is uniquely defined by

for $n=1$: $re^{i\theta}\in U\rightarrow\log(re^{i\theta})=\log(r)+i(\theta+2k\pi)$, where $k$ is chosen so that $\theta+2k\pi\in (-\pi,\pi)$.

For $n>1$, if $A\in Z_n$ is diagonalizable ($A=Pdiag(\lambda_i)P^{-1}$), then $\log(A)=Pdiag(\log(\lambda_i))P^{-1}$.
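For a symmetric positive-definite matrix (which is diagonalizable with eigenvalues in $U$), this spectral recipe can be compared against SciPy's general-purpose `logm` (a numerical illustration, not part of the answer):

```python
import numpy as np
from scipy.linalg import logm

rng = np.random.default_rng(4)
M = rng.standard_normal((3, 3))
A = M @ M.T + 3 * np.eye(3)           # symmetric positive definite, spectrum in U

w, P = np.linalg.eigh(A)              # A = P diag(w) P^T with w > 0
logA = P @ np.diag(np.log(w)) @ P.T   # principal log via the spectrum

assert np.allclose(logA, logm(A))
```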

Remark 1. "$\log(A)$ is defined" does not imply that $||A-I||<1$.

Remark 2. If $A\in Z_n$, then $tr(A)$ is not necessarily $\log(\det(e^A))$. Indeed, let $n=2$ and $A=(2i\pi/3)I_2$; then $tr(A)=4i\pi/3$ while $\log(\det(e^A))=\log(e^{4i\pi/3})=-2i\pi/3$. Yet the result is true if $A$ has only eigenvalues $>0$.

Remark 3. Even if $AB=BA$ and $A,B\in Z_n$, $\log(AB)$ is not necessarily equal to $\log(A)+\log(B)$. For example, if $n=1$ and $A=e^{2i\pi/3}$, then $\log(A^2)=-2i\pi/3$ while $\log(A)+\log(A)=4i\pi/3$. Yet, if $A,B\in S^+$ (symmetric positive definite) and $AB=BA$, then the result is true.
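Both counterexamples are easy to reproduce numerically, since NumPy's `log` uses the principal branch (a check of the remarks above, not new mathematics):

```python
import numpy as np
from scipy.linalg import expm

# Remark 2: A = (2iπ/3) I_2
A = (2j * np.pi / 3) * np.eye(2)
tr = np.trace(A)                          # 4iπ/3
val = np.log(np.linalg.det(expm(A)))      # principal log of e^{4iπ/3} is -2iπ/3
assert np.isclose(tr, 4j * np.pi / 3)
assert np.isclose(val, -2j * np.pi / 3)
assert not np.isclose(tr, val)            # tr(A) != log(det(e^A)) here

# Remark 3 (n = 1): a = e^{2iπ/3}; log(a^2) != log(a) + log(a)
a = np.exp(2j * np.pi / 3)
assert np.isclose(np.log(a), 2j * np.pi / 3)
assert np.isclose(np.log(a * a), -2j * np.pi / 3)
```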

Proof of your equality (2). You can formally use the last two lines of Rafael Wagner's post, but beware: it works because $A,B\in S^+$ implies that $AB$ has only eigenvalues $>0$ (that is the key point!).

It remains for you to prove (1), or at least to take a look at it; after all, you should not be too tired yet.

EDIT 1. @Mah, really, you are not serious. I thought you had done a few tests before conjecturing equality (1). I chose positive definite matrices $A, B, C$ at random and, on the first try, found that (1) fails!! I will not answer your questions any more.
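The random test described above is easy to reproduce: with SciPy, a single draw of positive-definite $A$, $B$, $C$ already violates (1) (the `spd` helper is mine, used for illustration):

```python
import numpy as np
from scipy.linalg import logm

rng = np.random.default_rng(5)

def spd(n):
    """A random symmetric positive-definite n x n matrix."""
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A, B, C = spd(3), spd(3), spd(3)
lhs = np.real(np.trace(C @ logm(A @ B)))
rhs = np.real(np.trace(C @ logm(A)) + np.trace(C @ logm(B)))
# generically lhs != rhs: equality (1) fails for random positive-definite matrices
assert abs(lhs - rhs) > 1e-6
```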

EDIT 2. We assume that the considered matrices are real.

Proposition. Let $A,B\in S^+$ and $U=\log(AB)-\log(A)-\log(B)$. Then equality (1) (holding for every $C\in S^+$) $\Leftrightarrow$

for every $C\in S$ (the symmetric matrices) $tr(CU)=0$

$\Leftrightarrow \log(AB)+\log(BA)=2(\log(A)+\log(B))$.

Proof. The fact that $S^+$ is open in $S$ gives the first equivalence. For the second: $tr(CU)=0$ for every $C\in S$ means that $U$ is orthogonal to $S$ for the standard scalar product on $M_n$, which is equivalent to $U$ being a skew-symmetric matrix, and we are done.
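The proposition can be probed numerically: since (2) holds, $U$ is always traceless, but its symmetric part is generically nonzero, consistent with the failure of (1). Note that $(\log(AB))^T=\log((AB)^T)=\log(BA)$, so $U+U^T=\log(AB)+\log(BA)-2(\log(A)+\log(B))$. A sketch (the `spd` helper is mine):

```python
import numpy as np
from scipy.linalg import logm

rng = np.random.default_rng(6)

def spd(n):
    """A random symmetric positive-definite n x n matrix."""
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A, B = spd(3), spd(3)
U = np.real(logm(A @ B)) - np.real(logm(A)) - np.real(logm(B))

# tr(U) = 0: this is equality (2), which does hold
assert abs(np.trace(U)) < 1e-8
# but U is generically NOT skew-symmetric, matching the failure of (1);
# U + U^T measures log(AB) + log(BA) - 2(log(A) + log(B))
assert np.linalg.norm(U + U.T) > 1e-6
```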