
Let $A,B$ be two symmetric positive-definite matrices, and let

$$A=U\Lambda U^T,\qquad B=V\Gamma V^T$$

be their eigenvalue decompositions, where $U,V$ are orthogonal matrices and $\Lambda,\Gamma$ are diagonal, containing the (positive) eigenvalues of $A,B$, listed in the same (say, decreasing) order.

By generating random matrices $A,B$ with these properties, I have numerical evidence that the following inequality seems to hold in general:

$$\mathrm{Tr}(AB) \le \mathrm{Tr}(\Lambda\Gamma)$$

Is it true? If so how can I prove it? (If not, please provide a counter-example).
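
For concreteness, here is a minimal sketch of the kind of numerical check described above (assuming NumPy; the helper `random_spd`, the trial count, and the tolerance are illustrative choices, and both eigenvalue lists come back from `np.linalg.eigvalsh` in ascending, i.e. matching, order):

```python
# Sketch of a numerical check: draw random symmetric positive-definite A, B
# and compare Tr(AB) with Tr(Lambda Gamma), pairing eigenvalues in the same
# (ascending) order.
import numpy as np

rng = np.random.default_rng(0)

def random_spd(n):
    """Random symmetric positive-definite matrix via M M^T + eps*I."""
    m = rng.standard_normal((n, n))
    return m @ m.T + 1e-3 * np.eye(n)

for _ in range(1000):
    n = rng.integers(2, 8)
    A, B = random_spd(n), random_spd(n)
    lam = np.linalg.eigvalsh(A)   # eigenvalues of A, ascending
    gam = np.linalg.eigvalsh(B)   # eigenvalues of B, ascending
    lhs = np.trace(A @ B)
    rhs = np.sum(lam * gam)       # Tr(Lambda Gamma) with matched ordering
    assert lhs <= rhs + 1e-9, (lhs, rhs)

print("No counterexample found.")
```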

a06e
  • It is true and is a special case of von Neumann's trace inequality. See this question for a proof, for instance. – user1551 Feb 14 '20 at 14:38
  • @user1551 Thank you! If you post that as an answer I'll accept it. – a06e Feb 14 '20 at 14:39
  • https://en.wikipedia.org/wiki/Trace_inequality#Von_Neumann's_trace_inequality_and_related_results – gangrene Feb 14 '20 at 14:40
  • @becko Thanks, but I think I'll close this question as an abstract duplicate of the linked one. – user1551 Feb 14 '20 at 14:41
  • Ah sorry, I missed the link to the other question. Yes ok, I'll close it myself. – a06e Feb 14 '20 at 14:47
  • Also a duplicate of https://math.stackexchange.com/questions/2558297/maximizing-the-trace-in-an-elegant-way, which is a more direct and easier result (also: it plus Cauchy–Schwarz implies the von Neumann trace inequality). – user8675309 Feb 14 '20 at 19:36
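
For reference, von Neumann's trace inequality mentioned in the comments states that for $n\times n$ matrices $A,B$ with singular values $\sigma_1(A)\ge\dots\ge\sigma_n(A)$ and $\sigma_1(B)\ge\dots\ge\sigma_n(B)$,

$$|\mathrm{Tr}(AB)| \le \sum_{i=1}^n \sigma_i(A)\,\sigma_i(B).$$

For symmetric positive-definite matrices the singular values coincide with the eigenvalues, so the right-hand side equals $\mathrm{Tr}(\Lambda\Gamma)$ when both diagonals are sorted in the same order, which yields the inequality asked about.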

0 Answers