Let $A,B$ be two symmetric positive-definite matrices, and let
$$A=U\Lambda U^T,\qquad B=V\Gamma V^T$$
be their eigenvalue decompositions, where $U,V$ are orthogonal matrices and $\Lambda,\Gamma$ are diagonal, containing the (positive) eigenvalues of $A,B$, both sorted in the same order (say, nondecreasing).
By generating random matrices $A,B$ with these properties, I have gathered numerical evidence suggesting that the following inequality holds in general:
$$\mathrm{Tr}(AB) \le \mathrm{Tr}(\Lambda\Gamma)$$
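For reference, here is a minimal sketch of the kind of numerical check described above (assuming NumPy; `np.linalg.eigvalsh` returns the eigenvalues of each matrix in ascending order, so both spectra are sorted the same way, and $\mathrm{Tr}(\Lambda\Gamma)$ reduces to a dot product):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_spd(n):
    """A random symmetric positive-definite matrix."""
    M = rng.standard_normal((n, n))
    # M M^T is symmetric positive semi-definite; the small shift keeps
    # the eigenvalues strictly positive.
    return M @ M.T + 1e-6 * np.eye(n)

for _ in range(10_000):
    n = int(rng.integers(2, 8))
    A, B = random_spd(n), random_spd(n)
    # eigvalsh sorts eigenvalues in ascending order for both matrices,
    # so lam and gam use the same ordering.
    lam = np.linalg.eigvalsh(A)
    gam = np.linalg.eigvalsh(B)
    # Tr(Lambda Gamma) is the dot product of the two spectra.
    assert np.trace(A @ B) <= lam @ gam + 1e-8

print("no counterexample found")
```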
Is it true? If so, how can I prove it? (If not, please provide a counterexample.)