Recall that for each vector $\omega\in\mathbb R^3$, there is an anti-symmetric matrix $[\omega]_\times\in\mathbb R^{3\times 3}$ (and vice versa) such that $$[\omega]_\times h= \omega\times h$$ (matrix product on the left, cross product of vectors on the right). Let $\mathcal D$ be a symmetric, trace-free matrix (i.e. $\operatorname{tr}\mathcal D=\mathcal D_{11}+\mathcal D_{22}+\mathcal D_{33} = 0$). Then it is easy to check that $$ [\omega]_\times \mathcal D + \mathcal D[\omega]_\times$$ is also anti-symmetric.
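(For the record, this check is a one-line transpose computation: since $\mathcal D^T = \mathcal D$ and $[\omega]_\times^T = -[\omega]_\times$, $$\big([\omega]_\times \mathcal D + \mathcal D[\omega]_\times\big)^T = \mathcal D^T[\omega]_\times^T + [\omega]_\times^T\mathcal D^T = -\big([\omega]_\times \mathcal D + \mathcal D[\omega]_\times\big).$$ Only the symmetry of $\mathcal D$ is used here; the trace-free condition is what makes the identity below come out.)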
My question: Is there a way (different from the one in the not-dupe linked above) to show that in fact $$ [\omega]_\times \mathcal D + \mathcal D[\omega]_\times = [-\mathcal D \omega]_\times?$$ Or, equivalently, that $ \omega\times(\mathcal Dh) + \mathcal D(\omega\times h) = (-\mathcal D\omega)\times h$ for all vectors $h$? I am hoping for a proof that uses identities involving trace-free/symmetric/anti-symmetric matrices, without "directly computing the components" as in the link above.
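(As a quick numerical sanity check, not a proof, here is a minimal numpy sketch that verifies the identity for a random $\omega$ and a random symmetric trace-free $\mathcal D$; the helper `hat` is just the $[\cdot]_\times$ map written out.)

```python
import numpy as np

def hat(w):
    """Anti-symmetric matrix [w]_x with hat(w) @ h == np.cross(w, h)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

rng = np.random.default_rng(0)
w = rng.standard_normal(3)

# Build a random symmetric, trace-free D
A = rng.standard_normal((3, 3))
D = (A + A.T) / 2
D -= (np.trace(D) / 3) * np.eye(3)

lhs = hat(w) @ D + D @ hat(w)
rhs = hat(-D @ w)
print(np.allclose(lhs, rhs))  # True
```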
The calculation in the link above is straightforward, and since a $3\times 3$ matrix has only 9 components, you don't even need Einstein summation notation. But I think it would be nice to see a coordinate-free argument.