
I found a useful formula for gradients in the *Mathematics for Machine Learning* book (Deisenroth et al.). Equation (5.107) states:

$ \frac{\partial x^T B x}{\partial x} = x^T(B+B^T), \ \ x \in \mathbb{R}^{a}, B \in \mathbb{R}^{a \times a} $

I don't understand how this result is derived.
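Before asking, I at least convinced myself numerically that the identity holds. Here is my quick sanity check (comparing a central finite-difference gradient against the claimed closed form; the dimension $a=4$ and random seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
a = 4  # arbitrary dimension for the check
B = rng.standard_normal((a, a))
x = rng.standard_normal(a)

# f(x) = x^T B x
f = lambda x: x @ B @ x

# Central finite-difference approximation of the gradient.
# Since f is quadratic, the truncation error vanishes and the
# difference is exact up to floating-point rounding.
eps = 1e-6
grad_fd = np.array([
    (f(x + eps * e) - f(x - eps * e)) / (2 * eps)
    for e in np.eye(a)
])

# Claimed closed form from equation (5.107)
grad_formula = x @ (B + B.T)

print(np.allclose(grad_fd, grad_formula, atol=1e-5))  # True
```

So the formula checks out numerically; my question is about the derivation itself.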

Also, if the matrix in the quadratic form is an inverse $B^{-1}$ (with $B$ not depending on $x$), does the same rule give

$ \frac{\partial x^T B^{-1} x}{\partial x} = x^T(B^{-1}+(B^{-1})^T) $

Is this true?
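My intuition is that it should hold, since $B^{-1}$ is just another constant matrix once $B$ is fixed, so the same rule should apply with $B$ replaced by $B^{-1}$. A numerical check (same finite-difference setup as above; I shift $B$ by a multiple of the identity only to make sure it is invertible) seems to agree:

```python
import numpy as np

rng = np.random.default_rng(1)
a = 4
# Diagonal shift keeps B comfortably invertible for this check
B = rng.standard_normal((a, a)) + a * np.eye(a)
Binv = np.linalg.inv(B)
x = rng.standard_normal(a)

# g(x) = x^T B^{-1} x, with B (hence B^{-1}) constant in x
g = lambda x: x @ Binv @ x

eps = 1e-6
grad_fd = np.array([
    (g(x + eps * e) - g(x - eps * e)) / (2 * eps)
    for e in np.eye(a)
])

# Conjectured closed form: x^T (B^{-1} + (B^{-1})^T)
grad_formula = x @ (Binv + Binv.T)

print(np.allclose(grad_fd, grad_formula, atol=1e-5))  # True
```

But I would like a confirmation that this reasoning is correct, not just a numerical coincidence.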
