2

Is there an obvious trick I am missing for solving the following integral:

$$ \int_x P(y|x) W(x) (-x^TMx+2x^Tm -c)dx$$

Distributions are Gaussians and $M$ is symmetric.

I know how to do the expectation for a univariate Gaussian but I'm not sure of this multivariate case since $$ P(y|x) W(x) = V(y,x) $$

$W$ is the PDF of $x$, $P$ is a conditional PDF of $y$ given $x$ and $V$ is the joint PDF of $x$ and $y$.

  • I'm sure this has been answered before on this site, but I'm on my mobile phone now so it's too burdensome to search – kjetil b halvorsen Oct 04 '15 at 11:11
  • Is $P$ the density? What are $W$ and $V$? Please define! – kjetil b halvorsen Oct 04 '15 at 15:41
  • $W$ is the PDF of $x$, $P$ is a conditional PDF of $y$ given $x$ and $V$ is the joint distribution density. All Gaussians. –  Oct 04 '15 at 15:44
  • Did you look at http://math.stackexchange.com/questions/442472/sum-of-squares-of-dependent-gaussian-random-variables/442916#442916 ? And what is the point of using $y$ in the expression above, when the quadratic form you want the expectation of does not depend on $y$? It should be irrelevant! – kjetil b halvorsen Oct 04 '15 at 15:52
  • But the function $P(y|x)$ depends on $x$. We shouldn't be able to discard it and apply the expectation under $W(x)$ only. I mean, just from a calculus standpoint, $P$ is inside the integral and it depends on $x$. –  Oct 04 '15 at 16:04
  • OK. Why does it arise in that form? In which context do you encounter this integral? – kjetil b halvorsen Oct 04 '15 at 16:10

2 Answers

1

So it seems there is a very easy trick to solve this. We only need Bayes' rule to get the marginal distribution $P(y)$: $$ P(y)W(x|y) = P(y|x)W(x)$$

$P(y)$ can then be taken outside the integral, and we are left with an expectation of the quadratic form under the conditional Gaussian $W(x|y)$, which can be calculated, for example, as stated in the Matrix Cookbook.
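A quick numerical sanity check of this factorization in the scalar case (all distribution parameters below are arbitrary illustrative choices): the left side integrates $P(y|x)W(x)$ times the quadratic directly; the right side takes $P(y)$ outside and uses the conditional Gaussian $W(x|y)$.

```python
import numpy as np

def npdf(z, mean, var):
    """Gaussian density N(mean, var) evaluated at z."""
    return np.exp(-(z - mean) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

mu_x, vx = 1.0, 4.0        # W(x) = N(mu_x, vx)
a, b, v = 0.7, -0.3, 0.25  # P(y|x) = N(a*x + b, v)
M, m, c = 1.3, 0.4, 2.0    # integrand: -M x^2 + 2 m x - c
y = 0.8                    # a fixed value of y

# Left side: integrate P(y|x) W(x) (-M x^2 + 2 m x - c) over x on a fine grid
xs = np.linspace(-40.0, 40.0, 400001)
dx = xs[1] - xs[0]
integrand = npdf(y, a * xs + b, v) * npdf(xs, mu_x, vx) * (-M * xs**2 + 2 * m * xs - c)
lhs = np.sum(integrand) * dx

# Right side: P(y) times the expectation under the conditional W(x|y)
p_y = npdf(y, a * mu_x + b, a**2 * vx + v)       # marginal P(y)
v_cond = 1.0 / (1.0 / vx + a**2 / v)             # conditional variance
mu_cond = v_cond * (mu_x / vx + a * (y - b) / v) # conditional mean
# For a Gaussian, E[x^2] = variance + mean^2
rhs = p_y * (-M * (v_cond + mu_cond**2) + 2 * m * mu_cond - c)

print(lhs, rhs)  # the two sides agree
```

The integral stays a function of $y$, as it should, since only $x$ is integrated out.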

0

So, you have a multinormal vector $(y,x)$ (where both components $y$ and $x$ can be vectors), and you want the expectation of the quadratic form $-x^T M x + 2x^T m - c$, with $M$ a symmetric constant matrix, $m$ a constant vector and $c$ a constant scalar. Since the function (random variable) you want the expectation of does not depend on $y$, the distribution of $y$ is irrelevant for the expectation. For the quadratic term, you can just apply my answer to sum of squares of dependent gaussian random variables, which gives $\text{E}[x^T M x] = \operatorname{tr}(M\Sigma) + \mu^T M \mu$ for $x \sim N(\mu, \Sigma)$. Then $\text{E}[x^T m]$ is simply $\text{E}(x)^T m$.
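To make this concrete, here is a small numerical check (all numbers arbitrary) comparing the closed form $\text{E}[-x^T M x + 2x^T m - c] = -(\operatorname{tr}(M\Sigma) + \mu^T M \mu) + 2\mu^T m - c$ for $x \sim N(\mu, \Sigma)$ against a Monte Carlo estimate:

```python
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0])                 # mean of x
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])             # covariance of x
M = np.array([[1.5, 0.2],
              [0.2, 0.8]])                 # symmetric
m = np.array([0.3, -0.7])
c = 1.2

# Closed form: E[x^T M x] = tr(M Sigma) + mu^T M mu, E[x^T m] = mu^T m
closed = -(np.trace(M @ Sigma) + mu @ M @ mu) + 2 * mu @ m - c

# Monte Carlo estimate of the same expectation
xs = rng.multivariate_normal(mu, Sigma, size=1_000_000)
vals = -np.einsum('ni,ij,nj->n', xs, M, xs) + 2 * xs @ m - c
mc = vals.mean()

print(closed, mc)  # should agree to a couple of decimal places
```

Note that $y$ never appears: the expectation of the quadratic form under the marginal of $x$ needs only $\mu$ and $\Sigma$.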