
I came across an interesting claim that:

$\mathcal N (\mu_f, \sigma_f^2) \; \mathcal N (\mu_g, \sigma_g^2) = \mathcal N \left(\frac{\mu_f\sigma_g^2+\mu_g\sigma_f^2}{\sigma_f^2+\sigma_g^2}, \frac {\sigma_f^2\sigma_g^2}{\sigma_f^2+\sigma_g^2}\right)$

In trying to understand it I consulted Bromiley:

http://www.tina-vision.net/docs/memos/2003-003.pdf

Bromiley concludes that:

if

$f(x) = \frac{1}{\sqrt{2\pi\sigma_f^2}} e^{-\frac{(x-\mu_f)^2}{2 \sigma_f^2}}$ and $g(x) = \frac{1}{\sqrt{2\pi\sigma_g^2}} e^{-\frac{(x-\mu_g)^2}{2 \sigma_g^2}}$

then:

$f(x)g(x) = S_{fg} \frac{1}{\sqrt{2\pi\sigma_{fg}^2}} e^{- \frac { (x - \mu_{fg})^2 } {2 \sigma_{fg}^2 } }$

where:

$\mu_{fg} = \frac { \sigma_g^2\mu_f + \sigma_f^2 \mu_g } {\sigma_f^2 + \sigma_g^2}$ and $\sigma_{fg}^2 = \frac {\sigma_f^2 \sigma_g^2} {\sigma_f^2 + \sigma_g^2}$

$S_{fg} = \frac {1} {\sqrt{2\pi(\sigma_f^2+\sigma_g^2)}} e^{ -\frac{(\mu_f-\mu_g)^2}{2(\sigma_f^2+\sigma_g^2)} }$

Note that if $\mu_f$, $\mu_g$, $\sigma_f$ and $\sigma_g$ are known constants then $S_{fg}$ is a known constant too.
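As a sanity check (my own sketch, not part of Bromiley's memo), here is a minimal numerical verification using SciPy, with made-up example values for the means and standard deviations, confirming pointwise that $f(x)g(x)$ matches $S_{fg}$ times the Gaussian density with parameters $\mu_{fg}$ and $\sigma_{fg}^2$:

```python
import numpy as np
from scipy.stats import norm

# Made-up example values for the two Gaussians (any constants will do).
mu_f, sigma_f = 1.0, 0.8
mu_g, sigma_g = 2.5, 1.3

# Bromiley's parameters for the (scaled) product Gaussian.
var_fg = sigma_f**2 * sigma_g**2 / (sigma_f**2 + sigma_g**2)
mu_fg = (sigma_g**2 * mu_f + sigma_f**2 * mu_g) / (sigma_f**2 + sigma_g**2)

# S_fg is just a normal density in (mu_f - mu_g) with variance sigma_f^2 + sigma_g^2.
S_fg = norm.pdf(mu_f, loc=mu_g, scale=np.sqrt(sigma_f**2 + sigma_g**2))

# Pointwise check on a grid: f(x) * g(x) == S_fg * N(mu_fg, var_fg).
x = np.linspace(-5.0, 10.0, 1001)
lhs = norm.pdf(x, mu_f, sigma_f) * norm.pdf(x, mu_g, sigma_g)
rhs = S_fg * norm.pdf(x, mu_fg, np.sqrt(var_fg))
print(np.allclose(lhs, rhs))  # True
```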

To wit, if I cast Bromiley's result in the format of the claim I'm exploring:

$\mathcal N (\mu_f, \sigma_f^2) \; \mathcal N (\mu_g, \sigma_g^2) = S_{fg} \; \mathcal N \left(\frac{\mu_f\sigma_g^2+\mu_g\sigma_f^2}{\sigma_f^2+\sigma_g^2}, \frac {\sigma_f^2\sigma_g^2}{\sigma_f^2+\sigma_g^2}\right)$

In short there is a constant scaling factor $S_{fg}$. In fact Bromiley describes the product as a scaled Gaussian.

Given that $f(x)$ and $g(x)$ are both functions of $x$, the original claim, which reads (as a reminder):

$\mathcal N (\mu_f, \sigma_f^2) \; \mathcal N (\mu_g, \sigma_g^2) = \mathcal N \left(\frac{\mu_f\sigma_g^2+\mu_g\sigma_f^2}{\sigma_f^2+\sigma_g^2}, \frac {\sigma_f^2\sigma_g^2}{\sigma_f^2+\sigma_g^2}\right)$

implies (since the right-hand side is a normalised Gaussian PDF, which integrates to one) that:

$\int_{-\infty}^{\infty} f(x) g(x) \;dx = 1$

But Bromiley's result suggests this implication is false. Since the product is $S_{fg}$ times a normalised Gaussian, I presume it integrates to $S_{fg}$ instead, or:

$\int_{-\infty}^{\infty} f(x) g(x) \;dx = S_{fg}$
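A quick numerical check of that presumption, again with made-up example values (my own sketch, not from Bromiley):

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Made-up example values, as in the earlier sketch.
mu_f, sigma_f = 1.0, 0.8
mu_g, sigma_g = 2.5, 1.3

# S_fg as defined by Bromiley.
S_fg = norm.pdf(mu_f, loc=mu_g, scale=np.sqrt(sigma_f**2 + sigma_g**2))

# Integrate the product of the two PDFs over the real line.
integral, _ = quad(lambda x: norm.pdf(x, mu_f, sigma_f) * norm.pdf(x, mu_g, sigma_g),
                   -np.inf, np.inf)
print(integral, S_fg)  # both approximately 0.161 here: the product integrates to S_fg, not 1
```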

My tentative conclusion is that the claim I am exploring is false, and my questions would be:

  1. Is my tentative conclusion true? (is the explored claim false?)
  2. Am I right in concluding the integral would be $S_{fg}$?

Those are the areas I'm a little shaky on at present and where I'd appreciate some review.

Bernd Wechner
  • Have a read here? https://math.stackexchange.com/questions/1112866/product-of-two-gaussian-pdfs-is-a-gaussain-pdf-but-product-of-two-gaussan-varia – Tony Hellmuth May 30 '18 at 01:14
  • There are several notions of "product" floating around in your post and the link given by Tony. 1. The product of two Gaussian random variables $X$ and $Y$ need not be Gaussian. But this is not relevant to your post and has nothing to do with products of PDFs. 2. The bivariate function $h(x,y) := f(x) g(y)$ is the PDF of a 2-dim Gaussian vector $(X,Y)$ whose components are independent. The integral of $h$ is thus $1$. But this is not the function in your post. Cont'd ... – angryavian May 30 '18 at 01:22
  • $f(x) g(x)$ is not a Gaussian PDF due to the scaling factor $S_{fg}$, but it is proportional to one, as you noted. I do not know the context in which your original "interesting claim" lies, but perhaps the equals sign there is not to be taken literally. – angryavian May 30 '18 at 01:23
  • Good comments, thanks. We're still short of an answer, but it seems that others have noticed the same issue. At least this site indulges in the confusion: https://ccrma.stanford.edu/~jos/sasp/Product_Two_Gaussian_PDFs.html and Lyndon White encountered a tutorial making the same claim as the one that spurred my curiosity here. As an aside I explored this with dependent variables here: https://math.stackexchange.com/questions/2793890/is-the-joint-pdf-of-two-normally-distributed-variables-a-pdf but am still a little perplexed at how widespread this confusion seems to be. – Bernd Wechner May 30 '18 at 02:12
  • The tutorial that Lyndon White refers to has moved to: https://github.com/rlabbe/Kalman-and-Bayesian-Filters-in-Python/blob/master/04-One-Dimensional-Kalman-Filters.ipynb but it still makes the same claim I was exploring here, which I tentatively conclude is spurious, though angryavian suggests perhaps I'm reading the equals signs too literally. – Bernd Wechner May 30 '18 at 02:27
  • @BerndWechner In the "$\text{posterior} \propto \text{likelihood} \cdot \text{prior}$" form of Bayes rule, note that the $\propto$ accounts for the scaling factor, so you are fine. I really dislike the loose language "product of two Gaussians is another Gaussian" etc. because it leads to precisely this kind of confusion. – angryavian May 30 '18 at 05:44