
Let's define the following probability densities: $$ \begin{align} p(θ) &= N(θ; 0,1) \\ q(θ) &= N(θ; μ,σ^2)\\ p(y|θ,x) &= N(y; θx,σ_n^2) \end{align} $$ where $N(x; m,v)$ denotes the density of a univariate Gaussian random variable $x$ with mean $m$ and variance $v$.

Calculate the variational free energy $$ F(q(θ)) = \int dθ\, q(θ) \left[\log q(θ) - \log p(θ) - \log p(y|θ,x)\right] $$

According to Difference between Variance and 2nd moment, I cannot simplify $\int dθ\, q(θ)(θ - μ)^2$ to $σ^2$, because $μ$ is not $0$?

According to https://gregorygundersen.com/blog/2020/09/01/gaussian-entropy/, this is also the assumption when calculating the entropy: $$ H(θ) = - ∫ q(θ) \log q(θ) dθ = 0.5 \log(2πσ^2) + 0.5 $$ meaning I cannot do this either?
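As a numerical sanity check of that entropy formula, here is a small sketch; the values of $μ$ and $σ$ are arbitrary illustrative choices, not fixed by the question:

```python
import numpy as np

# Numerical sanity check of H = 0.5*log(2*pi*sigma^2) + 0.5 for q(θ) = N(θ; μ, σ²).
# μ and σ are arbitrary illustrative values (not from the question).
mu, sigma = 1.3, 0.7
theta = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 200_001)
dtheta = theta[1] - theta[0]
q = np.exp(-(theta - mu) ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)
H_numeric = -np.sum(q * np.log(q)) * dtheta           # -∫ q(θ) log q(θ) dθ
H_closed = 0.5 * np.log(2 * np.pi * sigma ** 2) + 0.5
```

The two values agree to many digits, and the closed form involves only $σ$, not $μ$.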

If this is the case, how do I calculate the original integral?

Can I simplify $\int dθ\, q(θ)θ^2$?
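A quick Monte Carlo sketch of $\int dθ\, q(θ)\,θ^2$ (again with arbitrary illustrative $μ$, $σ$) can be compared against the identity $\mathbb{E}[θ^2] = \mathrm{Var}(θ) + (\mathbb{E}[θ])^2 = σ^2 + μ^2$:

```python
import numpy as np

# Monte Carlo sketch of ∫ q(θ) θ² dθ for q(θ) = N(θ; μ, σ²).
# μ and σ are arbitrary illustrative values.
mu, sigma = 1.3, 0.7
rng = np.random.default_rng(0)
samples = rng.normal(mu, sigma, 2_000_000)
second_raw_moment = np.mean(samples ** 2)   # estimates E[θ²]
# Compare against E[θ²] = Var(θ) + (E[θ])² = σ² + μ²
```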

Edit 1

About @Ricky's comment "how can a random variable follow two distributions": I think $θ$ is simply meant as the input variable of each density function. The two underlying random variables could be named $P$ and $Q$, with $P\sim N(0,1)$ and $Q\sim N(μ,σ^2)$.

Edit 2

About @Joako's comment, I don't think $q(θ)$ and $μ+σ^2\cdot p(θ)$ are the same, because:

$$ q(θ) = \frac{1}{\sqrt{2\pi\sigma^2}} e^{\frac{-(\theta-\mu)^2}{2\sigma^2}}\\ $$

$$ p(θ) = \frac{1}{\sqrt{2\pi}} e^{\frac{-\theta^2}{2}}\\ $$

@JimB suggested $∫q(θ)\,θ^2\,dθ=σ^2+μ^2$, why?

If I apply the integration by parts formula: $$ ∫u \cdot dv = uv - ∫v \cdot du\\ u = θ^2\\ dv = q(θ) dθ\\ du = 2θ dθ\\ v = ∫q(θ) dθ = 1 $$

so

$$ \begin{align} ∫θ^2q(θ)\,dθ &= θ^2-∫2θ\, dθ\\ &= θ^2-(θ^2+C)\\ &= C \end{align} $$

?
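(For reference: in integration by parts, $v$ must be an antiderivative of $q$, i.e. a function of $θ$ such as the Gaussian CDF, not the definite integral $1$ taken over all of $\mathbb{R}$. With that correction the calculation no longer collapses to a constant:)

$$ v(θ) = \int_{-\infty}^{θ} q(t)\,dt, \qquad ∫θ^2 q(θ)\,dθ = θ^2\,v(θ) - ∫2θ\,v(θ)\,dθ $$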

Mzq
    How can $\theta$ simultaneously be $N(0,1)$ and $N( μ,σ^2)$? – Ricky Sep 10 '23 at 14:21
  • $\int q(\theta) (\theta-0)^2=\int q(\theta) \theta^2=\sigma^2+\mu^2$ – JimB Sep 10 '23 at 15:11
  • why $∫q(θ)\,θ^2\,dθ=σ^2+μ^2$? – Mzq Sep 10 '23 at 17:29
  • The variance is always the second central moment, if defined, in which case it's the second raw moment iff the mean is $0$. – J.G. Sep 10 '23 at 18:26
  • Are you aware of the formula?$$\text{Var}(x)=E[x^2]-(E[x])^2$$ – Joako Sep 11 '23 at 06:08
  • @Joako yes, I have changed my question a bit, please see the edit – Mzq Sep 11 '23 at 06:17
  • Let me see if I am following your notation: do the definitions of $p(\theta)$ and $q(\theta)$ imply that $q(\theta)=\mu+\sigma^2\cdot p(\theta)$? If so, how do you express $p(y|\theta,\ x)$ in terms of $p(y)$? – Joako Sep 12 '23 at 06:21
  • @Joako I don't think it can imply that, see edit 2 – Mzq Sep 12 '23 at 07:52
  • I got confused indeed with the notation, since $p(y|\theta,x)$ looks like a conditional probability: could you explicitly expand what this term means? Then the variable $y$ is going to be a "constant" when integrating with respect to $\theta$, right? – Joako Sep 12 '23 at 18:47

1 Answer


\begin{align} F(q(θ)) &= \int dθ\, q(θ) \left[\log q(θ) - \log p(θ) - \log p(y|θ,x)\right]\\ &= \int dθ\, q(\theta)\left[\log \left(\frac{1}{\sqrt{2\pi\sigma^2}} e^{\frac{-(\theta-\mu)^2}{2\sigma^2}}\right) - \log \left(\frac{1}{\sqrt{2\pi}} e^{\frac{-\theta^2}{2}}\right) - \log \left(\frac{1}{\sqrt{2\pi\sigma_n^2}} e^{\frac{-(y-\theta x)^2}{2\sigma_n^2}}\right)\right]\\ &= \mathbb{E}_{\theta\sim q(\theta)} \left\{\log \sqrt{\frac{2\pi\sigma_n^2}{\sigma^2}}-\frac{(\theta-\mu)^2}{2\sigma^2} + \frac{\theta^2}{2} + \frac{(y-\theta x)^2}{2\sigma_n^2}\right\}\\ &= \frac{1}{2}\log \frac{2\pi\sigma_n^2}{\sigma^2}-\frac{1}{2\sigma^2}\mathbb{E}_{\theta\sim q(\theta)} \{(\theta-\mu)^2\} + \frac{1}{2} \mathbb{E}_{\theta\sim q(\theta)} \{\theta^2\} +\frac{1}{2\sigma_n^2} \mathbb{E}_{\theta\sim q(\theta)} \{(y-\theta x)^2\} \end{align} Now, can you compute the expectations using the distribution $q(\cdot)$?
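To sketch the remaining step: under the standard Gaussian moment identities $\mathbb{E}[(θ-μ)^2]=σ^2$, $\mathbb{E}[θ^2]=σ^2+μ^2$, and $\mathbb{E}[(y-θx)^2]=(y-xμ)^2+x^2σ^2$, the closed form can be checked numerically against a Monte Carlo estimate; all parameter values below are arbitrary illustrative choices:

```python
import numpy as np

# Closed form of F via the Gaussian moment identities:
#   E[(θ-μ)²] = σ²,  E[θ²] = σ² + μ²,  E[(y-θx)²] = (y-xμ)² + x²σ²
mu, sigma, sigma_n, x, y = 0.4, 0.8, 0.5, 1.2, 0.9   # illustrative values
F_closed = (0.5 * np.log(2 * np.pi * sigma_n ** 2 / sigma ** 2)
            - 0.5                                     # -(1/(2σ²)) · σ²
            + 0.5 * (sigma ** 2 + mu ** 2)
            + ((y - x * mu) ** 2 + x ** 2 * sigma ** 2) / (2 * sigma_n ** 2))

# Monte Carlo estimate of the same expectation under q(θ) = N(θ; μ, σ²)
rng = np.random.default_rng(1)
th = rng.normal(mu, sigma, 2_000_000)
log_q = -0.5 * np.log(2 * np.pi * sigma ** 2) - (th - mu) ** 2 / (2 * sigma ** 2)
log_p = -0.5 * np.log(2 * np.pi) - th ** 2 / 2
log_lik = (-0.5 * np.log(2 * np.pi * sigma_n ** 2)
           - (y - th * x) ** 2 / (2 * sigma_n ** 2))
F_mc = np.mean(log_q - log_p - log_lik)
```

The Monte Carlo estimate `F_mc` agrees with `F_closed` up to sampling noise.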

Explorer