2

Suppose I have $r = [r_1, r_2, ..., r_n]$, whose entries are iid and follow the normal distribution $N(\mu, \sigma^2)$. I also have a weight vector $h = [h_1, h_2, ...,h_n]$, whose entries are iid $N(0, \sigma_h^2)$. How can I calculate $Var(\sum_{i=1}^n h_ir_i)$? Suppose $h$ and $r$ are independent.

How should I deal with the product of two random variables? What is the formula to expand its variance? I am a bit confused.

Morty19
  • 43
  • 6

2 Answers

2

First just consider the individual components, which are Gaussian r.v.s; call them $r,h$: $$r\sim N(\mu,\sigma^2),\qquad h\sim N(0,\sigma_h^2)$$ $$ Var(rh)=\mathbb E(r^2h^2)-\mathbb E(rh)^2=\mathbb E(r^2)\mathbb E(h^2)-(\mathbb E r\, \mathbb E h)^2 =\mathbb E(r^2)\mathbb E(h^2), $$ where the middle step uses the independence of $r$ and $h$, and the last step uses $\mathbb E h=0$. Under the given conditions, $\mathbb E(h^2)=Var(h)=\sigma_h^2$.

Write $r=\sigma(z+\frac \mu\sigma)$, where $z\sim N(0,1)$ is a standard Gaussian random variable. Then $$ \mathbb E(r^2)=\mathbb E\Big[\sigma^2\Big(z+\frac \mu\sigma\Big)^2\Big]\\ = \sigma^2\,\mathbb E\Big(z+\frac \mu\sigma\Big)^2\\ =\sigma^2\,\mathbb E\Big[z^2+2\frac \mu\sigma z+\frac {\mu^2}{\sigma^2}\Big]\\ =\sigma^2+\mu^2, $$ using $\mathbb E(z^2)=1$ and $\mathbb E(z)=0$. Note that the non-central chi-squared distribution is the distribution of the sum of the squares of $k$ independent, normally distributed random variables with means $\mu_i$ and unit variances; $r^2/\sigma^2$ is such an r.v. (with $k=1$).
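Equivalently (not part of the original argument, just another standard route to the same second moment, and the MGF approach the asker mentions in the comments below), one can differentiate the moment generating function of $r$ twice and evaluate at $t=0$:

$$ M_r(t)=\mathbb E(e^{tr})=\exp\Big(\mu t+\frac{\sigma^2t^2}{2}\Big),\qquad M_r'(t)=(\mu+\sigma^2 t)\,M_r(t), $$
$$ \mathbb E(r^2)=M_r''(0)=\Big[\sigma^2+(\mu+\sigma^2 t)^2\Big]M_r(t)\Big|_{t=0}=\sigma^2+\mu^2. $$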

Putting it all together (the terms $r_ih_i$ are independent across $i$, so their variances add): $$ Var(r^Th)=nVar(r_ih_i)=n\, \mathbb E(r_i^2)\mathbb E(h_i^2) = n(\sigma^2 +\mu^2)\sigma_h^2 $$

If we are not too sure of the result, take the special case $n=1,\mu=0,\sigma=\sigma_h$. Then we know $$ Var(rh)=\mathbb E(r^2h^2)=\mathbb E(r^2)\mathbb E(h^2) =Var(r)Var(h)=\sigma^4, $$ which agrees with the result obtained above.
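Another way to gain confidence in the general formula is a quick Monte Carlo check. This is only a minimal sketch (not part of the original answer); the parameter values are arbitrary, chosen just for illustration:

```python
import numpy as np

# Minimal Monte Carlo check of Var(sum_i h_i r_i) = n * (sigma^2 + mu^2) * sigma_h^2.
# The parameter values below are arbitrary, for illustration only.
rng = np.random.default_rng(0)
n, mu, sigma, sigma_h = 5, 1.5, 2.0, 0.7
trials = 200_000

r = rng.normal(mu, sigma, size=(trials, n))      # r_i ~ N(mu, sigma^2), iid
h = rng.normal(0.0, sigma_h, size=(trials, n))   # h_i ~ N(0, sigma_h^2), iid
s = (h * r).sum(axis=1)                          # one draw of sum_i h_i r_i per row

print("empirical variance:", s.var())
print("formula n*(sigma^2 + mu^2)*sigma_h^2:", n * (sigma**2 + mu**2) * sigma_h**2)
```

The two printed numbers should be close for a large number of trials.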


I have largely rewritten this answer. The post that the original answer was based on is this one:

Is the product of two Gaussian random variables also a Gaussian?

I found that the previous answer was wrong when $\sigma\neq \sigma_h$, since there is a dependency between the rotated variables, which makes the computation even harder. The answer above is simpler and correct.

2

The first thing to say is that if we define a new random variable $X_i = h_ir_i$, then any pair $X_i, X_j$ with $i\neq j$ will be independent.

Therefore, we are able to say

$$Var \Big(\sum_i^nX_i \Big)=\sum_i^nVar(X_i)$$

Now, since the variance of each $X_i$ will be the same (as they are iid), we are able to say

$$\sum_i^nVar(X_i)=nVar(X_1)$$

So now let's pay attention to $X_1$. We know that $h$ and $r$ are independent which allows us to conclude that

$$Var(X_1)=Var(h_1r_1)=E(h^2_1r^2_1)-E(h_1r_1)^2=E(h^2_1)E(r^2_1)-E(h_1)^2E(r_1)^2$$

(by independence, via Fubini's Theorem).

We know that $E(h_1)=0$ and so we can immediately eliminate the second term to give us

$$Var(h_1r_1)=E(h^2_1)E(r^2_1)$$

And so substituting this back into our desired value gives us

$$\sum_i^nVar(X_i)=nE(h^2_1)E(r^2_1) $$

Using the fact that $Var(A)=E(A^2)-E(A)^2$ (and that the expected value of $h_i$ is $0$), we note that for $h_1$ it follows that

$$Var(h_1)=E(h^2_1)=\sigma^2_h$$

And using the same formula for $r_1$, we observe that

$$Var(r_1)=E(r^2_1)-\mu^2=\sigma^2$$

Rearranging gives $E(r^2_1)=\sigma^2+\mu^2$, and substituting into our desired expression, we find that

$$\sum_i^nVar(X_i)=n\sigma^2_h (\sigma^2+\mu^2)$$
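For concreteness, with some arbitrary made-up numbers, say $n=10$, $\mu=1$, $\sigma=2$, $\sigma_h=\tfrac12$, this evaluates to $10\,(4+1)\cdot\tfrac14=12.5$.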

Note: the other answer provides a broader approach; however, by the independence of each $r_i$ from the other $r_j$, of each $h_i$ from the other $h_j$, and of each $r_i$ from each $h_j$, the problem simplifies down quite a lot.

FD_bfa
  • 3,989
  • 1
    $Var(h_1r_1)=E(h^2_1)E(r^2_1)=E(h_1)E(h_1)E(r_1)E(r_1)=0$ this line is incorrect... $r_i$ is not independent of itself, so it cannot be separated – Binxu Wang 王彬旭 May 16 '22 at 01:29
  • Even from intuition, the final answer doesn't make sense ... $Var(h_ir_i)$ cannot be $0$, right? – Binxu Wang 王彬旭 May 16 '22 at 01:42
  • Thanks for the answer, but as Wang points out, it seems to be broken at $Var(h_1r_1) = 0$; a variance equal to $0$ does not make sense. – Morty19 May 16 '22 at 02:02
  • 1
    @BinxuWang王彬旭 thanks for the answer. Since $E(h_1^2)$ is just the variance of $h$ (note that $Eh = 0$), I just need to calculate $E(r_1^2)$; is there a way to do it? – Morty19 May 16 '22 at 02:05
  • That gets into the non-central chi-squared distribution part. You could still use the same trick as in my answer to get it – Binxu Wang 王彬旭 May 16 '22 at 02:11
  • 1
    I used the moment generating function of the normal distribution, took the derivative wrt $t$ twice, set $t$ to zero, and got it. But thanks for the answer, I will check it! – Morty19 May 16 '22 at 02:22
  • Apologies, I’ve fixed the answer now Morty19 and @BinxuWang王彬旭 – FD_bfa May 16 '22 at 16:08