
Let $X_i$ be independent normally distributed random variables with zero mean and variance $\sigma^2 \neq 1$. What is the probability density function of the random variable formed by the sum of their squares?

Here is my attempt: Let $Y = \sum_{i=1}^{k}X_i^2$. Since each $X_i/\sigma$ is standard normal, $$\begin{align}Y &= \sigma^2\sum_{i=1}^{k}\left(\frac{X_i}{\sigma}\right)^2\\ &\sim \sigma^2\chi^2_{k}\\ &\sim \sigma^2\,\Gamma_{k/2}(\theta = 2)\\ &\sim \Gamma_{k/2}(\theta = 2\sigma^2), \end{align}$$

where $\chi^2_{k}$ denotes the central chi-squared distribution with $k$ degrees of freedom, and $\Gamma_{k/2}(\theta)$ denotes the Gamma distribution with shape parameter $k/2$ and scale parameter $\theta$.

Therefore, the mean of $Y$ is $\frac{k}{2}\cdot 2\sigma^2 = k \sigma^2$ and its variance is $\frac{k}{2}(2\sigma^2)^2 = 2k\sigma^4$.
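As a quick sanity check of these formulas (not part of the question itself), here is a short Monte Carlo simulation in Python. The values $k=5$ and $\sigma=2$ are arbitrary illustrative choices; the empirical mean and variance of $Y$ should land near $k\sigma^2 = 20$ and $2k\sigma^4 = 160$.

```python
import numpy as np

rng = np.random.default_rng(0)
k, sigma = 5, 2.0        # illustrative values, not from the question
n = 200_000              # number of Monte Carlo replications

# Each row holds k draws from N(0, sigma^2); Y is the sum of their squares.
X = rng.normal(0.0, sigma, size=(n, k))
Y = (X**2).sum(axis=1)

print(Y.mean())  # expected near k * sigma**2 = 20
print(Y.var())   # expected near 2 * k * sigma**4 = 160
```

The full distributional claim $Y \sim \Gamma_{k/2}(\theta = 2\sigma^2)$ could likewise be checked against the Gamma density, e.g. with `scipy.stats.kstest`, but the mean and variance already agree with the Gamma moment formulas above.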

Also, is it possible to derive the mean and variance without going back to the Gamma distribution?

Weaam

1 Answer


Since $Y$ is the sum of $k$ i.i.d. random variables distributed like $Z$, $\mathrm E(Y)=k\cdot\mathrm E(Z)$ and $\mathrm{Var}(Y)=k\cdot\mathrm{Var}(Z)$. Here, $Z$ is distributed like $\sigma^2 X_0^2$ where $X_0$ is standard normal, hence $\mathrm E(Z)=\mathrm E(X_0^2)\cdot\sigma^2=\sigma^2$ and $\mathrm{Var}(Z)=\left(\mathrm E(X_0^4)-\mathrm E(X_0^2)^2\right)\cdot\sigma^4=(3-1)\cdot\sigma^4=2\sigma^4$.

To prove the first assertions, since $Y=Z_1+\cdots+Z_k$, one can use the linearity of the expectation for $\mathrm E(Y)$, and, for $\mathrm{Var}(Y)$, the expansion $$ \mathrm E(Y^2)=\sum_{i=1}^k\mathrm E(Z_i^2)+\sum_{i\ne j}\mathrm E(Z_i)\cdot\mathrm E(Z_j)=k\cdot\mathrm E(Z^2)+k(k-1)\cdot\mathrm E(Z)^2, $$ which yields $$ \mathrm{Var}(Y)=\mathrm E(Y^2)-\mathrm E(Y)^2=\mathrm E(Y^2)-k^2\cdot\mathrm E(Z)^2=k\cdot\mathrm{Var}(Z). $$
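The two moment facts this argument rests on, $\mathrm E(X_0^4)=3$ for a standard normal $X_0$ and hence $\mathrm{Var}(\sigma^2 X_0^2)=2\sigma^4$, can be verified numerically. A minimal sketch in Python, with $\sigma=1.5$ as an arbitrary illustrative value:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 1.5                      # illustrative value, not from the question
x = rng.normal(size=1_000_000)   # standard normal samples

print((x**4).mean())             # fourth moment of N(0,1): expected near 3

z = sigma**2 * x**2              # Z = sigma^2 * X_0^2 as in the answer
print(z.mean())                  # expected near sigma**2
print(z.var())                   # expected near 2 * sigma**4
```

This confirms the answer's route to the mean and variance of $Y$ without ever invoking the Gamma distribution.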

Did