
The random values $ξ_i, i = 1, 2$, are independent and have a standard normal distribution.

Are the random variables $η_1 = ξ_1 + ξ_2$ and $η_2 = ξ_1 - ξ_2$ independent?

I tried to find the answer:

$M(ξ_1+ξ_2)=M_{ξ_1}+M_{ξ_2}$

$M(ξ_1-ξ_2)=M_{ξ_1}-M_{ξ_2}$

$M_{η_1η_2}=M[(ξ_1 - ξ_2)(ξ_1 + ξ_2)]=M(ξ_1^2 - ξ_2^2)$

$\operatorname{cov}(η_1,η_2)=M_{η_1η_2}-M_{η_1}M_{η_2}=M_{ξ_1^2}-M_{ξ_2^2}-(M_{ξ_1}+M_{ξ_2})(M_{ξ_1}-M_{ξ_2})=D_{ξ_1}-D_{ξ_2}=0$
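This covariance calculation can be sanity-checked by simulation (a sketch assuming NumPy; the sample size and seed are arbitrary). Note that zero covariance alone does not prove independence in general:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
xi1 = rng.standard_normal(n)  # xi_1 ~ N(0, 1)
xi2 = rng.standard_normal(n)  # xi_2 ~ N(0, 1), independent of xi_1

eta1 = xi1 + xi2
eta2 = xi1 - xi2

# Empirical covariance should be close to D(xi_1) - D(xi_2) = 1 - 1 = 0
cov = np.cov(eta1, eta2)[0, 1]
print(cov)  # close to 0
```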

StubbornAtom

2 Answers


Method of Jacobians

You need to use the method of Jacobians to find the joint density of $(\eta_1, \eta_2)$ and then use the joint density to conclude independence.

Observe that due to independence of $\xi_1, \xi_2$ that their joint density function is given by $$f_{\xi_1, \xi_2}(x, y) = f_{\xi_1}(x)f_{\xi_2}(y) = \dfrac{1}{\sqrt{2\pi}}e^{-x^2/2} \cdot \dfrac{1}{\sqrt{2\pi}}e^{-y^2/2} = \dfrac{1}{2\pi}e^{-(x^2+y^2)/2} $$ for $x, y \in \mathbb{R}$.

We now solve for $\xi_1, \xi_2$ in terms of $\eta_1, \eta_2$. Observe $\eta_1 = \xi_1 + \xi_2$, so that $\xi_1 = \eta_1 - \xi_2$, hence $$\eta_2 = \xi_1 - \xi_2 = (\eta_1 - \xi_2) - \xi_2 = \eta_1 - 2\xi_2 \implies \xi_2 = \dfrac{\eta_1 - \eta_2}{2}\text{.}$$ It follows that $$\xi_1 = \eta_1 - \left(\dfrac{\eta_1 - \eta_2}{2}\right)= \dfrac{\eta_1 + \eta_2}{2}\text{.}$$
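The inverse map derived above can be double-checked numerically (a sketch assuming NumPy; the test inputs are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
xi1, xi2 = rng.standard_normal(2)  # arbitrary test values

eta1 = xi1 + xi2
eta2 = xi1 - xi2

# Invert: xi_1 = (eta_1 + eta_2)/2, xi_2 = (eta_1 - eta_2)/2
xi1_back = (eta1 + eta2) / 2
xi2_back = (eta1 - eta2) / 2

print(np.isclose(xi1_back, xi1), np.isclose(xi2_back, xi2))  # True True
```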

The Jacobian is thus given by $$J = \det\left(\begin{bmatrix}\displaystyle\frac{\partial \xi_1}{\partial \eta_1} & \displaystyle\frac{\partial \xi_1}{\partial \eta_2} \\ \displaystyle\frac{\partial \xi_2}{\partial \eta_1} & \displaystyle\frac{\partial \xi_2}{\partial \eta_2} \end{bmatrix}\right) = \det\left(\begin{bmatrix}\displaystyle \dfrac{1}{2} & \displaystyle \dfrac{1}{2} \\ \displaystyle \dfrac{1}{2} & \displaystyle -\dfrac{1}{2} \end{bmatrix} \right) = \dfrac{1}{2}\left(-\dfrac{1}{2}\right) - \dfrac{1}{2}\left(\dfrac{1}{2}\right) = -2\left(\dfrac{1}{4}\right) = -\dfrac{1}{2}$$ hence the absolute value $|J| = \dfrac{1}{2}$. Thus, the joint density of $(\eta_1, \eta_2)$ is given by $$f_{\eta_1, \eta_2}(s, t) = f_{\xi_1, \xi_2}\left(\dfrac{s+t}{2}, \dfrac{s-t}{2} \right)|J| = \dfrac{1}{4\pi}e^{-[(s+t)^2 + (s-t)^2]/8}$$ for $s, t \in \mathbb{R}$. Through some algebra, one can show that $$(s+t)^2 + (s-t)^2 = 2(s^2 + t^2)$$ thus $$\begin{align} f_{\eta_1, \eta_2}(s, t) &= \dfrac{1}{4\pi}e^{-(s^2 + t^2)/4} \\ &= \dfrac{1}{\sqrt{4\pi}}e^{-s^2/4} \cdot \dfrac{1}{\sqrt{4\pi}}e^{-t^2/4} \\ &= \dfrac{1}{\sqrt{2}\cdot\sqrt{2\pi}}e^{-(s/\sqrt{2})^2/2} \cdot \dfrac{1}{\sqrt{2}\cdot\sqrt{2\pi}}e^{-(t/\sqrt{2})^2/2}\text{.} \end{align}$$ Observe that we have now separated the joint density function's two inputs ($s$, $t$), and furthermore, $$f(s) = \dfrac{1}{\sqrt{2}\cdot\sqrt{2\pi}}e^{-(s/\sqrt{2})^2/2}$$ is the probability density function of a normal random variable with mean $0$ and variance $(\sqrt{2})^2 = 2$. Thus, $$f_{\eta_1, \eta_2}(s, t) = f_{\eta_1}(s)f_{\eta_2}(t)$$ showing that $\eta_1, \eta_2$ are independent, and furthermore, they are both normally distributed with mean $0$ and variance $2$.
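The factorization of the joint density into two $N(0, 2)$ marginals can be spot-checked at a point (a sketch assuming NumPy; the test point is arbitrary):

```python
import numpy as np

def joint(s, t):
    # Joint density of (eta_1, eta_2) derived above
    return np.exp(-(s**2 + t**2) / 4) / (4 * np.pi)

def normal_pdf(x, var):
    # Density of N(0, var)
    return np.exp(-x**2 / (2 * var)) / np.sqrt(2 * np.pi * var)

s, t = 0.7, -1.3  # arbitrary test point
print(np.isclose(joint(s, t), normal_pdf(s, 2) * normal_pdf(t, 2)))  # True
```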


Matrix Algebra Method

Definition. If $\boldsymbol\xi = \begin{bmatrix}\xi_1 \\ \xi_2\end{bmatrix}$ is a random vector whose components are independent standard normal random variables, then $\mathbf{Y} = \mathbf{A}\boldsymbol\xi + \mathbf{b}$, for a conformable matrix $\mathbf{A}$ and column vector $\mathbf{b}$, is said to have a multivariate normal distribution.

Theorem. The variance matrix of $\mathbf{Y}$ is $\text{Var}(\mathbf{Y}) = \mathbf{A}\mathbf{A}^{T}$.

Theorem. Let $\mathbf{Y}$ be multivariate normal with $\text{Var}(\mathbf{Y}) = \text{Var}\left(\begin{bmatrix}Y_{1} \\ Y_{2}\end{bmatrix}\right) = \begin{bmatrix}V_{11} & V_{12} \\ V_{21} & V_{22}\end{bmatrix}$. Then $V_{12} = V_{21} = 0$ if and only if $Y_1, Y_2$ are independent.

Solution. Here $\mathbf{b} = \mathbf{0}$ and the matrix $\mathbf{A}$ is given by $$\mathbf{A} = \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}$$

so that $\mathbf{A}\mathbf{A}^{T} = \begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix}$, hence $\eta_1, \eta_2$ are independent.
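This matrix computation is easy to verify directly (a sketch assuming NumPy):

```python
import numpy as np

A = np.array([[1, 1],
              [1, -1]])

# Variance matrix of (eta_1, eta_2) per the theorem: Var(Y) = A A^T
V = A @ A.T
print(V)  # [[2 0]
          #  [0 2]]
```

The off-diagonal entries are $0$, which by the second theorem gives independence.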

Clarinetist

I think your third line has an issue and should be:

$E(n_1n_2) = E[(e_1+e_2)(e_1-e_2)] = E[e_1^2 - e_2^2]$

$e_1$ and $e_2$ are independent and identically distributed, so I think this reduces to: $E[e_1^2 - e_2^2] = E[e_1^2] - E[e_2^2] = 1 - 1 = 0$.

Also, since $e_1$ is standard normal, $E[e_1] = 0$.
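The facts used here, $E[e_1^2] = 1$ and $E[e_1^2 - e_2^2] = 0$, can be confirmed by simulation (a sketch assuming NumPy; sample size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
e1 = rng.standard_normal(n)  # standard normal samples
e2 = rng.standard_normal(n)  # independent standard normal samples

# E[e^2] = Var(e) + E[e]^2 = 1 + 0 = 1 for a standard normal
print(np.mean(e1**2))          # close to 1
print(np.mean(e1**2 - e2**2))  # close to 0
```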

  • I think the third line is correct in OP's unusual notation, but you observe correctly that we could note at this point that the expectation of $\eta_1 \eta_2$ is zero. – Nate Eldredge Jun 02 '20 at 15:25
  • I'm sorry, but how can we prove that they are independent random variables? – D7ILeucoH Jun 02 '20 at 15:40