
Show that $X_1X_2/\sqrt{X_1^2+X_2^2}$ is normally distributed, where $X_1\sim N(0,\sigma_1^2)$ and $X_2\sim N(0,\sigma_2^2)$. I tried the Jacobian transformation $$U=X_1X_2/\sqrt{X_1^2+X_2^2},\qquad V=X_1,$$ but I failed to solve the integration.

Edit: $X_1, X_2$ are independent.

Edit 2: my effort so far.

I use $U=X_1X_2/\sqrt{X_1^2+X_2^2}$, $V=X_1$ for the transformation and solve for $X_1=V$, $X_2=\frac{UV}{\sqrt{V^2-U^2}}$ (which requires $V^2>U^2$).

Then I compute $J=\begin{vmatrix} \frac{\partial X_1}{\partial U}&\frac{\partial X_1}{\partial V}\\ \frac{\partial X_2}{\partial U}&\frac{\partial X_2}{\partial V} \end{vmatrix}=\begin{vmatrix}0&1\\ \frac{V^3}{(V^2-U^2)^{3/2}}&\frac{\partial X_2}{\partial V} \end{vmatrix}=-\frac{V^3}{(V^2-U^2)^{3/2}}$, so $|J|=\frac{|V|^3}{(V^2-U^2)^{3/2}}$ (the entry $\frac{\partial X_2}{\partial V}$ need not be computed, since it is multiplied by $0$).

Substituting $|J|$ and $U,V$ into $f_{X_1,X_2}(x_1,x_2)=f_{X_1}(x_1)\,f_{X_2}(x_2)$ and integrating over $V$:

$$f_U(u)=\int_{-\infty}^{+\infty}\frac{1}{2\pi\sigma_1\sigma_2}\exp\left(-\frac{v^2}{2\sigma_1^2}-\frac{u^2v^2}{2\sigma_2^2(v^2-u^2)}\right)|J|\,dv=\int_{-\infty}^{+\infty}\frac{1}{2\pi\sigma_1\sigma_2}\exp\left(-\frac{v^2}{2\sigma_1^2}-\frac{u^2v^2}{2\sigma_2^2(v^2-u^2)}\right)\frac{v^3}{(v^2-u^2)^{3/2}}\,dv$$

The integrand appears odd in $v$, so the integral seems to vanish by symmetry. But that is an artifact of dropping an absolute value: the correct factor is $|J|=\frac{|v|^3}{(v^2-u^2)^{3/2}}$, which is even in $v$, and the range of integration is restricted to $v^2>u^2$.
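For what it's worth, here is a sketch of how the corrected integral can be finished, assuming independence of $X_1,X_2$ and using the classical integral $\int_0^\infty e^{-a^2w^2-b^2/w^2}\,dw=\frac{\sqrt\pi}{2a}e^{-2ab}$ together with its companion $\int_0^\infty w^{-2}e^{-a^2w^2-b^2/w^2}\,dw=\frac{\sqrt\pi}{2b}e^{-2ab}$ (obtained by the substitution $w\mapsto b/(aw)$). By evenness of the integrand,

$$f_U(u)=\frac{1}{\pi\sigma_1\sigma_2}\int_{|u|}^{\infty}\exp\left(-\frac{v^2}{2\sigma_1^2}-\frac{u^2v^2}{2\sigma_2^2(v^2-u^2)}\right)\frac{v^3}{(v^2-u^2)^{3/2}}\,dv.$$

Substituting $w^2=v^2-u^2$ (so $v\,dv=w\,dw$ and $v^2=w^2+u^2$) gives

$$f_U(u)=\frac{1}{\pi\sigma_1\sigma_2}\exp\left(-\frac{u^2}{2}\left(\frac{1}{\sigma_1^2}+\frac{1}{\sigma_2^2}\right)\right)\int_0^\infty\left(1+\frac{u^2}{w^2}\right)\exp\left(-\frac{w^2}{2\sigma_1^2}-\frac{u^4}{2\sigma_2^2w^2}\right)dw,$$

and with $a=\frac{1}{\sqrt2\,\sigma_1}$, $b=\frac{u^2}{\sqrt2\,\sigma_2}$ the two classical integrals sum to $(\sigma_1+\sigma_2)\sqrt{\pi/2}\;e^{-u^2/(\sigma_1\sigma_2)}$, so everything collapses to

$$f_U(u)=\frac{1}{\sigma\sqrt{2\pi}}\,e^{-u^2/(2\sigma^2)},\qquad \sigma=\frac{\sigma_1\sigma_2}{\sigma_1+\sigma_2}.$$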

And I apologize for not showing my effort when I first asked the question. I know it's basic manners, but I was lazy.

Rowan
    Sure that $\sigma_1^2\ne\sigma_2^2$? Sure that $(X_1,X_2)$ is not jointly normally distributed? Please show "the integration" that you "failed to solve". – Did Nov 19 '16 at 11:46
  • Relevant: https://stats.stackexchange.com/questions/333295/if-x-and-y-are-independent-normal-variables-each-with-mean-zero-then-frac/. – StubbornAtom Mar 31 '18 at 21:30
  • @Did But surely $\sigma_1^2$ need not be equal to $\sigma_2^2$ for the result to hold. – StubbornAtom Apr 03 '18 at 20:05
  • @StubbornAtom But surely some crucial hypotheses were missing from the question at the time. – Did Apr 04 '18 at 06:35
  • @Did I finally solved this question using the method in Edit 2, but the transformation I made was not a one-to-one mapping. So I found a one-to-one transformation and got the integral, which can be found at https://math.stackexchange.com/q/2023685/229922 – Rowan Apr 04 '18 at 06:52
  • @Did Maybe I was using some hypothesis without knowing it myself. I know how to use the Jacobian matrix to solve this kind of question but never understood it thoroughly. – Rowan Apr 04 '18 at 07:00
  • Independence. – Did Apr 04 '18 at 07:00
  • @Did yes, X1 and X2 are independent. I mentioned it in Edit1 ;) – Rowan Apr 04 '18 at 07:02
  • I know. And the hypothesis is crucial. – Did Apr 04 '18 at 07:04
  • @Rowan so what transformation did you finally use? – StubbornAtom May 02 '18 at 05:33
  • @StubbornAtom I’m not sure of the exact form, but I remember it was a fraction. You can check that my transformation above is not a 1-1 mapping with some values of $X_1$ and $X_2$: for example, both $(0,1)$ and $(0,-1)$ give the same pair $(u,v)$. I’ll tell you the exact form I used when I get home, where I have a copy of my solution. – Rowan May 02 '18 at 06:43
  • @Rowan Yes that would be nice. But I don't see what the problem is if your mapping is not one-one. The same method still applies, with a more general setting. – StubbornAtom May 02 '18 at 10:07
  • @StubbornAtom As far as I know, one requirement for using this method is a one-one mapping between the original vectors and the transformed ones. See https://www.statlect.com/fundamentals-of-probability/functions-of-random-vectors. If you know a way to relax this requirement, I’d like to know ;) – Rowan May 02 '18 at 10:23
  • @Rowan That link only has examples of one-one maps. But the general transformation formula allows many-to-one mappings where we partition the support of the transformed variables into disjoint sets in which the maps are individually one-one. Like here : https://math.stackexchange.com/a/2647462/321264. – StubbornAtom May 02 '18 at 10:30
  • @StubbornAtom I read your answer and you’re right. If I can part the support correctly and do the calculation I should get the correct answer. Thanks – Rowan May 02 '18 at 10:59
  • @StubbornAtom In case you’re still interested, I used $V=X_1/X_2$ and the same $U$ as in the question. And I like your answer (especially the second one) better than mine. Thanks for sharing. – Rowan May 03 '18 at 13:59
  • @Rowan This question does not work like the other answer, here the variances are supposed to be unequal, which makes it a tad difficult. – StubbornAtom May 03 '18 at 16:21

1 Answer


Comment, for intuition and maybe some clues:

I simulated this a million times with $X_1, X_2 \stackrel{iid}{\sim} Norm(0, 1).$ Then for $Y = X_1X_2/\sqrt{X_1^2 + X_2^2},$ I got $E(Y) \approx 0,$ $SD(Y) \approx 0.5.$ A Shapiro-Wilk test on the first one thousand values of $Y$ failed to reject normality. The histogram of the simulated distribution of $Y$ with the best-fitting normal density function is shown at left below. In this case, $D = X_1^2 + X_2^2 \sim Chisq(df=2)$.

In a second simulation with $X_1 \sim Norm(0,1)$ and independently $X_2 \sim Norm(\mu = 0, \sigma=4),$ I got $E(Y) \approx 0,$ $SD(Y) \approx 0.8.$ A Shapiro-Wilk test on the first one thousand values of $Y$ failed to reject normality. The histogram of the simulated distribution of $Y$ with the best-fitting normal density function is shown at right below.

[Histograms of simulated $Y$ with best-fitting normal densities: left, $\sigma_1=\sigma_2=1$; right, $\sigma_1=1,\ \sigma_2=4$]
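The two runs above can be sketched as follows (a Python/NumPy stand-in for the simulation; the sample size, seed, and the helper name `simulate_sd` are my own choices):

```python
# Simulate Y = X1*X2/sqrt(X1^2 + X2^2) for independent centered normals
# and compare the sample SD with the conjectured value s1*s2/(s1 + s2).
import numpy as np

def simulate_sd(s1, s2, n=1_000_000, seed=0):
    """Return the sample SD of Y for X1 ~ N(0, s1^2), X2 ~ N(0, s2^2)."""
    rng = np.random.default_rng(seed)
    x1 = rng.normal(0.0, s1, n)
    x2 = rng.normal(0.0, s2, n)
    y = x1 * x2 / np.sqrt(x1**2 + x2**2)
    return y.std()

for s1, s2 in [(1.0, 1.0), (1.0, 4.0)]:
    print(s1, s2, simulate_sd(s1, s2), s1 * s2 / (s1 + s2))
# The sample SDs come out close to 0.5 and 0.8, matching the two runs above.
```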

Perhaps I can offer more help if you answer @Did's questions and show us soon what you tried. What is your Jacobian?

BruceET
  • Thanks for the intuition. From the simulation results I guessed $Y\sim N(0,\sigma_1\sigma_2/(\sigma_1+\sigma_2))$, and it turns out to be true. – Rowan Nov 22 '16 at 14:14
  • Congratulations! And I'm fixing a small typo, that seems not to have distracted you. – BruceET Nov 22 '16 at 22:50