
Let $X_1$, $X_2$, $X_3$ denote a random sample from the distribution having p.d.f. $$f(x) = e^{-x},\quad 0<x <\infty,$$ zero elsewhere. Show that $$Y_1 = \frac{X_1}{X_1 + X_2}, \qquad Y_2 = \frac{X_1 + X_2}{X_1 + X_2 + X_3}, \qquad Y_3 = X_1 + X_2 + X_3$$

are mutually stochastically independent.

According to the definition of independence for continuous random variables, the joint p.d.f. of these random variables must equal the product of their marginal p.d.f.s.
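In symbols, the goal is to show that

$$f_{Y_1Y_2Y_3}(y_1,y_2,y_3) = f_{Y_1}(y_1)\,f_{Y_2}(y_2)\,f_{Y_3}(y_3).$$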

The joint p.d.f. can be calculated using the Jacobian, but first I have to invert the system of equations. The result is $$x_1 = y_1y_2y_3, \qquad x_2 = y_2y_3 - y_1y_2y_3, \qquad x_3 = y_3 - y_2y_3.$$
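For reference, the inversion follows directly from the definitions:

$$x_1+x_2+x_3 = y_3, \qquad x_1+x_2 = y_2(x_1+x_2+x_3) = y_2y_3, \qquad x_1 = y_1(x_1+x_2) = y_1y_2y_3,$$

and the expressions for $x_2$ and $x_3$ are obtained by subtraction.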

The determinant of the Jacobian is

$${y_2}{y_3^2} $$

How do I obtain the joint p.d.f. from here? How do I obtain the marginal of each variable, and which limits of integration should I use?

  • So, your question is how to apply the Jacobian method to this change of variables? 1. What is missing from the method in your answer? 2. Do you have a source on this method? – Did Oct 09 '15 at 05:46
  • Yes I have this, do you know another one? http://www2.econ.iastate.edu/classes/econ671/hallam/documents/Transformations.pdf – rommel Oct 09 '15 at 14:50
  • Quote: "1. What is missing from the method in your answer?" – Did Oct 09 '15 at 21:21
  • I do not know what the determinant means; is that the joint pdf? And I do not know how to compute the marginal pdfs of the random variables. I need to show that they are independent, and the best I could do was to follow the pdf link – rommel Oct 09 '15 at 21:28
  • Is there an easier method to show that they are independent, @Did? – rommel Oct 09 '15 at 21:32
  • @rommel Both in the example following theorem 4 and in the answer I posted, it is shown how to compute both the Jacobian determinant and the marginals. I am not sure what exactly you want to know about the Jacobian; it is a factor used when you make a change of variables. Here is a general explanation and here a simpler one. I have tried to solve the problem by using the joint CDF but without success until now, so I am not sure if there is an easier way to solve it. – Carlos H. Mendoza-Cardenas Oct 10 '15 at 02:53
  • How do you know the limits of integration, or how do you get them? @CarlosMendoza – rommel Oct 10 '15 at 04:40
  • @rommel I added to my answer an explanatory note about the limits. – Carlos H. Mendoza-Cardenas Oct 10 '15 at 14:49
  • @CarlosMendoza thanks – rommel Oct 11 '15 at 00:34
  • @CarlosMendoza Can you recommend me a book about statistics from scratch? I think I lack some foundations, and you can see I am not very good at math – rommel Oct 11 '15 at 00:54
  • @rommel For good foundations in probability I would recommend this course from MIT, it is full of study material. There is also a great edX version of it. I also recommend the companion book for foundations in probability. – Carlos H. Mendoza-Cardenas Oct 11 '15 at 01:11
  • @CarlosMendoza ok thank you I will check it – rommel Oct 11 '15 at 03:35
  • The first part of the answer is given in this post. – EditPiAf Sep 15 '17 at 15:51

1 Answer


Using theorem 4 in the source you posted,

$$f_{Y_1Y_2Y_3}(y_1,y_2,y_3) = f_{X_1X_2X_3}(x_1,x_2,x_3)\lvert J \rvert$$

where

$$ J = \begin{vmatrix} \frac{\partial x_1}{\partial y_1} & \frac{\partial x_1}{\partial y_2} & \frac{\partial x_1}{\partial y_3}\\ \frac{\partial x_2}{\partial y_1} & \frac{\partial x_2}{\partial y_2} & \frac{\partial x_2}{\partial y_3}\\ \frac{\partial x_3}{\partial y_1} & \frac{\partial x_3}{\partial y_2} & \frac{\partial x_3}{\partial y_3} \end{vmatrix} $$
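Writing out the partial derivatives of the inverse transformation $x_1=y_1y_2y_3$, $x_2=y_2y_3-y_1y_2y_3$, $x_3=y_3-y_2y_3$ gives

$$ J = \begin{vmatrix} y_2y_3 & y_1y_3 & y_1y_2\\ -y_2y_3 & y_3-y_1y_3 & y_2-y_1y_2\\ 0 & -y_3 & 1-y_2 \end{vmatrix} = y_2y_3^2, $$

which can be evaluated, for instance, by adding the first row to the second and then the new second row to the third, leaving an upper triangular determinant with diagonal entries $y_2y_3$, $y_3$ and $1$.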

Assuming that $X_1$, $X_2$ and $X_3$ are independent,

$$f_{X_1X_2X_3}(x_1,x_2,x_3) = e^{-(x_1+x_2+x_3)} = e^{-y_3} \qquad \text{for }y_3>0$$

and as you said, $\lvert J \rvert = y_2y_3^2$. Therefore,

$$f_{Y_1Y_2Y_3}(y_1,y_2,y_3) = y_2y_3^2e^{-y_3} \qquad \text{for }y_3>0, 0\lt y_1,y_2 \lt 1$$

To better interpret this result, we can write

$$f_{Y_1Y_2Y_3}(y_1,y_2,y_3) = f_{Y_2Y_3\mid Y_1}(y_2,y_3\mid y_1)f_{Y_1}(y_1) $$

Since the joint density does not depend on $y_1$ and $\int_0^1\!\int_0^\infty y_2y_3^2e^{-y_3}\,dy_3\,dy_2 = 1$, we can see that

$$f_{Y_2Y_3\mid Y_1}(y_2,y_3\mid y_1) = f_{Y_2Y_3}(y_2,y_3) = y_2y_3^2e^{-y_3} \qquad \text{for }y_3>0, 0\lt y_2 \lt 1$$

and

$$f_{Y_1}(y_1) = 1\qquad \text{for }0\lt y_1 \lt 1$$

The marginal of $Y_3$ can be calculated by integrating $f_{Y_2Y_3}(y_2,y_3)$ over $y_2$,

$$f_{Y_3}(y_3) = \int_0^1y_2y_3^2e^{-y_3}dy_2 = \frac{1}{2}y_3^2e^{-y_3}\qquad \text{for }y_3>0$$

In the same way, the marginal of $Y_2$ is obtained by integrating over $y_3$,

$$f_{Y_2}(y_2) = \int_0^\infty y_2y_3^2e^{-y_3}dy_3 = 2y_2\qquad \text{for }0\lt y_2 \lt 1$$
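Multiplying the three marginals back together recovers the joint density, which is exactly the statement that $Y_1$, $Y_2$ and $Y_3$ are mutually independent:

$$f_{Y_1}(y_1)\,f_{Y_2}(y_2)\,f_{Y_3}(y_3) = 1\cdot 2y_2\cdot \tfrac{1}{2}y_3^2e^{-y_3} = y_2y_3^2e^{-y_3} = f_{Y_1Y_2Y_3}(y_1,y_2,y_3).$$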


Informal note about the limits of integration:

For $Y_1$: By definition,

$$y_1 = \frac{x_1}{x_1 + x_2}.$$

Since $x_1,x_2 \gt 0$, $y_1 \gt 0$. In addition, the denominator is always greater than the numerator, so $y_1 \lt 1$, which gives us $0 \lt y_1 \lt 1$.

For $Y_2$: The same argument as for $Y_1$ applies.

For $Y_3$: $y_3 \gt 0$ since it is the sum of positive numbers.
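As a final sanity check (not part of the original answer), the whole computation can be verified symbolically; here is a minimal sketch using Python's sympy library:

```python
# A symbolic sanity check of the computation above (a sketch using sympy;
# not part of the original answer).
import sympy as sp

y1, y2, y3 = sp.symbols('y1 y2 y3', positive=True)

# Inverse transformation found in the question
x1 = y1 * y2 * y3
x2 = y2 * y3 - y1 * y2 * y3
x3 = y3 - y2 * y3

# Jacobian determinant of (x1, x2, x3) with respect to (y1, y2, y3)
J = sp.Matrix([x1, x2, x3]).jacobian([y1, y2, y3])
detJ = sp.simplify(J.det())
print(detJ)                                              # y2*y3**2

# Joint pdf of (Y1, Y2, Y3): f_X(x1, x2, x3) * |J|, with x1 + x2 + x3 = y3
joint = sp.exp(-(x1 + x2 + x3)) * detJ

# Marginals, integrating out the other two variables over their supports
fY1 = sp.integrate(joint, (y2, 0, 1), (y3, 0, sp.oo))    # 1
fY2 = sp.integrate(joint, (y1, 0, 1), (y3, 0, sp.oo))    # 2*y2
fY3 = sp.integrate(joint, (y1, 0, 1), (y2, 0, 1))        # y3**2*exp(-y3)/2

# The joint equals the product of the marginals, i.e. mutual independence
print(sp.simplify(joint - fY1 * fY2 * fY3))              # 0
```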