
Suppose $X,Y$ are independent random variables with common PDF $f(x) = e^{-x}$ for $x>0$. What is the density of $X-Y$?

I tried the following: let $Y_1 = X + Y$ and $Y_2 = \frac{X-Y}{X+Y}$; solving for $X$ and $Y$ gives me $X = \frac{Y_1(1 + Y_2)}{2}$, $Y = \frac{Y_1-Y_2}{2}$.

Then I calculated the Jacobian $J = \begin{bmatrix} \frac{1+y_2}{2} & \frac{y_1}{2} \\ \frac{1}{2} & -\frac{1}{2} \end{bmatrix}$, so that $\left|\det(J)\right| = \frac{1+y_1+y_2}{4}$,

and the joint density of $Y_1,Y_2$ is $W(Y_1,Y_2) = \left|\det(J)\right| e^{-(y_1+y_2)}$ when $y_1,y_2> 0$, and $0$ otherwise.

Next I thought of recovering $X-Y$ as a marginal, but I got stuck. I think I messed up in the variables.
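A quick simulation, for what it's worth, makes the actual support of $(Y_1,Y_2)$ visible; a minimal sketch, assuming numpy is available:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(size=100_000)  # X ~ Exp(1)
y = rng.exponential(size=100_000)  # Y ~ Exp(1)

y1 = x + y                # Y1 = X + Y
y2 = (x - y) / (x + y)    # Y2 = (X - Y)/(X + Y)

# The empirical support has y1 > 0, but y2 ranges over (-1, 1),
# not over (0, oo) as assumed above.
print(y1.min(), y1.max())
print(y2.min(), y2.max())
```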

Any help is appreciated!

BAYMAX
  • Why did you take such $Y_1,Y_2$? – MAN-MADE Sep 11 '17 at 17:21
  • There are lots of approaches. One is characteristic functions. Another is to note that the distribution of the difference should be symmetric about $0$, and if you look at the right-hand half, then memorylessness suggests the distribution of $X-Y$ given $X \gt Y$ should be the same exponential distribution you started with. – Henry Sep 11 '17 at 17:22
  • @MANMAID I found a similar technique used in an example in Rohatgi's probability and statistics book. – BAYMAX Sep 11 '17 at 17:24
  • I guess your $W(Y_1,Y_2)$ is the joint pdf of $Y_1,Y_2$; if so, then it should be $|\det(J)|$, and I also think the supports of $Y_1$ and $Y_2$ are dependent. – MAN-MADE Sep 11 '17 at 17:27
  • Thanks! Typo there; edited. Also, I thought of taking $Y_{1} = X-Y$, and hence I could have obtained the marginal, but then what would the limits of $y_{1},y_{2}$ be during the integration? – BAYMAX Sep 11 '17 at 17:30
  • I've posted an answer that does not deal with integrals of functions of more than one variable. If I get inspired I might either add something to it or post a second answer dealing with integrals of functions of two variables. – Michael Hardy Sep 11 '17 at 19:05
  • No Jacobians are needed when this is done in a way that involves integrals of functions of two variables. I've posted an answer doing it using such integrals, with no Jacobians because there are no changes of variables. Often my reason for posting an answer is that I think the way I've done it is as simple as I can make it, and I think this is the simplest one posted so far (not to be confused with the other answer I posted, involving no bivariate integrals). – Michael Hardy Sep 11 '17 at 20:34
  • @MichaelHardy @MANMAID @Graham Kemp @Chappers @Žiga Sajovic It was nice seeing the different approaches to the problem. I was thinking that we could have a "probability and statistics chat room" where we can discuss probability and statistics in detail; any thoughts on this? – BAYMAX Sep 12 '17 at 14:26
  • @BAYMAX : There is also stats.stackexchange.com . – Michael Hardy Sep 12 '17 at 16:29
  • @BAYMAX : I think when you ping more than one user, only the first one gets notified. – Michael Hardy Sep 12 '17 at 16:36
  • Yes, but I tried to notify those who contributed and ask for their viewpoint about creating a chatroom dedicated to this topic. In my view, there are often small doubts that one asks which work well in chat, and there is no such room on MSE. @MichaelHardy – BAYMAX Sep 12 '17 at 17:22
  • https://math.stackexchange.com/q/115022/321264 – StubbornAtom Mar 14 '20 at 12:55

5 Answers


If I understood you correctly, both $X$ and $Y$ follow an exponential distribution with $\lambda$ equal to one, and you want to know the distribution of their difference, $Z=X-Y$. The cumulative distribution function is $$P(Z\le z)=P(X-Y\le z)=P(z),$$ which (for $z\le 0$) is $$P(z)=\int^\infty_{0}\int^{\infty}_{x-z}e^{-x}e^{-y}\,dy\,dx,$$ as the region of interest is $y\ge x-z$. Next, we know that the density $$p(z)=\frac{d}{dz}P(z)$$ is the derivative of the distribution function. Using the Leibniz rule, this is $$\frac{d}{dz}\int^\infty_{0}\int^\infty_{x-z}e^{-x}e^{-y} \, dy \, dx = \int^\infty_0 \frac{d}{dz}\int^\infty_{x-z}e^{-x}e^{-y}\,dy\,dx = \int^\infty_{0} e^{-x}e^{-(x-z)} \, dx=\frac{e^z}{2}.$$ After repeating the computation for $z\ge 0$, which entails calculating $$p(z)=\frac{d}{dz}\int^\infty_0 \int^{x+z}_0 e^{-x}e^{-y} \, dy \, dx,$$ we arrive at $$p(z)=\frac{e^{-|z|}}{2}.$$

Note that this is known as the Laplace distribution.
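The closed form is easy to compare against a Monte Carlo histogram; a minimal check, assuming numpy is available:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
z = rng.exponential(size=n) - rng.exponential(size=n)  # samples of X - Y

# Compare the empirical density with p(z) = e^{-|z|}/2 on a grid of bins.
hist, edges = np.histogram(z, bins=np.linspace(-4, 4, 41), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
laplace = 0.5 * np.exp(-np.abs(centers))

print(np.max(np.abs(hist - laplace)))  # should be small (roughly 1e-3 here)
```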


\begin{align} \text{For } u>0 \text{ we have } f_{X-Y}(u) & = \frac d {du} \Pr(X-Y\le u) \\[10pt] & = \frac d {du} \operatorname{E}(\Pr(X-Y \le u \mid Y)) \\[10pt] & = \frac d {du} \operatorname{E}(\Pr(X \le u+Y\mid Y)) \\[10pt] & = \frac d {du} \operatorname{E}(1-e^{-(u+Y)}) \\[10pt] & = \frac d {du} \int_0^\infty (1 - e^{-(u+y)} ) e^{-y} \, dy \\[10pt] & = \frac d {du} \int_0^\infty (e^{-y} - e^{-u} e^{-2y}) \, dy \\[10pt] & = \frac d {du} \left( 1 - \frac 1 2 e^{-u} \right) \\[10pt] & = \frac 1 2 e^{-u}. \end{align} A similar thing applied when $u<0$ gives you $\dfrac 1 2 e^u,$ so you get $\dfrac 1 2 e^{-|u|}.$

But a simpler way to deal with $u<0$ is to say that since the distribution of $X-Y$ is plainly symmetric about $0$ (since $X-Y$ has the same distribution as $Y-X$), if you get $\dfrac 1 2 e^{-u}$ when $u>0,$ you have to get $\dfrac 1 2 e^u$ when $u<0.$
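The single-variable integral and its derivative above can also be checked symbolically; a minimal sketch, assuming sympy is available:

```python
import sympy as sp

u, y = sp.symbols('u y', positive=True)

# E(Pr(X <= u + Y | Y)) = integral of (1 - e^{-(u+y)}) e^{-y} over y in (0, oo)
cdf = sp.integrate((1 - sp.exp(-(u + y))) * sp.exp(-y), (y, 0, sp.oo))

print(sp.simplify(cdf))               # 1 - exp(-u)/2
print(sp.simplify(sp.diff(cdf, u)))   # exp(-u)/2, i.e. f_{X-Y}(u) for u > 0
```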


The transformation is $(X,Y)\rightarrow (Y_1,Y_2)$.

$Y_1=X+Y, Y_2=\dfrac{X-Y}{X+Y}$.

Let $y_1=x+y,y_2=\dfrac{x-y}{x+y}$, i.e., $x=\dfrac{y_1(1+y_2)}{2},y=\dfrac{y_1(1-y_2)}{2}$. Now $x>0,y>0$, hence $y_1>0, -1<y_2<1$

$J=\begin{bmatrix}\dfrac{1+y_2}{2}&\dfrac{y_1}{2}\\\dfrac{1-y_2}{2}&\dfrac{-y_1}{2}\end{bmatrix}$. Here, $\det(J)=\dfrac{-y_1}{2}$

Now $$f_{(Y_1,Y_2)}(y_1,y_2)=|\det(J)|\,f_{(X,Y)}(x,y)=\dfrac{y_1e^{-y_1}}{2}I(y_1>0,\,-1<y_2<1)=y_1e^{-y_1}I(y_1>0)\cdot\dfrac{1}{2}I(-1<y_2<1).$$

Here $I(\cdot)$ denotes the indicator function.
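Since the joint pdf factorizes, $Y_1$ and $Y_2$ are independent, with $Y_1\sim\text{Gamma}(2,1)$ and $Y_2\sim\text{Uniform}(-1,1)$. A minimal simulation check of that reading, assuming numpy and scipy are available:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
x = rng.exponential(size=200_000)
y = rng.exponential(size=200_000)
y1, y2 = x + y, (x - y) / (x + y)

# y1*exp(-y1) on y1 > 0 is the Gamma(shape=2, scale=1) density;
# 1/2 on (-1, 1) is the Uniform(-1, 1) density.
print(stats.kstest(y1, stats.gamma(a=2).cdf).pvalue)                # large
print(stats.kstest(y2, stats.uniform(loc=-1, scale=2).cdf).pvalue)  # large
print(np.corrcoef(y1, y2)[0, 1])  # near 0, consistent with independence
```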


But I doubt you can recover the pdf of $X-Y$ easily from this. One way to do it analogous to the way you want is to take $Y_1=X-Y$, $Y_2=\dfrac{X+Y}{X-Y}$.


The reason Rohatgi's probability and statistics book used this technique is the independence of $X+Y$ and $\dfrac{X-Y}{X+Y}$. But that will not work here, and the calculation eventually becomes very messy.

MAN-MADE
  • I've upvoted this but a much simpler answer is available. I've posted two answers. One of them involves an integral of a function of two variables, but no Jacobians are needed because no changes of variables are done. So my challenge to everyone here: See if you can find a simpler way than that. – Michael Hardy Sep 11 '17 at 20:37
  • Nice! Here, when $Y_{2} = \frac{X+Y}{X-Y}$, what is the range of $Y_{2}$? Is it $\Bbb{R} \setminus [-1,1]$? – BAYMAX Sep 12 '17 at 01:35
  • @BAYMAX The reason Rohatgi's probability and statistics book used this technique is the independence of $X+Y$ and $\dfrac{X-Y}{X+Y}$. That will not work here, and the calculation eventually becomes very messy. So I suggest you look at the other solutions posted here. – MAN-MADE Sep 12 '17 at 01:56

$$ P(X-Y<z) = \sum_y P(X-y<z)P(Y=y) = \int_{y \in \mathbb{R}} P(X<y+z)f(y) \, dy $$ by the law of total probability (there's probably a more rigorous way to write that middle expression, but it'll still be that integral). Then this is $$ \int_{y+z>0,y>0} (1-e^{-(y+z)})e^{-y} \, dy $$ using the given distributions. This splits into $$ \begin{cases} \int_{-z}^{\infty} (e^{-y}-e^{-2y}e^{-z}) \, dy & z<0 \\ \int_{0}^{\infty} (e^{-y}-e^{-2y}e^{-z}) \, dy & z \geq 0 \end{cases} = \begin{cases} \frac{1}{2}e^{z} & z<0 \\ 1-\frac{1}{2}e^{-z} & z\geq 0 \end{cases}. $$ Differentiating then gives the density function as $e^{-\lvert z \rvert}/2$.
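The two branches of the case split can be confirmed symbolically as well; a minimal sketch, assuming sympy is available:

```python
import sympy as sp

y = sp.Symbol('y', positive=True)  # integration variable (y > 0 in both branches)

# z < 0 branch: integrate over y > -z
zn = sp.Symbol('zn', negative=True)
neg = sp.integrate(sp.exp(-y) - sp.exp(-2*y) * sp.exp(-zn), (y, -zn, sp.oo))
print(sp.simplify(neg))   # exp(zn)/2

# z >= 0 branch: integrate over y > 0
zp = sp.Symbol('zp', positive=True)
pos = sp.integrate(sp.exp(-y) - sp.exp(-2*y) * sp.exp(-zp), (y, 0, sp.oo))
print(sp.simplify(pos))   # 1 - exp(-zp)/2
```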

Chappers
  • I've up-voted this although I can see that some might object to a discrete sum for a continuous variable. I've also posted an answer with a similar approach but with greater detail. – Michael Hardy Sep 11 '17 at 19:04

I already posted an answer involving no integrals of functions of more than one variable; here's another approach.

\begin{align} \text{First assume } u >0. \text{ Then} \\ \Pr( X-Y > u) & = \int_0^\infty \left( \int_{y+u}^\infty f_{X,Y} (x,y) \, dx \right) \,dy \\[10pt] & = \int_0^\infty \left( \int_{y+u}^\infty e^{-x} e^{-y} \, dx \right) \,dy \\[10pt] & = \int_0^\infty \left( e^{-y} \int_{y+u}^\infty e^{-x} \, dx \right) \,dy \\ & \qquad\text{(This can be done because $e^{-y}$ does not change as $x$ ranges over $(y+u,\infty)$.)} \\[10pt] & = \int_0^\infty e^{-y} \cdot e^{-(y+ u)} \, dy \\[10pt] & = \frac 1 2 e^{-u}. \end{align} That works if $u>0.$ Then use the fact that $Y-X$ has the same probability distribution as $X-Y$ to conclude that if $u<0$ then $\Pr(X-Y<u) = \frac 1 2 e^{u}.$

Therefore if $u>0$ then $\Pr(X-Y\le u) = 1- \dfrac 1 2 e^{-u}$ and mutatis mutandis if $u<0,$ so we get $\displaystyle f_{X-Y}(u) = \frac 1 2 e^{-|u|}.$
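The tail probability itself is easy to check numerically; a minimal sketch, assuming scipy is available:

```python
import numpy as np
from scipy.integrate import dblquad

def tail(u):
    """Pr(X - Y > u) as the double integral over y in (0, oo), x in (y+u, oo)."""
    val, _ = dblquad(
        lambda x, y: np.exp(-x) * np.exp(-y),  # integrand; x is the inner variable
        0, np.inf,          # outer variable y runs over (0, oo)
        lambda y: y + u,    # inner variable x runs from y + u ...
        np.inf,             # ... to oo
    )
    return val

for u in (0.5, 1.0, 2.0):
    print(tail(u), 0.5 * np.exp(-u))  # the two columns should agree
```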