7

Let $X$ and $Y$ be independent exponentially distributed random variables with parameter $1$, and let $U=\min\{X,Y\}$ and $V=\max\{X,Y\}$. Show that $V-U$ is independent of $U$.

We have shown that $U$ is distributed exponentially with parameter $2$.

I am surprised to find that I don't actually know how to do this. The only approach I know is to show that $\mathbb{P}(U<x,\,V-U<y)=\mathbb{P}(U<x)\,\mathbb{P}(V-U<y)$, and I don't think I know how to compute the left-hand side.

Can we do $$\int^{\infty}_0f_V(v)\,\mathbb{P}(x>U>v-y)\,\mathrm dv=\int^{\infty}_0\left(\frac{\mathrm d}{\mathrm dv}\Bigl(F_X(v)F_Y(v)\Bigr)\right)\left(\int^x_{v-y}2e^{-2u}\,\mathrm du\right)\mathrm dv\,?$$

Here $F_V(v)=F_X(v)F_Y(v)$, where $F_X$ and $F_Y$ are the distribution functions of $X$ and $Y$ respectively, so $f_V(v)=\frac{\mathrm d}{\mathrm dv}\bigl(F_X(v)F_Y(v)\bigr)$, and $\int^x_{v-y}2e^{-2u}\,\mathrm du=\mathbb{P}(v-y<U<x)$.

I think I've seen this approach before, but I really don't think it is what I'm meant to do. Is this correct in general, and is there a better way in this specific case?
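For what it's worth, here is a quick Monte Carlo sanity check of the factorisation I'm trying to prove (not a proof; the test points $x=0.3$, $y=0.5$ and the sample size are arbitrary choices):

```python
import math
import random

random.seed(0)
n = 200_000
x_thr, y_thr = 0.3, 0.5  # arbitrary test points for the events {U < x}, {V - U < y}

joint = cnt_u = cnt_w = 0
for _ in range(n):
    X = random.expovariate(1.0)
    Y = random.expovariate(1.0)
    U, V = min(X, Y), max(X, Y)
    a = U < x_thr
    b = (V - U) < y_thr
    joint += a and b
    cnt_u += a
    cnt_w += b

# P(U < x, V - U < y) should match P(U < x) * P(V - U < y)
print(joint / n, (cnt_u / n) * (cnt_w / n))
```

The two printed numbers agree to within Monte Carlo error, which is at least consistent with independence.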

Any guidance would help me out a lot, thanks!

BCLC
  • 13,459
Aka_aka_aka_ak
  • 1,631
  • 10
  • 22

3 Answers

5

Well, if $X,Y$ are independent as well as identically exponentially distributed, then we have the following, by the Law of Total Probability:

$$\def\P{\operatorname{\mathsf P}}\begin{align}\P(U<u, V-U<w) ~=~&{ \P(X<Y,X<u,Y-X<w)+\P(Y\leqslant X,Y<u,X-Y<w)}\\[1ex] =~& \P(X<Y<w+X, X<u)+\P(Y\leqslant X<w+Y, Y< u) \\[1ex] =~& \int_0^u e^{-x}\P(x<Y<w+x\mid X=x)\operatorname d x+\int_0^u e^{-y}\P(y\leqslant X<w+y\mid Y=y)\operatorname d y\\[3ex] \overset{\text{iid}}=~& 2\int_0^u e^{-x}(e^{-x}-e^{-(w+x)})\operatorname d x\\[1ex] =~& 2(1-e^{-w})\int_0^u e^{-2x}\operatorname d x \\[1ex] =~& (1-e^{-w})(1-e^{-2u})\end{align}$$

Do similarly for $\P(U<u)$ and $\P(V-U<w)$.
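The integral in the last few steps can be checked symbolically; here is a sketch using sympy (my choice of tool, not part of the original derivation):

```python
import sympy as sp

x, u, w = sp.symbols('x u w', positive=True)

# integrand from the derivation above: 2 e^{-x} (e^{-x} - e^{-(w+x)})
integrand = 2 * sp.exp(-x) * (sp.exp(-x) - sp.exp(-(w + x)))
joint = sp.integrate(integrand, (x, 0, u))

# claimed closed form: (1 - e^{-w})(1 - e^{-2u})
claimed = (1 - sp.exp(-w)) * (1 - sp.exp(-2 * u))
print(sp.simplify(joint - claimed))  # 0
```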

Graham Kemp
  • 129,094
4

This follows from properties of the Poisson process. Let two independent Poisson processes, both with rate $1$, start at $0$. Then $X$ is the waiting time before the first event in process 1, and $Y$ is the waiting time before the first event in process 2. Viewed together, the two processes merge into a single Poisson process with rate $2$.

Then $U=\min(X,Y)$ is the waiting time before the first event of the merged process, so $U\sim\text{Exp}(2)$. Once that first event occurs, $V-U$ is the additional waiting time until the other process has its own first event; by the memoryless property this extra wait is $\text{Exp}(1)$ and independent of everything observed up to time $U$. This gives the result without any calculations.

1

This is related to a certain characterization of the Exponential distribution: for two independent, absolutely continuous random variables $X$ and $Y$, $\min(X,Y)$ is independent of $X-Y$ iff $X$ and $Y$ are Exponential random variables with the same location parameter. This characterization has been treated in the literature; a similar one exists for the Geometric distribution.

The joint density of $(X,Y)$ is given by $f_{X,Y}(x,y)=e^{-(x+y)}\mathbf1_{x>0,y>0}$.

We transform $(X,Y)\to(X_1,X_2)$ where $X_1=\min(X,Y)$ and $X_2=X-Y$.

For each of the cases $x<y$ and $x\geqslant y$, the absolute value of the Jacobian of the transformation turns out to be $1$. From this we obtain the joint density of $(X_1,X_2)$, namely

$$f_{X_1,X_2}(x_1,x_2)=\begin{cases}\frac{1}{2}e^{-x_2}\cdot2e^{-2x_1}&,\text{ if }x_2\geqslant0,\,x_1\geqslant0\\\frac{1}{2}e^{x_2}\cdot2e^{-2x_1}&,\text{ if }x_2<0,\,x_1\geqslant0\\0&,\text{ otherwise } \end{cases}$$

$$=2e^{-2x_1}\cdot\frac{1}{2}e^{-|x_2|}\mathbf1_{x_1\geqslant0\,,\, x_2\in\mathbb{R}}=f_{X_1}(x_1)f_{X_2}(x_2)$$

This shows the independence of $X_1\sim\text{Exp}$ with mean $1/2$ (i.e. rate $2$) and $X_2\sim\text{Laplace}(0,1)$.

Now we transform $(X_1,X_2)\to(U,V)$ where $U=X_1$ and $V=|X_2|$.

(Note that my notations are different from those in the question)

This is a $2$-to-$1$ mapping, since each $v>0$ has the two preimages $x_2=\pm v$. In either case the absolute value of the Jacobian equals $1$. Hence we obtain the joint pdf of $(U,V)$ as

$$f_{U,V}(u,v)=f_{X_1,X_2}(u,v)\cdot1+f_{X_1,X_2}(u,-v)\cdot1$$

$$=2e^{-2u}\mathbf1_{u\geqslant0}\cdot e^{-v}\mathbf1_{v\geqslant0}=f_U(u)f_V(v)$$

This proves the independence of $U=\min(X,Y)$ and $V=|X-Y|=\max(X,Y)-\min(X,Y)$.


The easiest way to show this analytically, I think, is to use order statistics to find the joint density of $X_{(1)}=\min(X,Y)$ and $X_{(2)}=\max(X,Y)$, then use the change of variables $(X_{(1)},X_{(2)})\to(U=X_{(1)},\,V=X_{(2)}-X_{(1)})$. We arrive at the result in no time.
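That change of variables can be verified symbolically; a sketch in sympy (my tooling choice, not part of the original argument):

```python
import sympy as sp

u, v, s, t = sp.symbols('u v s t', positive=True)

# order-statistics density of (X_(1), X_(2)) for two iid Exp(1): 2 e^{-s-t} on 0 < s < t
f_order = 2 * sp.exp(-s - t)

# change of variables U = X_(1), V = X_(2) - X_(1), i.e. s = u, t = u + v; Jacobian is 1
f_uv = f_order.subs({s: u, t: u + v})

# the density factors into Exp(2) x Exp(1) marginals: 2 e^{-2u} * e^{-v}
print(sp.simplify(f_uv - 2 * sp.exp(-2 * u) * sp.exp(-v)))  # 0
```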

StubbornAtom
  • 17,052