
This is exercise 2.2.1 from Achim Klenke: »Probability Theory — A Comprehensive Course«

Let $X$ and $Y$ be independent random variables with $X \sim \exp_\theta$ and $Y \sim \exp_\rho$ for certain $\theta,\rho > 0$. Show that $$\mathbf{P}[X < Y] = \frac{\theta}{\theta +\rho}\, .$$

Now, in practice, this exercise is easy. The $\exp_\theta$ distribution is defined by $$ \mathbf{P}[X \leq x] = \int_0^x \theta e^{-\theta t} \, dt \quad \text{ for } x \geq 0\, .$$

We just have to evaluate the integral: $$\int_0^\infty \mathbf{P}[X \leq x] \cdot \rho e^{-\rho x} \, d x = \int_0^\infty \Bigl(\int_0^x \theta e^{- \theta t} \, d t \Bigr) \cdot \rho e^{-\rho x} \, d x\, ,$$

which gives $\frac{\theta}{\theta +\rho}$.
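As a sanity check on this heuristic computation, the iterated integral above can be evaluated symbolically (a sketch using sympy; the variable names are mine):

```python
import sympy as sp

t, x, theta, rho = sp.symbols('t x theta rho', positive=True)

# inner integral: P[X <= x] = 1 - exp(-theta*x)
inner = sp.integrate(theta * sp.exp(-theta * t), (t, 0, x))

# outer integral against the density rho*exp(-rho*x) of Y
result = sp.integrate(inner * rho * sp.exp(-rho * x), (x, 0, sp.oo))

print(sp.simplify(result))  # theta/(theta + rho)
```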

But how does one do it rigorously?

Why is the following possible: $$\mathbf{P}[X < Y] = \int_0^\infty \mathbf{P}[X \leq x]\cdot \mathbf{P}[Y = x] \, d x\,,$$ using $\mathbf{P}[Y = x] = \rho e^{-\rho x}$?

Convolution of real valued random variables hasn't been defined yet.

Novice
Ystar
  • You are not doing any convolutions. Because $X$ and $Y$ are independent, the conditional probability that $X \leq y$ given that $Y = y$ is the same as the unconditional probability $P\{X \leq y\}$. Thus, your calculation is a use of the law of total probability: $$P\{X < Y\} = \int_0^\infty P\{X \leq y \mid Y = y\}f_Y(y)\,\mathrm dy = \int_0^\infty P\{X \leq y\}f_Y(y)\,\mathrm dy$$ – Dilip Sarwate Jan 18 '15 at 04:56
  • @DilipSarwate the problem is that neither the law of total probability nor conditional probability has been introduced at this point in the book, so the integral is completely unjustified there – Masacroso May 21 '19 at 08:16
  • https://math.stackexchange.com/q/1332413/321264 – StubbornAtom Feb 16 '20 at 14:30

2 Answers


The closest answer to this exercise that I have found, and that fits the context, assumes that at the time of the exercise we already know Lebesgue integration theory.

Then, as stated prior to the exercise in the book, we already know that the joint distribution of two independent random variables $X$ and $Y$ with densities $f_X$ and $f_Y$ is given by

$$\begin{align}\Pr[X\le x, Y\le y]&=F_{XY}(x,y)\\&=\int_0^x\int_0^y f_X(t)f_Y(s)\, ds\, dt\\&=\int_{[0,x]\times[0,y]}f_X(t)f_Y(s)\, d(s,t)\end{align}\tag1$$

Thus because $\Pr$ is a measure from $(1)$ we have that

$$\Pr[(X,Y)\in A]=\int_A f_X(t)f_Y(s)\, d(t,s),\quad A\in\mathcal L([0,\infty)^2)\tag2$$

Hence choosing $A:=\{(x,y)\in [0,\infty)^2: x<y\}$ we find that

$$\begin{align}\Pr[(X,Y)\in A]&=\int_0^\infty\int_0^t f_X(s)f_Y(t)\,ds\, dt\\ &=\int_0^\infty F_X(t) f_Y(t)\, dt\\&=\int_0^\infty(1-e^{-\theta t})\rho e^{-\rho t}dt\\ &=1-\rho\int_0^\infty e^{-(\theta+\rho)t}dt\\&=1-\frac{\rho}{\rho+\theta}=\frac{\theta}{\theta+\rho}\end{align}\tag3$$
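The value obtained in $(3)$ can also be checked numerically by simulation (a minimal Monte Carlo sketch; the rate values $\theta = 2$, $\rho = 3$ are hypothetical choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, rho = 2.0, 3.0                 # hypothetical rates
n = 1_000_000

# numpy parametrizes the exponential by scale = 1/rate
x = rng.exponential(scale=1 / theta, size=n)   # X ~ exp_theta
y = rng.exponential(scale=1 / rho, size=n)     # Y ~ exp_rho

estimate = (x < y).mean()
exact = theta / (theta + rho)          # = 0.4
print(estimate, exact)
```

With a million samples the empirical frequency of $\{X < Y\}$ agrees with $\theta/(\theta+\rho)$ to about three decimal places.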

However, I have not found an answer that fits well with the theory of the book at the point where this exercise appears. Probably some variation of this reasoning could be justified in that context.

Masacroso

But how does one do it rigorously?

Use Corollary 2.22 (from the 3rd edition of the book). That is, argue for the existence of the joint density $f_{X, Y}(x, y)$. Then argue that \begin{align*} \mathbf P[X < Y] &= \mathbf P\left[\left\{ \omega \in \varOmega \colon X(\omega) < Y(\omega) \right\}\right]\\ &= \int_{\{(x, y) \,:\, x < y\}} f_{X, Y}(x, y)\, d(x, y)\, . \end{align*}

Set up the double integral as described in Corollary 2.22, appeal to Fubini--Tonelli to put the integral in the appropriate order, and out comes the answer.
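The Fubini-Tonelli step can be illustrated symbolically: both orders of iterated integration over the region $\{x < y\}$ give the same value (a sketch in sympy; the setup is my own, not the book's notation):

```python
import sympy as sp

x, y, theta, rho = sp.symbols('x y theta rho', positive=True)

# joint density of (X, Y), which factors by independence
f = theta * sp.exp(-theta * x) * rho * sp.exp(-rho * y)

# order 1: integrate y over (x, oo) first, then x over (0, oo)
p1 = sp.integrate(sp.integrate(f, (y, x, sp.oo)), (x, 0, sp.oo))

# order 2: integrate x over (0, y) first, then y over (0, oo)
p2 = sp.integrate(sp.integrate(f, (x, 0, y)), (y, 0, sp.oo))

print(sp.simplify(p1), sp.simplify(p2))  # both equal theta/(theta + rho)
```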

Novice