2

I'm working on a problem from Klenke's book on probability theory:

I want to prove that if $X$ and $Y$ are independent exponentially distributed random variables with parameters $\theta$ and $\rho$ respectively, then $$ \mathrm{P}\left(X < Y\right) = {\theta \over \theta + \rho} $$ I'm having trouble proving this; I want to avoid using conditional probability because Klenke introduces this problem way before conditional probability. I'm guessing the densities of $X$ or $Y$ may come into play but I'm not sure how.

Any suggestions?

Felix Marin
  • 89,464
D Ford
  • 3,977
  • 1
    I think this will help https://math.stackexchange.com/questions/1332413/comparing-two-exponential-random-variables?rq=1 – Coniferous Nov 15 '17 at 20:17
  • "Parameters $\theta$ and $\rho$" is ambiguous. Sometimes the parametrization is $$ e^{-x/\theta}\left( \frac{dx}\theta\right) \text{ for } x\ge0 $$ and sometimes it is $$ e^{-\theta x} \big( \theta,dx\big) \text{ for } x\ge0. $$ Which one is intended here? – Michael Hardy Nov 16 '17 at 02:05

3 Answers

2

$\newcommand{\bbx}[1]{\,\bbox[15px,border:1px groove navy]{\displaystyle{#1}}\,} \newcommand{\braces}[1]{\left\lbrace\,{#1}\,\right\rbrace} \newcommand{\bracks}[1]{\left\lbrack\,{#1}\,\right\rbrack} \newcommand{\dd}{\mathrm{d}} \newcommand{\ds}[1]{\displaystyle{#1}} \newcommand{\expo}[1]{\,\mathrm{e}^{#1}\,} \newcommand{\ic}{\mathrm{i}} \newcommand{\mc}[1]{\mathcal{#1}} \newcommand{\mrm}[1]{\mathrm{#1}} \newcommand{\pars}[1]{\left(\,{#1}\,\right)} \newcommand{\partiald}[3][]{\frac{\partial^{#1} #2}{\partial #3^{#1}}} \newcommand{\root}[2][]{\,\sqrt[#1]{\,{#2}\,}\,} \newcommand{\totald}[3][]{\frac{\mathrm{d}^{#1} #2}{\mathrm{d} #3^{#1}}} \newcommand{\verts}[1]{\left\vert\,{#1}\,\right\vert}$ \begin{align} &\int_{0}^{\infty}\int_{0}^{\infty} \pars{\theta\expo{-\theta x}}\pars{\rho\expo{-\rho y}}\bracks{x < y} \,\dd x\,\dd y = \theta\rho\int_{0}^{\infty}\expo{-\rho y}\ \overbrace{\int_{0}^{y} \expo{-\theta x}\,\dd x}^{\ds{\expo{-\theta y} - 1 \over -\theta}}\ \,\dd y \\[5mm] = &\ \rho\int_{0}^{\infty}\bracks{\expo{-\rho y} - \expo{-\pars{\rho + \theta}y}} \dd y = 1 - {\rho \over \rho + \theta} = \bbx{\theta \over \rho + \theta} \end{align}
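The double integral above can be checked symbolically. A minimal sketch using sympy (rate parametrization $f_X(x) = \theta e^{-\theta x}$, $f_Y(y) = \rho e^{-\rho y}$, matching the answer's density):

```python
import sympy as sp

theta, rho, x, y = sp.symbols('theta rho x y', positive=True)

# joint density of independent Exp(theta) and Exp(rho), rate parametrization
joint = theta * sp.exp(-theta * x) * rho * sp.exp(-rho * y)

# P(X < Y): integrate x from 0 to y, then y from 0 to infinity
inner = sp.integrate(joint, (x, 0, y))
prob = sp.simplify(sp.integrate(inner, (y, 0, sp.oo)))

print(prob)  # theta/(rho + theta)
```

This reproduces the answer $\theta/(\rho + \theta)$ term by term: the inner integral gives $\rho e^{-\rho y}\bigl(1 - e^{-\theta y}\bigr)$, and the outer integral gives $1 - \rho/(\rho+\theta)$.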

Felix Marin
  • 89,464
1

Hint

Use the fact that for every Borel measurable set $B\subset \mathbb{R}^2$ we have: \begin{align} \mathbb{P}((X,Y)\in B) = \iint_B f_{X, Y} (x, y) \, dx\,dy \end{align} where $f_{X, Y} (x, y)$ is the joint density of $(X, Y)$. But we know $X$ and $Y$ are independent, so $f_{X, Y} (x, y)=f_X(x) f_Y(y)$.

You must choose $B$ such that: \begin{align} (X, Y) \in B \Longleftrightarrow X<Y \end{align} I'll leave the choice of $B$ and the integration that comes after that for you.

Shashi
  • 8,738
1

\begin{align} \Pr(X<Y) & = \int_0^\infty \left( \int_x^\infty \rho\theta e^{-\theta x} e^{-\rho y} \, dy \right) \,dx = \int_0^\infty \left( \theta e^{-\theta x} e^{-\rho x} \right) \, dx \\[10pt] & = \theta \int_0^\infty e^{-(\theta+\rho)x} \,dx = \frac\theta{\theta+\rho}. \end{align} This is a bit simpler than evaluating the iterated integral in the opposite order.
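The closed form is also easy to sanity-check by simulation. A minimal Monte Carlo sketch (the rates $\theta = 2$, $\rho = 3$ are an arbitrary choice for illustration; note that numpy's `exponential` takes the *scale* $1/\text{rate}$, not the rate):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, rho = 2.0, 3.0   # example rates (arbitrary choice)
n = 1_000_000

# numpy parametrizes the exponential by scale = 1 / rate
X = rng.exponential(1 / theta, n)
Y = rng.exponential(1 / rho, n)

estimate = np.mean(X < Y)
exact = theta / (theta + rho)
print(estimate, exact)  # the estimate should be close to 0.4
```

With $10^6$ samples the standard error is about $5 \times 10^{-4}$, so the empirical frequency should agree with $\theta/(\theta+\rho) = 0.4$ to roughly three decimal places.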