
Assume $X$ and $Y$ are independent exponentially distributed RVs, say $X \sim \mathrm{Exp}(\lambda)$ and $Y \sim \mathrm{Exp}(\mu)$ (both $\geq 0$). In deriving $\Pr(X > Y)$ I got stuck on the integral.

If $X > Y$, then $x$ can take any value in $[0, \infty)$ and $y$ must lie in $[0, x]$, so we integrate the joint pdf over $x$ from $0$ to $\infty$ and over $y$ from $0$ to $x$, right?

\begin{align} \Pr(X > Y) &= \int_{0}^{\infty}\int_{0}^{x} f_{X,Y}(x, y)\,dy\,dx\\ &= \int_{0}^{\infty}\int_{0}^{x}\lambda\mu e^{-\lambda x}e^{-\mu y}\,dy\,dx \end{align}

We rearrange to make it clear which integral goes with $x$ and which with $y$:

$\Pr(X > Y) = \displaystyle\int_{0}^{\infty}\lambda e^{-\lambda x}\,dx\displaystyle\int_{0}^{x}\mu e^{-\mu y}\,dy$

The integral with respect to $x$ is over $[0, \infty)$ of the exponential pdf, which is $1$ by definition, but that would leave only $\displaystyle\int_{0}^{x}\mu e^{-\mu y}\,dy$, which looks wrong. What is wrong with the reasoning here?

Another try:

\begin{align} \Pr(X > Y) &= \int_{0}^{\infty}\int_{0}^{x}\lambda\mu e^{-\lambda x}e^{-\mu y}\,dy\,dx\\ &= \lambda\mu\int_{0}^{\infty}\int_{0}^{x} e^{-\lambda x}e^{-\mu y}\,dy\,dx \end{align}

The inner part could be simplified to $e^{-\lambda x - \mu y}$, but that doesn't look simpler to integrate. I can try to separate $dx$ and $dy$ again:

\begin{align} \Pr(X > Y) &= \lambda\mu\int_{0}^{\infty}e^{-\lambda x}\,dx\int_{0}^{x} e^{-\mu y}\,dy\\ &= \lambda\mu\int_{0}^{\infty}e^{-\lambda x}\,dx\left[\frac{-e^{-\mu y}}{\mu}\right]_{0}^{x}\\ &= \lambda\mu\int_{0}^{\infty}e^{-\lambda x}\,dx\left( \frac{-e^{-\mu x}}{\mu} + \frac{e^{-\mu \cdot 0}}{\mu}\right)\\ &= \lambda\mu\int_{0}^{\infty}e^{-\lambda x}\,dx\left( \frac{1 - e^{-\mu x}}{\mu}\right) \end{align}

This is where I'm stuck. I expected the CDF of an exponential to appear somewhere to make the integration simpler. What went wrong in the reasoning/steps?

Update: I would like to understand what is wrong with my reasoning, which the post suggested as a duplicate does not address. Second, that post doesn't give a formal, detailed derivation.
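For reference, here is a quick simulation sketch I can compare any closed form against (the rates $\lambda = 2$, $\mu = 3$ below are arbitrary example values):

    import random

    lam, mu = 2.0, 3.0   # arbitrary example rates for X and Y
    n = 1_000_000

    # Draw independent exponentials and count how often X > Y.
    hits = sum(random.expovariate(lam) > random.expovariate(mu) for _ in range(n))
    print(hits / n)      # Monte Carlo estimate of Pr(X > Y)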

jll

2 Answers


You can try a slightly easier approach with conditional probability (actually, the law of total probability). Let $X \sim \mathcal{E}xp(\lambda)$ and $Y \sim \mathcal{E}xp(\mu)$, thus \begin{align} P(X>Y) &= \int_{0}^{\infty} P(X>y|Y=y)f_Y(y)dy\\ &=\int_{0}^{\infty} e^{-\lambda y}\mu e^{-\mu y}dy\\ &= \mu\int_{0}^{\infty} e^{-y (\lambda + \mu)}dy\\ &=\frac{\mu}{\lambda + \mu}. \end{align} This is also called an exponential race, i.e., we computed the probability that $Y$ came before $X$, which equals $Y$'s rate as a proportion of the total rate of the competing exponentials.
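For completeness, here is the evaluation of that last integral written out step by step (a quick sketch of the antiderivative):

\begin{align} \mu\int_{0}^{\infty} e^{-y(\lambda+\mu)}\,dy &= \mu\left[\frac{-e^{-y(\lambda+\mu)}}{\lambda+\mu}\right]_{0}^{\infty}\\ &= \mu\left(0 - \frac{-1}{\lambda+\mu}\right)\\ &= \frac{\mu}{\lambda+\mu}. \end{align}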

V. Vancak
  • How do you get the first term in the product, $P(X > y \mid Y = y)$? I would assume this is $1 - P(X \leq y \mid Y = y)$, which is $1 - (1 - e^{-\lambda y}) = e^{-\lambda y}$; is that right? Also, how did you actually do the integral $\int e^{-\lambda y}\mu e^{-\mu y}\,dy$? – jll Jul 14 '17 at 22:30
  • I've added one more step for the integration. Regarding the first term, it is just the law of total probability, $P(A) = \int P(A \mid X = x)\,dF_X(x)$. – V. Vancak Jul 14 '17 at 22:40
  • Still unclear about the last step. Isn't $\int e^{-y(\lambda + \mu)}\,dy = -\frac{e^{-y(\lambda+\mu)}}{\lambda + \mu}$? Not sure how you got $1$ in the numerator. – jll Jul 14 '17 at 22:45
  • $\lim_{y \to \infty} e^{-y(\lambda+\mu)} - e^{-0} = 0 - 1 = -1.$ – V. Vancak Jul 14 '17 at 22:50

Your problem lies in the step,

$\Pr(X > Y) = \lambda\mu\displaystyle\int_{0}^{\infty}e^{-\lambda x}\,dx\displaystyle\int_{0}^{x} e^{-\mu y}\,dy$

You must integrate with respect to $y$ inside the integral over $x$, because the inner upper limit depends on $x$; the two integrals cannot be pulled apart into a product:

\begin{align} \Pr(X > Y) &= \lambda\mu\int_{0}^{\infty}e^{-\lambda x}\left(\int_{0}^{x} e^{-\mu y}\,dy\right)dx\\ &= \lambda\mu\int_{0}^{\infty}e^{-\lambda x}\left( \frac{-e^{-\mu x}}{\mu} + \frac{1}{\mu}\right)dx \end{align}
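For completeness, carrying this through (a quick sketch of the remaining steps):

\begin{align} \Pr(X > Y) &= \lambda\int_{0}^{\infty}\left(e^{-\lambda x} - e^{-(\lambda+\mu) x}\right)dx\\ &= \lambda\left(\frac{1}{\lambda} - \frac{1}{\lambda+\mu}\right)\\ &= \frac{\mu}{\lambda + \mu}, \end{align}

which matches the exponential-race answer above.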