
Let $X$ and $Y$ be two random variables with respective PDFs $P_X(x)$ and $P_Y(y)$ given by

\begin{align}\label{Eq_Ap_2} P_{X}(x)&=M\beta e^{-\beta x}\left( 1-e^{-\beta x} \right)^{M-1}\\ &=M\beta \sum_{m=0}^{M-1}(-1)^m\binom{M-1}{m}\left( e^{-\beta x} \right)^{m+1} \end{align}

\begin{align} P_{Y}(y)&=N\beta e^{-\beta y}\left( 1-e^{-\beta y} \right)^{N-1}\\ &=N\beta \sum_{n=0}^{N-1}(-1)^n\binom{N-1}{n}\left( e^{-\beta y} \right)^{n+1}.\label{Eq_Ap_3} \end{align}

What is the PDF of $$ Z=\min\{X,Y\}\,? $$

Mokhtar

3 Answers


You did not state it, but I assume $X$ and $Y$ are independent.

First find the CDF of $Z = \min(X,Y)$, then differentiate it w.r.t. $z$ to get its PDF: $\mathbb{P}(Z \leq z) = 1- \mathbb{P}(\min(X,Y) > z) = 1 - \mathbb{P}(X > z)\mathbb{P}(Y > z) = 1 - (1 - \mathbb{P}(X \leq z))(1 - \mathbb{P}(Y \leq z)) = \mathbb{P}(Y \leq z) + \mathbb{P}(X \leq z) - \mathbb{P}(Y \leq z)\mathbb{P}(X \leq z).$

Taking the derivative w.r.t. $z$ yields: $P_Z(z) = P_Y(z) + P_X(z) - P_Y(z)\mathbb{P}(X \leq z) - \mathbb{P}(Y \leq z)P_X(z) = P_X(z)\mathbb{P}(Y \geq z) + P_Y(z)\mathbb{P}(X \geq z) = P_X(z)\int_z^{\infty}P_Y(t)\,dt + P_Y(z)\int_z^{\infty}P_X(t)\,dt.$

Substituting your PDFs and computing the integrals yields the final formula. Since the sums are finite, you can interchange integration and summation; the integrals themselves are not hard.
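For instance, a sketch of the substitution, working directly with the closed forms of the given PDFs (the series expansions give the same result after interchanging summation and integration): $$\int_z^{\infty}P_Y(t)\,dt = \int_z^{\infty} N\beta e^{-\beta t}\left(1-e^{-\beta t}\right)^{N-1}dt = 1-\left(1-e^{-\beta z}\right)^{N},$$ and similarly $\int_z^{\infty}P_X(t)\,dt = 1-\left(1-e^{-\beta z}\right)^{M}$, so $$P_Z(z) = M\beta e^{-\beta z}\left(1-e^{-\beta z}\right)^{M-1}\Bigl[1-\left(1-e^{-\beta z}\right)^{N}\Bigr] + N\beta e^{-\beta z}\left(1-e^{-\beta z}\right)^{N-1}\Bigl[1-\left(1-e^{-\beta z}\right)^{M}\Bigr].$$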

  • Hi @Mateusz Eggink, what does $\int_{z}^{\infty}P_Y(t)\,dt$ mean? Does it mean $CDF_Y(\infty)-CDF_Y(z)$? – Mokhtar Apr 28 '18 at 21:04
  • If you want to express $\int_z^{\infty}P_Y(t)dt$ using the CDF of Y then what you have written is correct. However, remember that $CDF_Y(\infty) = 1$.

    Here I specifically expressed the probability $\mathbb{P}(Y \geq z)$ using the PDF of Y because the CDF is not given in the exercise.

    – StatsyBanksy Apr 28 '18 at 21:14
  • OK, so we can say $CDF_Y(z)=1-P(Y\geq z)$. Thank you! – Mokhtar Apr 28 '18 at 21:54

Assume $X$ and $Y$ are independent. $P(Z\gt z)=P(X\gt z)P(Y\gt z)$. Thus $P(Z\lt z)=1-(1-P(X\lt z))(1-P(Y\lt z))=P(X\lt z)+P(Y\lt z)-P(X\lt z)P(Y\lt z).$

Therefore $P_Z(z)=P_X(z)+P_Y(z)-(CDF_X(z)P_Y(z)+CDF_Y(z)P_X(z))$.

Note to comment - I was subconsciously using the formula for max rather than min before.
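As a quick sanity check of this formula, here is a minimal Monte Carlo sketch (the parameter values of $M$, $N$, $\beta$ and the helper names below are illustrative, not from the question):

```python
import numpy as np

# Monte Carlo sanity check of
#   P_Z(z) = P_X(z) + P_Y(z) - (CDF_X(z) P_Y(z) + CDF_Y(z) P_X(z)).
# M, N, beta are the question's parameters; the numeric values are arbitrary.
M, N, beta = 3, 5, 2.0
rng = np.random.default_rng(0)
n_samples = 500_000

# X = max of M i.i.d. Exp(beta), Y = max of N i.i.d. Exp(beta), Z = min(X, Y).
X = rng.exponential(1.0 / beta, size=(n_samples, M)).max(axis=1)
Y = rng.exponential(1.0 / beta, size=(n_samples, N)).max(axis=1)
Z = np.minimum(X, Y)

def pdf_max(t, k):
    # density of the max of k i.i.d. Exp(beta)
    return k * beta * np.exp(-beta * t) * (1 - np.exp(-beta * t)) ** (k - 1)

def cdf_max(t, k):
    # CDF of the max of k i.i.d. Exp(beta)
    return (1 - np.exp(-beta * t)) ** k

# Compare the formula with a histogram estimate of the density of Z.
hist, edges = np.histogram(Z, bins=40, range=(0.0, 3.0), density=True)
t = 0.5 * (edges[:-1] + edges[1:])
formula = (pdf_max(t, M) + pdf_max(t, N)
           - (cdf_max(t, M) * pdf_max(t, N) + cdf_max(t, N) * pdf_max(t, M)))
print(np.max(np.abs(hist - formula)))  # small, up to sampling noise
```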


Independence between $X$ and $Y$ will be assumed; otherwise there is no way to proceed.

TL;DR: see Eq.\eqref{Eq05} for the end result and the intuitive interpretation that follows it.

Note that $X$ can be viewed as the maximum of $M$ i.i.d. exponential random variables with rate $\beta$. Denote this set as $U_i$ for $i = 1$ to $M$. All $U_i$ have the same density $f_U(t) = \beta e^{-\beta t}$.

Similarly, $Y$ has the same distribution as the maximum of another set of $N$ i.i.d. exponentials with rate $\beta$. Call these $W_k$ for $k = 1$ to $N$; their common density is the same, $f_W(t) = \beta e^{-\beta t}$.

The two sets $\{U_i\}$ and $\{W_k\}$ are independent of each other; otherwise $X$ and $Y$ could not be independent.

Denote the cumulative function as $F_X(t) \equiv \Pr\{ X < t\}$, not to be confused with the density $P_X(x)$. The same notation goes for $F_U(t)$, $F_W(t)$, and $F_Y(t)$. We are given \begin{align*} F_X(t) &= \bigl( F_U(t) \bigr)^M = \left( 1 - e^{-\beta t} \right)^M & &\text{and} & F_Y(t) &= \bigl( F_W(t) \bigr)^N = \left( 1 - e^{-\beta t} \right)^N \end{align*}

There is actually an intuitive shortcut to obtain the density of $Z = \min\{X, Y\}$, as is often the case with order statistics. Nonetheless, let's do it concretely from the cumulative function to appreciate some inner structure.


\begin{align*} F_Z(t) \equiv \Pr\{ Z < t\} &= \Pr\{ X~\text{is smaller} \} + \Pr\{ Y~\text{is smaller} \} \\ &= \color{blue}{ \Pr\{ X < t ~~\&~~ Y> X \}} + \Pr\{ Y < t ~~\&~~ X > Y \} \\ &= \color{blue}{ \int_{x = 0}^t \int_{y=x}^{\infty} P_X(x) P_Y(y)\,\mathrm{d}y \,\mathrm{d}x } + \int_{y = 0}^t \int_{x=y}^{\infty} P_X(x) P_Y(y)\,\mathrm{d}x \,\mathrm{d}y \tag{1} \label{Eq01} \end{align*} In the above, we have already invoked the fact that $X \perp Y \implies P_{XY}(x,y) = P_X(x)P_Y(y)$.

Consider the first integral in Eq.\eqref{Eq01}, which covers the region above the diagonal in the $X$-$Y$ plane. \begin{align*} \color{blue}{ \Pr\{ X < t ~~\&~~ Y> X \} } &= \int_{x = 0}^t \left[ P_X(x)\int_{y=x}^{\infty} P_Y(y)\,\mathrm{d}y \right] \,\mathrm{d}x \\ &= \int_{x = 0}^t \Bigl[ P_X(x) \bigl( 1 - F_Y(x) \bigr)\Bigr] \,\mathrm{d}x \\ &= \int_{x = 0}^t P_X(x) \cdot 1 \,\mathrm{d}x - \int_{x = 0}^t P_X(x) \cdot F_Y(x) \,\mathrm{d}x \\ &= F_X(t) - \int_{x = 0}^t M \beta e^{-\beta x} \left( 1 - e^{-\beta x} \right)^{M-1} \cdot \left( 1 - e^{-\beta x} \right)^N \,\mathrm{d}x \end{align*} Since in the end we will take the derivative with respect to $t$ to obtain the density, as in $$P_Z(t) = \frac{ \mathrm{d}F_Z(t) }{ \mathrm{d} t} = \color{blue}{ \frac{ \mathrm{d}\Pr\{ X < t ~~\&~~ Y> X \} }{ \mathrm{d} t} } + \frac{ \mathrm{d}\Pr\{ Y < t ~~\&~~ X > Y \} }{ \mathrm{d} t} \tag{2} \label{Eq02}$$ we might as well do it now for the above-diagonal piece. $$\color{blue}{ \frac{ \mathrm{d}\Pr\{ X < t ~~\&~~ Y> X \} }{ \mathrm{d} t}} = P_X(t) - \color{magenta}{\frac{ M }{ M+N } } (M+N) \beta e^{-\beta t} \left( 1 - e^{-\beta t} \right)^{M+N-1} \tag{3} \label{Eq03}$$ After the leading $P_X(t)$ we get a $\color{magenta}{\text{scaled}}$ version of the density of the maximum of $(M+N)$ i.i.d. exponentials with rate $\beta$.
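In case the last step looks too quick: differentiating the integral term with respect to its upper limit $t$ just evaluates the integrand at $x = t$ (fundamental theorem of calculus), and the constant $M$ is then split as $\tfrac{M}{M+N}(M+N)$: $$\frac{\mathrm{d}}{\mathrm{d}t}\int_{x=0}^{t} M\beta e^{-\beta x}\left(1-e^{-\beta x}\right)^{M+N-1}\mathrm{d}x = M\beta e^{-\beta t}\left(1-e^{-\beta t}\right)^{M+N-1} = \color{magenta}{\frac{M}{M+N}}\,(M+N)\beta e^{-\beta t}\left(1-e^{-\beta t}\right)^{M+N-1}.$$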

Similarly, the second integral in Eq.\eqref{Eq01}, over the below-diagonal region, gives: $$\frac{ \mathrm{d}\Pr\{ Y < t ~~\&~~ X> Y \} }{ \mathrm{d} t} = P_Y(t) - \color{magenta}{\frac{ N }{ M+N } } (M+N) \beta e^{-\beta t} \left( 1 - e^{-\beta t} \right)^{M+N-1} \tag{4} \label{Eq04}$$

Putting Eq.\eqref{Eq03} and Eq.\eqref{Eq04} together according to Eq.\eqref{Eq02}, we get the full density. $$P_Z(t) = P_X(t) + P_Y(t) - P_T(t) \quad \text{where} \quad P_T(t) = (M+N) \beta e^{-\beta t} \left( 1 - e^{-\beta t} \right)^{M+N-1} \tag{5} \label{Eq05}$$

That is, the density of $Z = \min\{X,Y\}$ is the sum of the densities of $X$ and $Y$ minus the density of $T$, where $T$ is the maximum of $(M+N)$ i.i.d. exponentials with rate $\beta$.

There's actually a heuristic argument for this "sum then minus" of densities. Note that here we have only two quantities, $X$ and $Y$, so $$X + Y = \min\{X, Y\} + \max\{X, Y\}. $$ Many nice properties come out of this fact, and there are nice examples for discrete distributions as well. In particular, consider the infinitesimal probability mass between (the dummy) $t$ and $t+ \mathrm{d}t$. With the notation $Z = \min\{X, Y\}$ and $T = \max\{X, Y\}$, we have $$\Pr\{ t < X < t+ \mathrm{d}t \} + \Pr\{ t < Y < t+ \mathrm{d}t \} = \Pr\{ t < Z < t+ \mathrm{d}t \} + \Pr\{ t < T < t+ \mathrm{d}t \} \tag{6} \label{Eq06}$$ Intuitively, the tiny probability mass is proportional to the density, as in $$\Pr\{ t < X < t+ \mathrm{d}t \} = P_X(t)\mathrm{d}t \tag*{, similarly for $Y$, $Z$, and $T$.}$$ Therefore, Eq.\eqref{Eq06} becomes an equation of densities: $$P_X(t) + P_Y(t) = P_Z(t) + P_T(t) \\ P_Z(t) = P_X(t) + P_Y(t) - P_T(t)$$ This can be considered an explanation for Eq.\eqref{Eq05} as well as an intuitive shortcut to derive it.
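A slightly more formal version of Eq.\eqref{Eq06}, in case it helps: since $\{X, Y\}$ and $\{Z, T\}$ are the same two numbers, counting how many of them fall in $(t, t+\mathrm{d}t]$ gives the same answer either way, $$\mathbf{1}\{t < X \le t+\mathrm{d}t\} + \mathbf{1}\{t < Y \le t+\mathrm{d}t\} = \mathbf{1}\{t < Z \le t+\mathrm{d}t\} + \mathbf{1}\{t < T \le t+\mathrm{d}t\},$$ and taking expectations of both sides yields Eq.\eqref{Eq06}. Note that no independence is needed for this particular identity.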