
Let $X_1, \ldots, X_K$ be $K$ i.i.d. exponential r.v.s with parameter $\lambda$. The distributions of $S=\sum_{k=1}^K X_k$ and $M=\max_k\{X_k\}$ are well known. We are interested in the distribution of $M$ given $S$, i.e., in deriving an expression for the function $$ F_{M\mid S}(m\mid s)=\Pr\{M\leq m\mid S=s\}. $$ Clearly, a Laplace transform approach is an option; any idea about how to proceed, or how to approximate $F_{M\mid S}$?
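Short of a closed form, one brute-force way to approximate $F_{M\mid S}$ numerically is to condition on a thin window around $s$. A minimal Monte Carlo sketch (function and parameter names are mine, not from any library):

```python
import numpy as np

rng = np.random.default_rng(0)

def cond_cdf_mc(m, s, K, lam=1.0, eps=0.01, n_sim=2_000_000):
    """Crude estimate of P(M <= m | S = s): draw K i.i.d. Exp(lam)
    vectors and keep only those whose sum falls in [s - eps, s + eps]."""
    x = rng.exponential(scale=1.0 / lam, size=(n_sim, K))
    keep = np.abs(x.sum(axis=1) - s) < eps
    return np.mean(x[keep].max(axis=1) <= m)

# e.g. for K = 2, s = 1 the conditional law of M turns out to be
# uniform on [1/2, 1], so cond_cdf_mc(0.75, 1.0, 2) should be near 0.5
```

The window width `eps` trades bias against the number of retained samples; this is only a baseline to validate any analytical result against.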

haran76
  • "Exponential distribution with parameter $\lambda$" sometimes means $$ e^{-x/\lambda}\,(dx/\lambda) \text{ for } x\ge0 $$ and sometimes means $$ e^{-\lambda x}\,\lambda\,dx \text{ for } x\ge0. $$ Which one is intended here? – Michael Hardy Jul 30 '17 at 00:36
  • Sorry for being unclear: the second definition is the one I am considering, so that the expected value is $\lambda^{-1}$. Thanks for pointing this out. – haran76 Jul 30 '17 at 00:38
  • I remember answering this question here but can't find it. Maybe it was deleted? No, I can't find it even in deleted posts. Strange. – zhoraster Aug 04 '17 at 09:57
  • Oh found it. See here https://math.stackexchange.com/questions/1968532/probability-of-longest-head-run-when-p-rightarrow-1/1975205, formula for $f_k$. Though it's a bit different from the one written by @haran76 ($k$ is $k+1$, but this is not the only difference). – zhoraster Aug 04 '17 at 10:07
  • Ah, but it is the same as @haran76's, just the summation index is different too. – zhoraster Aug 04 '17 at 10:17

2 Answers


HINT

For i.i.d. exponentials, regardless of $\lambda,$ the joint density of $X_1,\ldots, X_n$ given the sum $S$ is $$ f_{X_1,\ldots,X_n\mid S}(x_1,\ldots,x_n\mid s) = \frac{(n-1)!}{s^{n-1}}\,\delta_{\sum_i x_i,\,s},$$ i.e. the point $(X_1,\ldots,X_n)$ is conditionally uniformly distributed on the surface $\sum_i x_i = s.$

From this, you can compute$$P(M\le m\mid S=s) = P(X_1\le m,\ldots, X_n\le m\mid S=s) $$ as the fraction of that surface's area contained within the cube of side length $m$.
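The hint can be checked numerically: a uniform point on the simplex $\{x_i\ge 0,\ \sum_i x_i=s\}$ is exactly $s$ times a Dirichlet$(1,\ldots,1)$ draw, so the conditional probability can be estimated without any rejection step (a sketch under that assumption; the names are mine):

```python
import numpy as np

rng = np.random.default_rng(1)

def cond_cdf_simplex(m, s, n, n_sim=200_000):
    """Estimate P(M <= m | S = s) by sampling uniformly on the
    simplex {x_i >= 0, sum_i x_i = s} via a scaled Dirichlet(1,...,1)."""
    x = s * rng.dirichlet(np.ones(n), size=n_sim)
    return np.mean(x.max(axis=1) <= m)
```

Every sample lands exactly on the surface $\sum_i x_i = s$, so this converges much faster than conditioning on a window around $s$.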

  • Thanks for the help, but I still have some concerns. $f_{X_i|S}$ is uniform on the surface $\sum_i x_i=s$, so shouldn't the first factor $(n-1)!/s^{n-1}$ be the reciprocal of the volume of the $(n-1)$-dimensional simplex? Finally, any suggestion on how to find the area of the surface contained within the cube of side length $m$ when $n>3$? – haran76 Jul 30 '17 at 18:02
  • Sorry, ignore the first part of the comment. – haran76 Jul 30 '17 at 18:54
  • @haran76 Friend of yours? https://math.stackexchange.com/questions/2378315/intersection-hypercube-with-simplex In any event, there's no answer yet. – spaceisdarkgreen Aug 02 '17 at 00:25
  • I'll confess I don't know of a way to make it an easy problem from here... the result is a piecewise polynomial with different forms for $m>s/2$, $s/2>m>s/3,$ $s/3>m>s/4$, etc. (Think about how the intersection changes as the cube gets bigger. There's an inclusion-exclusion type thing going on.) There is a closed form. The $n=3$ case has all the essential complication, I think, although the difficulty of visualizing $n=4$ might make generalizing hard. – spaceisdarkgreen Aug 02 '17 at 00:35
  • I'll be nice and just give the answer $$P(M\le m\mid S=s) = \sum_{k=1}^n {n\choose k} (-1)^{n-k} (km/s-1)^{n-1}I(km\ge s)$$ (starting the sum at $k=1$ also covers the trivial range $m\ge s$, where the sum equals $1$; for $m<s$ the $k=1$ term drops out) – spaceisdarkgreen Aug 03 '17 at 02:53
  • Thanks, I abandoned the route that you suggested and left it to a colleague. I've taken a probabilistic approach based on Bayes' theorem: basically, I am characterizing the conditional probability of the sum of the lowest $K-1$ order statistics given the maximum. However, what I obtained is much more complicated than what you get here. Since I'd like to understand, and it seems that it doesn't come from the geometric approach you suggested, would you be so kind as to give some reference? – haran76 Aug 03 '17 at 04:58
  • Regarding the problems with the previous approach: the difficulties come with $n=3$ if $n$ is the dimension of the simplex, i.e., with $n=4$ if $n$ is the number of variables. We were thinking of devising an algorithm to get the intersection points, but if the solution is so simple, so much the better! – haran76 Aug 03 '17 at 04:59
  • I don't know of a reference. This is the first time I've seen this problem. The answer I gave comes from the geometric approach I described and will (hopefully) clarify: think about it for $n=3$ (i.e. three exponentials). Look at the surface $x_1+x_2+x_3 = s$ head on. The cross-section is a growing equilateral triangle for $m<s/2$; then at $m=s/2$ the triangle touches the edges (looks like a triforce from Zelda :) ) and the three equilateral-triangle corners that begin to poke out as $m$ increases beyond $s/2$ stop contributing. So you get $(3m/s-1)^2$ for $m<s/2$ and $(3m/s-1)^2-3(2m/s-1)^2$ for $m>s/2.$ – spaceisdarkgreen Aug 03 '17 at 05:45
  • Hopefully it's apparent from there how to generalize to higher $n$ and arrive at the formula. If you aren't already doing it, set $s=1$... saves trouble. If you find an algebraic approach that works (perhaps some good choice of coordinates), or the order statistics work out, I'd be interested. – spaceisdarkgreen Aug 03 '17 at 05:52
  • Hi, I think I have the algebraic approach, and after simplifications I have a solution very close to yours. I say very close because I have the p.d.f. while you have the c.d.f.; I think it's a little bit heavy to write it down here, I'll latex it and send it to you somehow if you are interested. – haran76 Aug 03 '17 at 18:42
  • sure, if you don't mind, send it to my name at gmail or give a link to it or whatever – spaceisdarkgreen Aug 03 '17 at 22:21

Let me try to put it here (I slightly change the notation and set $\lambda=1$). Let $X_1, \ldots, X_K$ now denote the order statistics of the i.i.d. exponential random variables, i.e., $X_1<X_2<\ldots<X_K$, and let us find the distribution of $T\mid X_K$, where $$ T=\sum_{i=1}^{K-1}X_i\,. $$

Given $X_K=m$, the remaining $K-1$ variables are i.i.d. exponentials truncated to $[0,m]$, so the Laplace transform of $f_{T\mid X_K}$ is $$ F^*_{T\mid X_K}(s\mid m)= \frac{1}{(1-e^{-m})^{K-1}} \left(\int_0^m e^{-sx} e^{-x}\,dx\right)^{K-1} =\frac{1}{(1-e^{-m})^{K-1}}\left(\frac{1-e^{-m(1+s)}}{1+s}\right)^{K-1}, $$ whose inversion leads to $$ f_{T\mid X_K}(t\mid m)=\frac{1}{(1-e^{-m})^{K-1}}\frac{e^{-t}}{(K-2)!} \sum_{i=1}^K(-1)^{i+1}\binom{K-1}{i-1}(t-(i-1)m)^{K-2}\, \delta_{t\geq (i-1)m}\,, $$ where $\delta_P$ is $1$ if $P$ is true and $0$ otherwise.

We are now nearly done, because the distribution of $X_K$ is $$ f_{X_K}(m)=K (1-e^{-m})^{K-1}e^{-m} $$ and the distribution of $S=\sum_{i=1}^K X_i$ is a gamma distribution: $$ f_S(x)=\frac{e^{-x}x^{K-1}}{(K-1)!}\,. $$

We are thus in a position to apply Bayes' theorem and obtain the desired result: $$ f_{X_K\mid S}(m\mid x)=\frac{f_{T\mid X_K}(x-m\mid m)\,f_{X_K}(m)}{f_S(x)} =\frac{K(K-1)}{x^{K-1}} \sum_{i=1}^K(-1)^{i+1}\binom{K-1}{i-1}(x-i m)^{K-2}\,\delta_{x \geq im}\,. $$

As a sanity check, take $K=2$: we obtain $$ f_{X_2\mid S}(m\mid x)= \begin{cases} 0 &\text{ if } x<m \\ 2/x &\text{ if } m \leq x <2m\\ 0 &\text{ if } x \geq 2m \end{cases} $$ whose c.d.f. is $$ F_{X_2\mid S}(m\mid x)= \begin{cases} 0 & \text{ if } m<x/2\\ \dfrac{2 m -x}{x} & \text{ if } x/2 \leq m <x\\ 1 & \text{ if } m\geq x\,, \end{cases} $$ which corresponds to the answer one obtains from the geometric approach.
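As a further numerical sanity check (my own, not part of the derivation), the density derived above should integrate to $1$ over its support $m\in[x/K,\,x]$, and for $K=2$ it should reduce to $2/x$ on $[x/2,x)$:

```python
import numpy as np
from math import comb

def f_max_given_sum(m, x, K):
    """f_{X_K|S}(m|x) as derived above, with lambda = 1 (the rate
    cancels in the conditioning, so the result is lambda-free)."""
    tot = sum((-1) ** (i + 1) * comb(K - 1, i - 1) * (x - i * m) ** (K - 2)
              for i in range(1, K + 1) if x >= i * m)
    return K * (K - 1) / x ** (K - 1) * tot

# trapezoid rule over the support [x/K, x]; the mass should be 1
x, K = 1.0, 5
grid = np.linspace(x / K, x, 100_001)
vals = np.array([f_max_given_sum(m, x, K) for m in grid])
total = np.sum(0.5 * (vals[1:] + vals[:-1]) * np.diff(grid))
```

Both checks pass, and the density can also be compared pointwise against a histogram of the maximum from simplex-uniform samples.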

haran76