
Let the random variables $\xi_1, \xi_2, \ldots, \xi_n$ be independent and exponentially distributed, each with expectation $2$.

Are $\sum_{i=1}^{n} \xi_i / i$ and $\max(\xi_1, \ldots, \xi_n)$ identically distributed?

2 Answers


Claim: the statement is true, and it generalizes easily to any order statistic of an exponential sample with any rate $\lambda$, not only the rate $\lambda = 1/2$ corresponding to expectation $2$.

First, note that it suffices to consider $\lambda = 1$, since $\frac{1}{\lambda}$ is a scale parameter.

See Proposition 3, p. 55, of [1]: https://doi.org/10.1214/aoms/1177699058

So the claim is true and can be shown by direct computation.

Moreover, $$X_{(j)} = \sum_{i=1}^{j} \frac{\xi_i}{n-i+1}$$ in distribution, where $X_{(j)}$ denotes the $j$-th order statistic of the sample.
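As a quick numerical sanity check of this representation (not part of the original answer; the sample sizes and seed below are illustrative), one can simulate both sides with $\lambda = 1$ and compare moments:

```python
import numpy as np

rng = np.random.default_rng(0)
n, j, reps = 5, 3, 200_000

# Left side: the j-th order statistic of n i.i.d. Exp(1) samples.
samples = rng.exponential(1.0, size=(reps, n))
lhs = np.sort(samples, axis=1)[:, j - 1]

# Right side: sum_{i=1}^{j} xi_i / (n - i + 1) with fresh Exp(1) draws.
xi = rng.exponential(1.0, size=(reps, j))
weights = 1.0 / (n - np.arange(1, j + 1) + 1)  # 1/n, 1/(n-1), ..., 1/(n-j+1)
rhs = xi @ weights

# Both should agree in distribution, hence in mean and standard deviation;
# the common mean is 1/5 + 1/4 + 1/3 ≈ 0.783.
print(lhs.mean(), rhs.mean())
print(lhs.std(), rhs.std())
```

The empirical means and standard deviations of the two samples match to Monte Carlo accuracy.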

[1] Chernoff, H.; Gastwirth, J. L.; Johns, M. V., Jr., "Asymptotic distribution of linear combinations of functions of order statistics with applications to estimation," Ann. Math. Stat. 38 (1967), 52–72. ZBL0157.47701.

Botnakov N.

Yes. The argument in this blog post shows that they are equal in expectation, but it can be extended to show they are in fact equal in distribution.

Let $a(1),\dots,a(n)$ be the (random) ordering of the indices such that $\xi_{a(1)}\leq \xi_{a(2)}\leq \cdots \leq \xi_{a(n)}$. Then $$\max(\xi_1,\dots,\xi_n)=\xi_{a(n)}=\xi_{a(1)}+(\xi_{a(2)}-\xi_{a(1)})+\cdots+(\xi_{a(n)}-\xi_{a(n-1)}).$$ Since $\xi_{a(1)}$ is the minimum of $n$ independent $\text{Exp}(\lambda)$ random variables, its distribution is $\text{Exp}(n\lambda)$.

Since $\xi_{a(1)}$ is the smallest of the $\xi_i$, we have that $\xi_{a(2)}-\xi_{a(1)}$ is the minimum of $n-1$ independent $\text{Exp}(\lambda)$ random variables, namely $$\xi_{a(2)}-\xi_{a(1)}=\min_{i \neq a(1)}(\xi_{i}-\xi_{a(1)}).$$ Note that $(\xi_i-\xi_{a(1)})\sim\text{Exp}(\lambda)$ for $i \neq a(1)$ by the memoryless property of the exponential distribution. Thus $\xi_{a(2)}-\xi_{a(1)}$ has the distribution $\text{Exp}((n-1)\lambda)$, and it is independent of $\xi_{a(1)}$ by the memoryless property.

Next we have $\xi_{a(3)}-\xi_{a(2)}$ is the minimum of $n-2$ independent $\text{Exp}(\lambda)$ random variables: $$\xi_{a(3)}-\xi_{a(2)}=\min_{i \neq a(1),a(2)}(\xi_{i}-\xi_{a(2)})$$ and so has the distribution of $\text{Exp}((n-2)\lambda)$. It is independent of $\xi_{a(1)}$ and $\xi_{a(2)}-\xi_{a(1)}$ by the memoryless property.

Continuing this argument, we have that $\max(\xi_1,\dots,\xi_n)$ is the sum of $n$ independent exponential random variables with rates $n\lambda,(n-1)\lambda,\dots,\lambda$.
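The spacings argument above can also be checked numerically (an illustrative sketch with $\lambda = 1$; the parameters are my own choices): each successive gap between order statistics should behave like an exponential variable with the stated rate, independently of the earlier gaps.

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps = 4, 200_000

# Sort n i.i.d. Exp(1) samples and take successive gaps xi_(k) - xi_(k-1),
# with xi_(0) := 0 so the first gap is the minimum itself.
ordered = np.sort(rng.exponential(1.0, size=(reps, n)), axis=1)
spacings = np.diff(ordered, axis=1, prepend=0.0)

# The k-th spacing should be Exp((n - k + 1) * lambda), so its mean is
# 1/(n - k + 1): roughly [0.25, 0.33, 0.5, 1.0] here.
print(np.round(spacings.mean(axis=0), 2))

# Independence sanity check: first and last spacings should be uncorrelated.
print(np.corrcoef(spacings[:, 0], spacings[:, -1])[0, 1])
```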

On the other hand, $\frac{\xi_i}{i}$ is an exponential random variable with rate $\lambda i$. Since the $\xi_i$ are independent, $\sum_{i=1}^n \frac{\xi_i}{i}$ is the sum of $n$ independent exponential random variables with rates $\lambda, 2\lambda, \dots, n\lambda$. So we conclude that $\max(\xi_1,\dots,\xi_n)$ and $\sum_{i=1}^n\frac{\xi_i}{i}$ are equal in distribution.
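A short Monte Carlo check of the conclusion (an illustrative sketch, not part of the original answer), using mean $2$ as in the question, i.e. rate $\lambda = 1/2$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 6, 200_000
mean = 2.0  # expectation from the question, i.e. rate lambda = 1/2

# Sample max(xi_1, ..., xi_n) for i.i.d. Exp with mean 2.
xi = rng.exponential(mean, size=(reps, n))
maximum = xi.max(axis=1)

# Sample sum_{i=1}^n xi_i / i with fresh draws.
xi2 = rng.exponential(mean, size=(reps, n))
weighted_sum = (xi2 / np.arange(1, n + 1)).sum(axis=1)

# Both should have mean 2 * H_6 = 2 * (1 + 1/2 + ... + 1/6) = 4.9,
# and matching medians, since they are equal in distribution.
print(maximum.mean(), weighted_sum.mean())
print(np.median(maximum), np.median(weighted_sum))
```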

kccu
  • What is the meaning of the statement $(\xi_i-\xi_{a(1)})\sim\text{Exp}(\lambda)$ for $i \neq a(1)$? As $a(1)$ is random, what do you mean? Perhaps you meant something like a conditional distribution given $i \neq a(1)$, but in that case it is not the memoryless property and should be proved. – Botnakov N. Jan 10 '21 at 16:00
  • Conditional on $a(1)=j$ (i.e., conditional on $\xi_j = \min(\xi_1,\dots,\xi_n)$), $(\xi_i-\xi_j) \sim\text{Exp}(\lambda)$ for $i \neq j$, and these are independent of each other and of $\xi_j$. – kccu Jan 10 '21 at 16:16
  • It's not the memoryless property. Put $j=2$, $n=3$, $\xi = \xi_1$, $\eta =\xi_2$, $\zeta = \xi_3$. You say that $P(\xi \ge \eta+x, \zeta \ge \eta +y \mid \min(\xi, \zeta) \ge \eta) = P(\xi \ge x) P(\zeta \ge y)$. Why do you call this the memoryless property? Perhaps you want to derive it from something like $$P(\xi \ge a+x, \zeta \ge a +y \mid \min(\xi, \zeta) \ge a) = P(\xi \ge x) P(\zeta \ge y) \quad (1),$$ but that was not derived. Moreover, it is still not the memoryless property by definition, because it involves two random variables, not one. So (1) also needs to be derived. Or what do you mean by "memoryless property"? – Botnakov N. Jan 10 '21 at 16:36
  • $\xi$ and $\zeta$ are conditionally independent given $\min(\xi,\zeta) \geq \eta$. So we can break up the joint conditional probability $P(\xi \geq \eta+x,\zeta \geq \eta+y\mid \min(\xi,\zeta) \geq \eta)$ as the product of $P(\xi \geq \eta+x\mid \min(\xi,\zeta) \geq \eta)$ and $P(\zeta \geq \eta+y \mid \min(\xi,\zeta) \geq \eta)$.

    The first of these simplifies to $P(\xi \geq \eta+x\mid \xi \geq \eta) = \int_0^\infty P(\xi \geq a+x \mid \xi \geq a) f_\eta(a) \ da = \int_0^\infty P(\xi \geq x) f_\eta(a) \ da = P(\xi \geq x)$. Similarly with the second term.

    – kccu Jan 10 '21 at 17:18
  • Conditional independence of $\xi$ and $\zeta$ given $\min(\xi, \zeta) \ge \eta$ means that $$P(\xi > x, \zeta > y \mid \min(\xi, \zeta) \ge \eta) = P(\xi > x \mid \min(\xi, \zeta) \ge \eta)\, P(\zeta > y \mid \min(\xi, \zeta) \ge \eta).$$ It's not hard to check whether this holds, but it is not the memoryless property, at least because the memoryless property involves only one random variable. – Botnakov N. Jan 10 '21 at 17:43
  • Would you prefer I say it's an extension of the memoryless property or a consequence of it? As for the conditional independence, we can instead take the expectation over $\eta$ first:

    $$P(\xi \geq \eta+x,\zeta \geq \eta+y \mid \min(\xi,\zeta) \geq \eta) = \int_0^\infty P(\xi \geq a+x,\zeta \geq a+y \mid \min(\xi,\zeta) \geq a) f_\eta(a) \ da.$$ Then split up the probability using conditional independence.

    – kccu Jan 10 '21 at 18:11
  • Let us call it an extension. Put $\gamma = (\xi, \zeta)$. If I understood you correctly, you say that $$P((\gamma, \eta) \in A \mid (\gamma, \eta) \in B) = \int_0^{\infty} P((\gamma, a) \in A \mid (\gamma, a) \in B) f_{\eta}(a)\,da \quad (2)$$ (something like the law of total probability) and then derive the extension of the memoryless property. If so, why does (2) hold? It is not the law of total probability, since it is not $$P((\gamma, \eta) \in A \mid (\gamma, \eta) \in B) = \int_0^{\infty} P((\gamma, a) \in A, \eta \in da \mid (\gamma, \eta) \in B) f_{\eta}(a)\,da.$$ If (2) is not true, the argument does not work. – Botnakov N. Jan 10 '21 at 22:06
  • See the derivation in this answer. You can adapt it to derive formula (2). – kccu Jan 11 '21 at 01:21
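The identity debated in the thread above can at least be verified numerically. The sketch below (my own check, with $\lambda = 1$ and illustrative values of $x$ and $y$) estimates $P(\xi \ge \eta+x, \zeta \ge \eta+y \mid \min(\xi, \zeta) \ge \eta)$ by Monte Carlo and compares it with $P(\xi \ge x)\,P(\zeta \ge y) = e^{-(x+y)}$:

```python
import numpy as np

rng = np.random.default_rng(2)
reps = 1_000_000
x, y = 0.5, 1.0

# Three independent Exp(1) variables: xi, zeta, and the random threshold eta.
xi, zeta, eta = rng.exponential(1.0, size=(3, reps))

# Condition on the event {min(xi, zeta) >= eta}.
cond = np.minimum(xi, zeta) >= eta
lhs = np.mean((xi[cond] >= eta[cond] + x) & (zeta[cond] >= eta[cond] + y))
rhs = np.exp(-(x + y))  # P(xi >= x) * P(zeta >= y) for Exp(1)

print(lhs, rhs)
```

The two numbers agree to Monte Carlo accuracy, consistent with the claimed extension of the memoryless property (which the integral derivation in the comments establishes rigorously).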