
Let $X_i \sim \operatorname{Exp}(1)$, $i=1,\dots,n$, be $n$ independent random variables with the standard exponential distribution. Let $X_{(n)}$ be the random variable defined by $X_{(n)}=\max(X_1,\dots,X_n)$.

It follows easily that the distribution function of $X_{(n)}$ is $F_{X_{(n)}}(x)=(1-e^{-x})^n$ for $x > 0$, and the density is $f_{X_{(n)}}(x)=n(1-e^{-x})^{n-1}e^{-x}$.

I tried to calculate the expected value of $X_{(n)}$ by integrating the density but got stuck, and ended up calculating the expected value by using the fact that $E(X_{(n)})=\int_0^\infty (1-F_{X_{(n)}}(x))\,dx$ (using that $X_{(n)}$ is non-negative).

I was wondering whether this idea can be used in some way to calculate $E(X^2_{(n)})$, or whether there is another way to calculate this moment, so that I can find an expression for the variance.

3 Answers


I claim that $$\operatorname{E}[X_{(n)}] = H_n^{(1)}, \quad \operatorname{Var}[X_{(n)}] = H_n^{(2)},$$ where $$H_n^{(m)} = \sum_{k=1}^n \frac{1}{k^m}$$ is the generalized harmonic number of order $m$. When $m = 1$, we may choose to omit the order and write $H_n$.

As already established, $$F_{X_{(n)}}(x) = (1-e^{-x})^n, \quad x > 0.$$ We recall that for a nonnegative random variable $X$, $$\operatorname{E}[X] = \int_{x=0}^\infty (1 - F_X(x)) \, dx, \quad \operatorname{E}[X^2] = \int_{x=0}^\infty 2x (1 - F_X(x)) \, dx.$$ Consequently:

$$\begin{align*} \operatorname{E}[X_{(n)}] &= \int_{x=0}^\infty 1 - (1-e^{-x})^n \, dx \qquad [x = -\log(1-u), \; dx = (1-u)^{-1} \, du] \\ &= \int_{u=0}^1 \frac{1-u^n}{1-u} \, du \\ &= \int_{u=0}^1 \sum_{k=0}^{n-1} u^k \, du \\ &= \sum_{k=0}^{n-1} \left[\frac{u^{k+1}}{k+1}\right]_{u=0}^1 \\ &= H_n^{(1)}. \end{align*}$$ For the second moment, the same substitution yields $$\begin{align*} \operatorname{E}[X_{(n)}^2] &= 2 \int_{u=0}^1 \sum_{k=0}^{n-1} u^k (-\log(1-u)) \, du \\ &= 2 \int_{u=0}^1 \sum_{k=0}^{n-1} \sum_{j=1}^\infty \frac{u^{k+j}}{j} \, du \\ &= \sum_{k=0}^{n-1} \sum_{j=1}^\infty 2 \left[\frac{u^{k+j+1}}{j(k+j+1)} \right]_{u=0}^1 \\ &= \sum_{k=1}^n \frac{2}{k} \sum_{j=1}^\infty \left(\frac{1}{j} - \frac{1}{j+k}\right) \\ &= \sum_{k=1}^n \frac{2}{k} H_k. \end{align*}$$ Hence $$\begin{align*} \operatorname{Var}[X_{(n)}] &= \sum_{k=1}^n \frac{2H_k}{k} - \left(\sum_{k=1}^n \frac{1}{k}\right)^2 \\ &= \sum_{k=1}^n \sum_{j=1}^k \frac{2}{jk} - \sum_{k=1}^n \sum_{j=1}^n \frac{1}{jk} \\ &= \left(\sum_{k=1}^n \frac{2}{k^2} + \sum_{k=1}^n \sum_{j=1}^{k-1} \frac{2}{jk} \right) - \left( \sum_{k=1}^n \frac{1}{k^2} + \sum_{k=1}^n \sum_{j=1}^{k-1} \frac{2}{jk} \right) \\ &= \sum_{k=1}^n \frac{1}{k^2} \\ &= H_n^{(2)}, \end{align*}$$ as claimed.
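As a quick sanity check of these closed forms, here is a short Python sketch comparing them against a Monte Carlo estimate; the sample size `N`, the seed, and the helper name `harmonic` are arbitrary illustrative choices:

```python
import numpy as np

def harmonic(n, m=1):
    """Generalized harmonic number H_n^(m) = sum_{k=1}^n 1/k^m."""
    return sum(1.0 / k**m for k in range(1, n + 1))

n, N = 10, 1_000_000
rng = np.random.default_rng(0)

# N replications of the maximum of n i.i.d. Exp(1) variables
samples = rng.exponential(size=(N, n)).max(axis=1)

print("mean:", samples.mean(), "vs H_n     =", harmonic(n))
print("var: ", samples.var(),  "vs H_n^(2) =", harmonic(n, 2))
```

At this sample size the empirical mean and variance should agree with $H_n$ and $H_n^{(2)}$ to a few decimal places.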

heropup
  • 135,869
  • Do you have any reference or intuition regarding $\int 2x(1-F_X(x))\,dx$? This was precisely the piece I was looking for. – Daniel Ordoñez Apr 05 '19 at 12:49
  • @DanielOrdoñez A proof using integration by parts is possible using a similar line of reasoning as for the formula for expectation as the integral of survival: choose $u = x^2$, $du = 2x\,dx$, $dv = f_X(x)\,dx$, $v = F_X(x)$, applied to $\int_{x=0}^\infty x^2 f_X(x)\,dx$. – heropup Apr 05 '19 at 17:37
  • How is the term $F_X(x)x^2$ evaluated at $\infty$ not a problem? – Daniel Ordoñez Apr 05 '19 at 17:47

Instead of computing a single moment, it will be simpler to compute all moments through the MGF (moment generating function).

Let $X = X_{(n)}$. For any $t \in (-1,0)$, we have

$$\begin{align} \verb/MGF/[X] \stackrel{def}{=} \verb/E/[e^{tX}] &= \int_0^\infty e^{tx} d (1-e^{-x})^n\\ \color{blue}{\text{ int. by parts } \rightarrow} &= -t\int_0^\infty (1-e^{-x})^n e^{tx} dx = t\sum_{k=0}^n (-1)^{k-1}\binom{n}{k}\int_0^\infty e^{-(k-t)x}dx\\ &= \sum_{k=0}^n (-1)^{k-1} \binom{n}{k}\frac{t}{k-t}\\ &= 1 + \sum_{k=1}^n (-1)^{k-1} \binom{n}{k}\sum_{\ell=1}^\infty \frac{t^\ell}{k^\ell}\\ &= 1 + \sum_{\ell=1}^\infty t^\ell \sum_{k=1}^n \frac{(-1)^{k-1}}{k^\ell}\binom{n}{k} \end{align} $$ On the other hand,

$$E[e^{tX}] = E\left[1 + \sum_{\ell=1}^\infty \frac{t^\ell}{\ell!}X^\ell\right] = 1 + \sum_{\ell=1}^\infty \frac{t^\ell}{\ell!} E[X^\ell]$$ By comparing coefficients of $t^\ell$ for $\ell \ge 1$, the $\ell^{\text{th}}$ moment of $X$ follows:

$$E[X^\ell] = \ell! \sum_{k=1}^n \frac{(-1)^{k-1}}{k^\ell}\binom{n}{k}$$
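Since the other answer gives $\operatorname{E}[X_{(n)}] = H_n$ and $\operatorname{E}[X_{(n)}^2] = \sum_{k=1}^n 2H_k/k$, this alternating-sum formula can be checked against those expressions in exact arithmetic. A minimal Python sketch (the helper names `moment` and `H` are just for illustration):

```python
from fractions import Fraction
from math import comb, factorial

def moment(n, l):
    """E[X^l] = l! * sum_{k=1}^n (-1)^(k-1) * C(n,k) / k^l."""
    return factorial(l) * sum(
        Fraction((-1) ** (k - 1) * comb(n, k), k**l) for k in range(1, n + 1)
    )

def H(n, m=1):
    """Generalized harmonic number H_n^(m)."""
    return sum(Fraction(1, k**m) for k in range(1, n + 1))

for n in range(1, 10):
    assert moment(n, 1) == H(n)                 # first moment is H_n
    assert moment(n, 2) - H(n) ** 2 == H(n, 2)  # variance is H_n^(2)
print("moment formula agrees with the harmonic-number expressions")
```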

Update

Amazed by heropup's elegant expression for the variance, I looked at the problem again. It turns out there is another generating function which significantly simplifies the task.

The CGF (cumulant-generating function) is the natural logarithm of the MGF:

$$\verb/CGF/(t) \stackrel{def}{=} \log \verb/MGF/(t) = \log \verb/E/[e^{tX}]$$ In terms of the CGF, the mean and variance are given by:

$$\verb/E/[X] = \verb/CGF/'(0)\quad\text{ and }\quad \verb/Var/[X] = \verb/CGF/''(0)$$ Changing variables from $x$ to $u = e^{-x}$ in the integral for the MGF above, and keeping $t \in (-1,0)$, we have

$$\begin{align} e^{\verb/CGF/(t)} & = -t \int_0^1 (1-u)^n u^{-t-1} du\\ &= -t \frac{\Gamma(n+1)\Gamma(-t)}{\Gamma( n+1-t)} = (-t)\frac{n!}{(-t)(-t+1)\cdots(-t + n)}\\ &= \prod_{k=1}^n\frac{k}{k-t}\end{align}$$ This leads to

$$\begin{align} \verb/CGF/(t) &= \sum_{k=1}^n - \log\left(1 - \frac{t}{k}\right)\\ \implies \verb/CGF/'(t) &= \sum_{k=1}^n \frac{1}{k-t}\\ \implies \verb/CGF/''(t) &= \sum_{k=1}^n \frac{1}{(k-t)^2} \end{align}$$ From this, the mean and variance follow immediately:

$$\begin{align} \verb/E/[X] &= \verb/CGF/'(0) = \sum_{k=1}^n \frac{1}{k}\\ \verb/Var/[X] &= \verb/CGF/''(0) = \sum_{k=1}^n \frac{1}{k^2} \end{align}$$
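If you want to verify these derivative computations symbolically, here is a small Python sketch using sympy (the choice $n = 5$ is arbitrary):

```python
import sympy as sp

t = sp.symbols('t')
n = 5  # any small n works

# CGF(t) = sum_{k=1}^n -log(1 - t/k), read off from the product form of the MGF
cgf = sum(-sp.log(1 - t / k) for k in range(1, n + 1))

mean = sp.diff(cgf, t).subs(t, 0)    # CGF'(0)
var = sp.diff(cgf, t, 2).subs(t, 0)  # CGF''(0)

assert mean == sp.harmonic(n)     # H_n
assert var == sp.harmonic(n, 2)   # H_n^(2)
print(mean, var)                  # 137/60 and 5269/3600 for n = 5
```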

achille hui
  • 122,701

These people reckon the $1 - \text{CDF}$ idea is good: https://www.stat.berkeley.edu/~mlugo/stat134-f11/exponential-maximum.pdf

So that being said:

Set $u = e^{-x}$, so $du/dx = -e^{-x} = -u$ and $dx = -du/u$.

We have $$E[X_{(n)}] = \int_0^\infty \left(1-(1-e^{-x})^n\right) dx = \int_{0}^{1} \frac{1-(1-u)^n}{u}\, du = \lim_{x\to 0^+}\left(\frac{1}{x} - B(x,n+1)\right),$$ where $B$ is the Beta function; the last equality holds because $\int_0^1 u^{x-1}\left(1-(1-u)^n\right) du = \frac{1}{x} - B(x,n+1)$ for $x > 0$.

How can we compute this limit, you may ask, and that's a good question. One way: from $B(x,n+1) = \frac{\Gamma(x)\, n!}{\Gamma(x+n+1)}$ and $\Gamma(x) = \frac{1}{x} - \gamma + O(x)$, one gets $B(x,n+1) = \frac{1}{x} - H_n + O(x)$ as $x \to 0^+$, so the limit is $H_n$, in agreement with the other answers.
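To watch the limit emerge numerically, here is a small Python sketch using mpmath (the precision setting and the test values of $x$ are arbitrary; extra working digits are needed because of the cancellation between $1/x$ and $B(x,n+1)$):

```python
from mpmath import mp, beta, harmonic

mp.dps = 30  # extra precision to survive the cancellation in 1/x - B(x, n+1)
n = 10

# 1/x - B(x, n+1) should approach H_n as x -> 0+
for e in (2, 4, 6):
    x = mp.mpf(10) ** (-e)
    print(f"x = 1e-{e}:", 1 / x - beta(x, n + 1))

print("H_n      =", harmonic(n))
```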

fGDu94
  • 3,916
  • The somewhat lengthy computation of the mean in that note can be avoided by using a substitution $$\int_0^\infty \left(1-(1-e^{-x})^n\right) \mathrm{d}x=\int_0^1 \tfrac{1-u^n}{1-u} \mathrm{d}u=\int_0^1 1+u+\dotsc +u^{n-1}\mathrm{d}u=1+1/2+1/3+\dotsc + 1/n,$$ where we've used the finite geometric summation formula, by the way. I tried to look at the variance too but it seemed unwieldy. – Nap D. Lover Apr 05 '19 at 01:24