
Consider a random variable $Z$ defined as the sum of two independent copies of the uniform random variable on $[0,1]$. Let $Y$ be the minimum of $n$ independent copies of $Z$. What is the expectation and standard deviation of $Y$?

In case an exact answer is hard to come by, I would be really happy with estimates that are correct up to a constant.

robinson
    http://en.wikipedia.org/wiki/Order_statistic#The_order_statistics_of_the_uniform_distribution –  Jun 23 '11 at 00:01
  • @Mike: Point taken, but I feel an answer should contain some explanation and elaboration (e.g. specialization to the 1st order statistic, the expectation and variance of the corresponding beta distribution, and so on) which I haven't the energy for at the moment. If someone else wants to take this, flesh it out, and post it as a good answer, please go ahead. –  Jun 23 '11 at 00:08
  • To clarify: while it's not hard to derive an expression for the distribution of $Y$, I'm having trouble working out the integrals for its mean and variance. – robinson Jun 23 '11 at 00:48

2 Answers


Hint: $P(Y > x) = P(Z_1 > x) \cdots P(Z_n > x) = P(Z > x)^n$.

(If the smallest element is greater than some number, so must every single element be greater than that number. The product follows from the independence assumption.)
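This is easy to sanity-check by simulation. A minimal sketch (the values of $n$, $x$, and the trial count are arbitrary illustrative choices):

```python
import random

# Check the hint P(Y > x) = P(Z > x)^n, where Z = U1 + U2 is the sum of
# two independent Uniform[0,1] variables and Y is the min of n copies of Z.
# (n, x, and the trial count below are arbitrary illustrative choices.)
random.seed(1)
n, x, trials = 5, 0.8, 200_000

def z():
    return random.random() + random.random()

# The minimum exceeds x iff every one of the n copies exceeds x.
hits = sum(all(z() > x for _ in range(n)) for _ in range(trials))
empirical = hits / trials

# Exact P(Z > x) = 1 - x^2/2 for 0 <= x <= 1 (triangular distribution).
exact = (1 - x**2 / 2) ** n
print(empirical, exact)
```

The empirical frequency and the product formula should agree to about two decimal places at this trial count.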

Emre
  • Perhaps I'm doing it wrong, but I got stuck trying to work out the integrals that arise from doing this. Namely, using $E[Y]=\int P(Z>x)^n \, dx$ led me to try to compute $\int_0^1 (1-x^2/2)^n \, dx$, which I'm not sure how to do. – robinson Jun 23 '11 at 00:46
  • Let $t=x^2/2$ and compare with the definition of the beta function. – Emre Jun 23 '11 at 01:09

This problem is more difficult than I had thought at first. :)

I'm assuming you know that the pdf of $Z$ has the triangular shape defined by $f(z) = z$, if $0 \leq z \leq 1$, and $f(z) = 2-z$ if $1 \leq z \leq 2$. Thus $P(Z > z) = 1 - z^2/2$, if $0 \leq z \leq 1$, and $P(Z > z) = 2 - 2z + z^2/2$, if $1 \leq z \leq 2$.

Then, as you note, $E[Y] = \int_0^2 P(Z > z)^n dz = \int_0^1 (1 - z^2/2)^n dz + \int_1^2 (2 - 2z + z^2/2)^n dz$. The second integrand factors into $2^{-n} (2-z)^{2n}$, which means the second integral evaluates to $2^{-n}/(1+2n)$.
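That closed form for the second integral is easy to confirm numerically, e.g. with a midpoint rule (a quick sketch; $n$ and the step count are arbitrary choices):

```python
# Numerical check that the integral from 1 to 2 of (2 - 2z + z^2/2)^n dz
# equals 2^(-n) / (2n + 1), using a midpoint rule.
# n and the step count are arbitrary illustrative choices.
n, steps = 8, 100_000
h = 1 / steps
numeric = sum((2 - 2 * z + z * z / 2) ** n
              for z in (1 + (i + 0.5) * h for i in range(steps))) * h
closed = 2 ** (-n) / (2 * n + 1)
print(numeric, closed)
```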

To evaluate the first integral, it helps to consider $\int_0^{\sqrt{2}} (1 - z^2/2)^n dz$. Since $1-z^2/2$ is decreasing, the error introduced by extending the region of integration like this is smaller than $2^{-n}(\sqrt{2} - 1) \leq 2^{-(n+1)}$. Then, use the transformation $t=z^2/2$ suggested by Emre. This yields $$\int_0^{\sqrt{2}} (1 - z^2/2)^n dz = \frac{1}{\sqrt{2}}\int_0^1 t^{-1/2} (1 - t)^n dt = \frac{1}{\sqrt{2}} B(1/2,n+1) = \frac{1}{\sqrt{2}} \frac{\Gamma(1/2) \Gamma(n+1)}{\Gamma(n+3/2)}$$ $$= \frac{\Gamma(1/2) \Gamma(n+1)}{\sqrt{2}(n+1/2)\Gamma(n+1/2)},$$ where $B(x,y)$ is the beta function, as noted by Emre.

Now, $\Gamma(1/2) = \sqrt{\pi}$, and (see, for example, here) $$\frac{\Gamma(n+1)}{\Gamma(n+1/2)} = \frac{4^n}{\binom{2n}{n}\sqrt{\pi}},$$ which gives $$E[Y] = \frac{4^n}{\sqrt{2}\,(n+1/2)\binom{2n}{n}} + O(2^{-n}).$$
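As a numerical sanity check, this closed form matches direct integration of $\int_0^2 P(Z>z)^n \, dz$ up to the $O(2^{-n})$ term (a sketch; $n$ and the step count are arbitrary choices):

```python
import math

# Compare the closed form 4^n / (sqrt(2) (n + 1/2) C(2n, n)) with direct
# midpoint-rule integration of the integral of P(Z > z)^n over [0, 2];
# they should agree up to the O(2^-n) error term.
# n and the step count are arbitrary illustrative choices.
def surv(z):
    # P(Z > z) for the triangular distribution on [0, 2]
    return 1 - z * z / 2 if z <= 1 else (2 - z) ** 2 / 2

n, steps = 10, 100_000
h = 2 / steps
numeric = sum(surv((i + 0.5) * h) ** n for i in range(steps)) * h
closed = 4 ** n / (math.sqrt(2) * (n + 0.5) * math.comb(2 * n, n))
print(numeric, closed)
```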

You can get a simpler expression while losing some precision by using Gautschi's inequality. This gives us $$\sqrt{n} < \frac{\Gamma(n+1)}{\Gamma(n+1/2)} < \sqrt{n+1} \Rightarrow \frac{\Gamma(n+1)}{\Gamma(n+1/2)} = \sqrt{n} + O\left(\frac{1}{\sqrt{n}}\right).$$ This implies $$E[Y] = \frac{\sqrt{\pi n}}{\sqrt{2} (n+1/2)} + O\left(\frac{1}{n^{3/2}}\right) = \frac{\sqrt{\pi}}{\sqrt{2n}} + O\left(\frac{1}{n^{3/2}}\right).$$
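Numerically, the Gautschi bounds and the resulting approximation check out; here is a sketch that evaluates the exact gamma ratio via `math.lgamma` (the $n$ values are arbitrary, and the $O(2^{-n})$ tail is ignored):

```python
import math

# Check Gautschi's inequality sqrt(n) < Gamma(n+1)/Gamma(n+1/2) < sqrt(n+1)
# and compare the exact leading term of E[Y] (via lgamma, ignoring the
# O(2^-n) tail) with the simple approximation sqrt(pi/(2n)).
def gamma_ratio(n):
    return math.exp(math.lgamma(n + 1) - math.lgamma(n + 0.5))

def e_y_leading(n):
    # sqrt(pi) * Gamma(n+1) / (sqrt(2) (n + 1/2) Gamma(n+1/2)),
    # i.e. the beta-function form derived above
    return math.sqrt(math.pi) * gamma_ratio(n) / (math.sqrt(2) * (n + 0.5))

for n in (10, 100, 1000):
    assert math.sqrt(n) < gamma_ratio(n) < math.sqrt(n + 1)
    print(n, e_y_leading(n), math.sqrt(math.pi / (2 * n)))
```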


For the standard deviation of $Y$, use $SD[Y] = \sqrt{Var[Y]}$, $Var[Y] = E[Y^2] - E[Y]^2$, and $E[Y^2] = 2\int_0^2 z (P(Z > z))^n dz$. (For a proof of the last, see here.) The calculation of the integral for $E[Y^2]$ is much easier than that for $E[Y]$ and comes to $$E[Y^2] = \frac{2 + 2^{-n+1} + 4 n}{1 + 3 n + 2 n^2} = \frac{2}{n+1} + O\left(2^{-n}\right).$$ For example, if we use this with the simpler approximation for $E[Y]$, we obtain $$SD[Y] = \frac{\sqrt{2 - \pi/2}}{\sqrt{n}} + O\left(\frac{1}{n^{3/2}}\right).$$
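Simulation agrees with these approximations; a rough sketch ($n$ and the trial count are arbitrary, and the match is only up to the stated error terms):

```python
import math
import random

# Monte Carlo check of E[Y] ~ sqrt(pi/(2n)) and SD[Y] ~ sqrt(2 - pi/2)/sqrt(n).
# n and the trial count are arbitrary illustrative choices.
random.seed(7)
n, trials = 50, 40_000
ys = [min(random.random() + random.random() for _ in range(n))
      for _ in range(trials)]
mean = sum(ys) / trials
sd = math.sqrt(sum((y - mean) ** 2 for y in ys) / trials)
approx_mean = math.sqrt(math.pi / (2 * n))
approx_sd = math.sqrt(2 - math.pi / 2) / math.sqrt(n)
print(mean, approx_mean)
print(sd, approx_sd)
```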
Mike Spivey
  • OMG, this is awesome. I was trying to do these calculations but I got stuck. Your answer looks like it took a lot of work - thanks so much!!!! – robinson Jun 23 '11 at 22:23
  • @robinson: You're quite welcome. Trying to figure out problems like this is fun for me. :) And it wasn't as much work as it might look like; I've worked with a lot of this stuff before. The key insight (thanks to Emre) was to transform the difficult integral into a beta function. – Mike Spivey Jun 24 '11 at 01:48