
Let $\mu$ be a non-atomic probability measure on $[0,\infty)$ and sample $X_1,X_2$ from $\mu$ independently. Does $\min(X_1,X_2)$ have twice as many moments as $X_1$? Is the quantity $$ \frac{\mathbb E\min(X_1,X_2)}{\left(\mathbb E \sqrt{X_1}\right)^2} $$ bounded away from $0$ and $\infty$?

More generally, does $\min(X_1,\ldots,X_n)$ have $n$ times as many moments as $X_1$? Moreover is $$ \frac{\mathbb E\min(X_1,\ldots,X_n)}{\left(\mathbb E \sqrt[n]{X_1}\right)^n} $$ bounded away from $0$ and $\infty$?

For nice distributions, the identity $\mathbb E X=\int_0^\infty \mathbb P(X>x)\; dx$ allows us to reformulate the general versions as follows: $$ \|\mathbb P(X_1>t)\|_n\approx\|\mathbb P(X_1>t^n)\|_1, $$ where the norms are taken in $t$ over $[0,\infty)$ and $\approx$ means each side is bounded by a constant multiple of the other.
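As a sanity check, here is a short numerical comparison of the two sides in Python; the choice $X_1\sim\mathrm{Exp}(1)$ is just an arbitrary test case, for which the left side is $(1/n)^{1/n}$ and the right side is $\Gamma(1+1/n)$:

```python
import numpy as np
from scipy.integrate import quad

def lhs(n):
    # ||P(X_1 > t)||_n for X_1 ~ Exp(1): ( int_0^inf e^{-n t} dt )^(1/n)
    val, _ = quad(lambda t: np.exp(-n * t), 0, np.inf)
    return val ** (1.0 / n)

def rhs(n):
    # ||P(X_1 > t^n)||_1 for X_1 ~ Exp(1): int_0^inf e^{-t^n} dt = Gamma(1 + 1/n)
    val, _ = quad(lambda t: np.exp(-t ** n), 0, np.inf)
    return val

for n in (2, 3, 5, 10):
    print(n, lhs(n), rhs(n), lhs(n) / rhs(n))  # ratio stays bounded
```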

pre-kidney
  • I think the word "atomless" is better than "non-atomic" because the latter can get mistaken for "not atomic". Merely not being atomic is a far weaker condition than atomlessness. ${}\qquad{}$ – Michael Hardy Jan 15 '16 at 17:22
  • That is interesting. Empirically, if $X_i$ takes the absolute value of a standard Cauchy distribution, the quotient for $n=2$ appears to be somewhere between $0.62$ and $0.63$, for $n=3$ about $0.44$, for $n=4$ about $0.34$ – Henry Jan 15 '16 at 18:20
  • @Henry's comment applies to an older version of the question before I fixed the scaling in the denominator. In the new version of the question, the ratio for $n=2$ is precisely $2/\pi$ for all exponential distributions. For uniform $[a,b]$ distributions it is a more complicated expression that has a maximum value of $3/2$ and a minimum value of $3/4$. – pre-kidney Jan 15 '16 at 23:51
  • In fact I had a mistake in my calculations, $3/2$ is not attained. See below for the nicer upper bound... – pre-kidney Jan 16 '16 at 02:36
  • Moreover, the atomless/non-atomic condition can be dropped if we simply assume that $X_1>0$ almost surely. – pre-kidney Jan 16 '16 at 02:42

3 Answers


If the $X_k$ are exponential with rate $\lambda>0$, then $\Bbb E[\min(X_1,X_2)]=\frac{1}{2\lambda}$ and $\Bbb E\sqrt{X_1}=\frac{\sqrt{\pi}}{2\sqrt{\lambda}}$, so $$ {\Bbb E[\min(X_1,X_2)]\over\Bbb E\sqrt{X_1}}={1\over\sqrt{\pi\lambda}}, $$ which depends on $\lambda$. With thoughts of scaling in my head, let me suggest that perhaps you want to consider $$ {\Bbb E[\min(X_1,X_2)]\over\left[\Bbb E\sqrt{X_1}\right]^2}, $$ which here takes the scale-free value $2/\pi$.
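For what it's worth, a quick Monte Carlo check of both ratios (the rates and sample size below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10**6
for lam in (0.5, 1.0, 4.0):
    x1 = rng.exponential(scale=1 / lam, size=N)
    x2 = rng.exponential(scale=1 / lam, size=N)
    e_min = np.minimum(x1, x2).mean()
    e_sqrt = np.sqrt(x1).mean()
    # unsquared ratio depends on lam (~ 1/sqrt(pi*lam));
    # squared ratio is scale-free (~ 2/pi = 0.6366...)
    print(lam, e_min / e_sqrt, e_min / e_sqrt**2)
```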

John Dawkins

Yes: $m_n:=\min(X_1,\ldots,X_n)$ has at least $n$ times as many moments as $X$. For $X\geq 0$ and smooth increasing $g$ with $g(0)=0$ we have $E(g(X))=\int_0^\infty g'(t)P(X>t)\,dt$, hence $$E(X^{k})=\int_0^\infty k t^{k-1} P(X>t)\,dt\tag 1$$ while $$E(m_n^{nk})=\int_0^\infty nk t^{nk-1}P(X>t)^n\,dt.\tag 2$$

If $(1)$ converges then $h(t)=t^kP(X>t)=o(1)$. Since the integrand of $(2)$ equals $n$ times the integrand of $(1)$ multiplied by $h(t)^{n-1}$, which is bounded, $(2)$ converges as well.
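To see this on a concrete heavy tail (my choice, not part of the argument): take $P(X>t)=t^{-\alpha}$ for $t\geq 1$, so $(1)$ converges iff $k<\alpha$, while $(2)$ converges iff $nk<n\alpha$, i.e. $m_n$ has exactly $n$ times as many moments as $X$. Both integrals are computable numerically:

```python
import numpy as np
from scipy.integrate import quad

alpha, n, k = 1.5, 3, 1.0   # k < alpha, so (1) converges

def surv(t):
    # survival function: P(X > t) = t^(-alpha) for t >= 1, else 1
    return 1.0 if t <= 1.0 else t ** -alpha

def integral(p, m):
    # int_0^inf p t^(p-1) P(X > t)^m dt, split at the kink t = 1
    f = lambda t: p * t ** (p - 1) * surv(t) ** m
    head, _ = quad(f, 0, 1)
    tail, _ = quad(f, 1, np.inf)
    return head + tail

print(integral(k, 1))      # (1): E[X^k] = 3, finite since k < alpha
print(integral(n * k, n))  # (2): E[m_n^(nk)] = 3, finite since nk < n*alpha
```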

Your "i.e." doesn't apply though, since $E(m_n)\to \min \operatorname{supp}(\mu)$ while $(EX^{1/n})^n\to\exp(E\log X)$. Their ratio is bounded away from $\infty$ but not $0$. The most interesting case is $E\log X=-\infty$ (which implies $0\in \operatorname{supp}(\mu)$)

A.S.

Let $X>0$ be a random variable and let $X_1,\ldots,X_n$ be i.i.d. with the same distribution as $X$.

The following inequality holds and the constants are optimal as $n\to\infty$: $$ 0\leq \frac{\mathbb E\min(X_1,\ldots,X_n)}{\left(\mathbb E \sqrt[n]{X}\right)^n}\leq 1 $$

Proof: Note that $\min(X_1,\ldots,X_n)\leq \sqrt[n]{X_1\cdots X_n}$. Taking expectations and applying independence yields the upper bound. The upper bound cannot be improved: letting $X$ tend to a deterministic constant $c>0$ sends the ratio to $1$, since both numerator and denominator tend to $c$.

The lower bound cannot be improved: consider the case $X\sim U[0,1/m]$. Note that $$ \mathbb E\sqrt[n]{X}=m\int_0^{1/m}x^{1/n}\; dx=\frac{m^{-1/n}}{1+\frac{1}{n}} \geq (me)^{-1/n}, $$ where we have used the Taylor inequality $e^{1/n}\geq 1+1/n$.

Moreover, $\mathbb E\min(X_1,\ldots,X_n)=\frac{1}{m(n+1)}$, since the minimum of $n$ i.i.d. $U[0,1/m]$ variables has mean $\frac{1}{m}\cdot\frac{1}{n+1}$. Therefore $$ \frac{\mathbb E\min(X_1,\ldots,X_n)}{\left(\mathbb E \sqrt[n]{X}\right)^n}\leq \frac{\frac{1}{m(n+1)}}{(me)^{-1}}=\frac{e}{n+1}\to 0. $$ Thus the uniform distributions drive the ratio to $0$ as $n\to\infty$, so no positive constant is a lower bound uniformly in $n$.
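A short simulation confirming this (taking $m=1$, which is harmless since $m$ cancels from the ratio; the sample size is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(1)
for n in (2, 5, 20):
    x = rng.uniform(0.0, 1.0, size=(200_000, n))
    num = x.min(axis=1).mean()                 # E min(X_1,...,X_n) ~ 1/(n+1)
    den = (x[:, 0] ** (1.0 / n)).mean() ** n   # (E X^(1/n))^n ~ 1/(1+1/n)^n
    print(n, num / den, np.e / (n + 1))        # ratio <= e/(n+1) -> 0
```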

Note: For fixed $n$, the best lower bound is unknown. For $n=2$ I can get $1/2$ using a density with a pole at $0$ blowing up like $x^{\epsilon-1}$, as sketched below. Can anyone do better?
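Here is a closed-form check of that $n=2$ example, reading the pole as the density $\epsilon x^{\epsilon-1}$ on $[0,1]$ (a $\mathrm{Beta}(\epsilon,1)$ law; this reading is my own):

```python
# P(X > t) = 1 - t^eps on [0, 1], so in closed form:
#   E min(X_1, X_2) = int_0^1 (1 - t^eps)^2 dt = 1 - 2/(1+eps) + 1/(1+2*eps)
#   E sqrt(X)       = eps / (eps + 1/2)
for eps in (0.5, 0.1, 0.01, 0.001):
    e_min = 1 - 2 / (1 + eps) + 1 / (1 + 2 * eps)
    e_sqrt = eps / (eps + 0.5)
    print(eps, e_min / e_sqrt ** 2)   # tends to 1/2 as eps -> 0
```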

pre-kidney
  • Just blow it up more: $F(x)=\exp(o(\log x))$ around $0$. – A.S. Jan 16 '16 at 08:01
  • For example $F(x)=(-\log x)^{-1}$ – A.S. Jan 16 '16 at 08:37
  • Maybe I made a mistake in my calculations, but for the random variable with $P(X\leq t)=-1/\log t$ for $t\leq 1/e$ I actually get a slightly larger value of the ratio - approximately $0.73$ in the case $n=2$. Is that what you're getting as well? – pre-kidney Jan 17 '16 at 02:29
  • I should point out that the random variable $P(X\leq t)=-1/\epsilon\log t$ on $[0,e^{-1/\epsilon}]$ approaches $1/2$ as well when $\epsilon\to 0$ and $n=2$. So it looks to me like $1/2$ is the true lower bound when $n=2$. – pre-kidney Jan 17 '16 at 03:53