
I'm struggling to find a neat solution to the following problem.

Let $X_1,\dots,X_N \sim N(0,1)$ and let $X = \max_i X_i$. Show that

$\mathbb{E}[X] \geq c \sqrt{\ln N}$

for some absolute constant $c$.

Combining the lower tail bound for standard Gaussians

$\mathbb{P} (X_i > a) \geq \frac{1}{\sqrt{2 \pi} a + 2} e^{-a^2/2}$

with the law of total expectation $\mathbb{E}[X] = \mathbb{E}[X \mid X \geq a]\,\mathbb{P}(X \geq a) + \mathbb{E}[X \mid X < a]\,\mathbb{P}(X < a)$, I managed to prove the inequality, but it required a lot of tedious calculations.
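As a quick numerical sanity check (not part of the proof), the tail bound quoted above can be compared against the exact Gaussian tail, which is expressible via the complementary error function:

```python
import math

def gauss_tail(a):
    """Exact P(Z > a) for a standard normal Z, via the complementary error function."""
    return 0.5 * math.erfc(a / math.sqrt(2))

def claimed_lower_bound(a):
    """The lower bound e^{-a^2/2} / (sqrt(2*pi)*a + 2) from the question."""
    return math.exp(-a * a / 2) / (math.sqrt(2 * math.pi) * a + 2)

# The claimed bound should sit below the true tail for every a >= 0.
for a in [0.5, 1.0, 2.0, 3.0, 5.0]:
    print(f"a={a}: P(Z>a)={gauss_tail(a):.6f}  bound={claimed_lower_bound(a):.6f}")
    assert gauss_tail(a) >= claimed_lower_bound(a)
```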

Is there a simple and neat way to prove that lower bound?

I have seen a related post, Expectation of the maximum of Gaussian random variables, where the distribution of $X$ is derived; however, I was not able to bound the expectation integral from below.

Many thanks in advance.

Andrea

1 Answer


The way you said you proved the inequality is probably the only way to do it. It does not seem to require a lot of tedious calculations.

If $g$ is a standard Gaussian, then for $t>1$
$$\mathbb{P}\{g>t\}\geq \left(\frac{1}{t}-\frac{1}{t^3}\right)\frac{e^{-t^2/2}}{\sqrt{2\pi}}$$
(see Feller, Vol. $1$, $3$rd edition, p. $175$). Hence there exists a constant $\beta$ with $0<\beta<1$ such that
$$\mathbb{P}\{g>\beta(\log n)^{1/2}\}\geq\frac{1}{n}.$$
Now if $g_1,\dots,g_n$ are independent standard Gaussians, then by independence and the previous inequality,
$$\mathbb{P}\{\max g_i \leq\beta(\log n)^{1/2}\}\leq\left(1-\frac{1}{n}\right)^n<\frac{1}{e}.$$
Hence
$$\mathbb{E}(\max g_i)\geq\beta(\log n)^{1/2}\,\mathbb{P}\{\max g_i>\beta(\log n)^{1/2}\}>\left(1-\frac{1}{e}\right)\beta(\log n)^{1/2}.$$
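A hedged Monte Carlo illustration of the resulting bound, using the concrete choice $\beta = 1/2$ proposed in the comments (the exact sample size and seed here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 0.5  # a valid choice of beta < 1 suggested in the comments

# Compare the empirical E[max g_i] against (1 - 1/e) * beta * sqrt(log n).
for n in [10, 100, 1000, 10000]:
    maxima = rng.standard_normal((20000, n)).max(axis=1)  # 20000 trials
    lower = (1 - 1 / np.e) * beta * np.sqrt(np.log(n))
    print(f"n={n:6d}  E[max] ~ {maxima.mean():.3f}  bound = {lower:.3f}")
```

The empirical mean should comfortably exceed the bound for every $n$; the bound is of course far from tight, since $\mathbb{E}[\max g_i] \sim \sqrt{2\log n}$.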

  • Many thanks, but what I don't like is that the constant $\beta$ (which I'm able to find) depends on $N$, while I would like to come up with a lower bound where $c$ is an absolute constant. By setting $\beta = \sqrt{2- \frac{2 \log \log N}{\log N}}$ I managed to prove the bound, but as I said there are quite a few calculations involved. Maybe there is a simpler choice for $\beta$? – Andrea May 17 '21 at 09:39
  • Yes, take $\beta = 1/2$. This works for sufficiently large $n$ - how large can be calculated - and then take care of the remaining finitely many $n$'s by choosing a sufficiently large $c$. By the way, the $\beta$ you pointed out does not work: it has to be smaller than $1$. – uniquesolution May 17 '21 at 09:53
  • Thanks, you're right about the choice of $\beta$. Indeed, with the one I initially proposed I get a lower bound of the form $c_0 \sqrt{\log n} - c_0$. – Andrea May 24 '21 at 07:00
  • This might be obvious, but why is the first inequality in the last line, where you wrote $\mathbb{E}(\max g_i) > \beta(\log n)^{1/2}\,\mathbb{P}\{\max g_i > \beta(\log n)^{1/2}\}$, true? Gaussian variables are not positive and we are not taking the absolute value here. – Partial T Feb 05 '24 at 19:54
  • @Partial T - You are correct to point out that there are some latent details beneath the surface of my superficial presentation. However, note that the probability that the maximum of $n$ independent Gaussians is negative is $2^{-n}$, so at the price of slightly decreasing the constant $\beta$, you may as well assume that $\max g_i$ is essentially a positive random variable. You also need to know that $\mathbb{E}(\max |g_i|)\leq O(\sqrt{\log n})$, which is usually proved first, before the lower bound. – uniquesolution Feb 06 '24 at 12:47
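For completeness, the $2^{-n}$ figure mentioned in the last comment follows in one line from independence and the symmetry of each Gaussian about $0$:

```latex
\mathbb{P}\Bigl\{\max_{1\le i\le n} g_i < 0\Bigr\}
  = \mathbb{P}\{g_1 < 0,\,\dots,\,g_n < 0\}
  = \prod_{i=1}^{n}\mathbb{P}\{g_i < 0\}
  = \Bigl(\tfrac{1}{2}\Bigr)^{n} = 2^{-n}.
```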