I am looking for a reference to cite for the following "folklore" asymptotic behaviour of the maximum of $n$ independent Gaussian real-valued random variables $X_1,\dots, X_n\sim \mathcal{N}(0,\sigma^2)$ (mean $0$ and variance $\sigma^2$): $$ \mathbb{E} \max_i X_i = \sigma\left(\tau\sqrt{\log n}+\Theta(1)\right) $$ (where, if I'm not mistaken, $\tau=\sqrt{2}$). I've been pointed to the reference book of Ledoux and Talagrand, but I can't see the statement "out-of-the-box" there -- only results that help to derive it.
- Actually $E(M_n)\leqslant\sigma\sqrt{2\log n}$ is a one-line computation (a sketch of this bound is spelled out right after these comments). – Did Oct 25 '14 at 15:31
- Mmh, I just realized my question was slightly wrong, as phrased. I should have written $\Theta(1)$ instead of $O(1)$ (for both upper and lower bound). – Clement C. Oct 25 '14 at 15:36
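To keep the question self-contained, here is a minimal sketch of the one-line computation mentioned in the comment above (the standard Jensen / moment-generating-function argument; it is a well-known argument, not attributed to any specific reference). For any $t>0$,
$$ \exp\left(t\,\mathbb{E}\max_i X_i\right)\;\le\;\mathbb{E}\exp\left(t\max_i X_i\right)\;\le\;\sum_{i=1}^n \mathbb{E}\, e^{tX_i}\;=\;n\,e^{\sigma^2 t^2/2}, $$
hence $\mathbb{E}\max_i X_i \le \frac{\log n}{t}+\frac{\sigma^2 t}{2}$, and choosing $t=\frac{\sqrt{2\log n}}{\sigma}$ yields $\mathbb{E}\max_i X_i \le \sigma\sqrt{2\log n}$.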
2 Answers
I eventually found these two references:
- From [1]: Theorem 2.5 and Exercise 2.17 (p. 49) for the expected value of the maximum of $N$ independent standard Gaussians; for a concentration result, combine this with the variance (which is $O(1)$) from Exercise 3.24, or see Theorem 5.8 directly for a concentration inequality. (A quick numerical sanity check is sketched after the references below.)
- From [2]: Theorem 3.12.
[1] Stéphane Boucheron, Gábor Lugosi, and Pascal Massart, Concentration Inequalities: A Nonasymptotic Theory of Independence (2013).
[2] Pascal Massart, Concentration Inequalities and Model Selection (2003).
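As a quick numerical sanity check of the $\sqrt{2\log n}$ growth for standard Gaussians (illustrative only, not taken from [1] or [2]; the trial count and the values of $n$ below are arbitrary choices):

```python
import numpy as np

# Monte Carlo estimate of E[max of n standard Gaussians], compared to sqrt(2 log n).
# Illustrative only: the number of trials and the values of n are arbitrary choices.
rng = np.random.default_rng(0)
trials = 2000

for n in (10, 100, 1000, 10000):
    samples = rng.standard_normal((trials, n))   # trials x n independent N(0,1) draws
    emax_hat = samples.max(axis=1).mean()        # empirical mean of the row-wise maxima
    print(f"n={n:6d}  E[max]~{emax_hat:.3f}  sqrt(2 log n)={np.sqrt(2 * np.log(n)):.3f}")
```

The comparison only illustrates the leading-order growth; it says nothing precise about the lower-order term.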

Clement C.
- @Chill2Macht How so? Look e.g. at the discussion after the proof (specifically equation (3.26)) in [2]; or, as mentioned above, Exercise 2.17 in [1]. – Clement C. Feb 18 '18 at 02:10
- The discussion near (3.26) in [2] is helpful, but one first has to prove that the max is sub-Gaussian before being able to use it. Thankfully, Theorem 5.8 from [1] gives a proof of that, which does help. I have to admit that when I complained I didn't actually bother looking at Theorem 5.8, because I didn't see how it could help with this problem, since the earlier parts of [1] mentioned did not seem very helpful. – Chill2Macht Feb 18 '18 at 22:35
- Specifically, in [1], the exercises mentioned just state the results that need to be proved, rather than giving any indication as to how they might be proved. (Actually, I think Exercise 2.17 gives as a hint a reference to a book which is now out of print, which isn't very helpful. That is of course more a complaint to the textbook authors about their book than about this answer, which I still upvoted, btw.) – Chill2Macht Feb 18 '18 at 22:36
- Actually, even Theorem 5.8 in [1] isn't helpful, because it relies on Theorem 5.4, which relies on Theorem 5.1, which makes an obscure remark that "we can assume the Lipschitz function is differentiable because of a standard limiting argument" without giving the standard limiting argument. Same thing here https://terrytao.wordpress.com/2010/01/03/254a-notes-1-concentration-of-measure/ and here https://galton.uchicago.edu/~lalley/Courses/386/Concentration.pdf. Frankly, I'm starting to doubt the result is even true, since no one's actually ever bothered to prove it. – Chill2Macht Feb 18 '18 at 23:38
- @Chill2Macht I don't understand what you mean. If your problem is the lower bound, this is exactly what the discussion near (3.26) in [2] (and more specifically (3.25), along with the asymptotics of $\bar{\Phi}^{-1}$, recalled after these comments) provides. What is the point that remains problematic? (As in, the whole argument is made between the end of p. 65 and p. 66, and it specifically deals with Gaussian variables -- not sub-Gaussian or anything else.) – Clement C. Feb 19 '18 at 00:06
- One issue is that no proof or source for the asymptotics of $\bar{\Phi}^{-1}$ is given. Ancillary to that, the discussion near (3.26) and (3.25) ($|\mathrm{Med}[Z] - \mathbb{E}[Z]|$) relies on the truth of Theorem 3.12, which is based on equation (3.4) in an essential way, which is in turn proved not from first principles but using the Gaussian logarithmic Sobolev inequality and/or the Gaussian isoperimetric theorem, the first of which has a long proof in the book and the second of which isn't proved at all. But Exercise 2.17 in [1] wants us to believe that it can be proved without logarithmic Sobolev inequalities. – Chill2Macht Feb 19 '18 at 00:38
- Obviously the biggest issue is that I am not smart enough to understand this. It doesn't help that the only concentration inequalities I have learned are Hoeffding, Bernstein, Bennett, bounded differences, and Efron-Stein. This page https://galton.uchicago.edu/~lalley/Courses/386/Concentration.pdf again (see Corollary 2.1) would have one believe that the limit in probability of the bounded differences inequality, as applied to Rademachers, somehow implies that the maximum is subgaussian(1), thus giving the tail bounds used for the result in [2]. – Chill2Macht Feb 19 '18 at 00:42
- I don't think it has anything to do with being "smart enough." The proofs given rely on previous elements either established earlier in the book or assumed to be known (like the asymptotics). (Though that of [2] is a bit more self-contained, I'd say.) There may very well be a self-contained proof out there, though. Maybe. – Clement C. Feb 19 '18 at 00:45
- @Chill2Macht Starting with 1.: No. Why would you have a $\sqrt{n}$ in the denominator, while the first expression has a $\sqrt{2\log n}$? – Clement C. Sep 21 '18 at 01:44
- @Chill2Macht I think the best way to proceed is by asking a new question, with a link to this one possibly. – Clement C. Sep 21 '18 at 02:00
- https://stats.stackexchange.com/questions/367942/are-these-consequences-of-almost-sure-convergence/368248#368248 It seems I was wrong on both counts. – Chill2Macht Sep 22 '18 at 23:13
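For completeness, the asymptotics of $\bar{\Phi}^{-1}$ mentioned in the comments above follow from the standard Gaussian tail (Mills ratio) bounds -- a textbook fact, recalled here for convenience rather than taken from [1] or [2]: for $x>0$,
$$ \frac{x}{1+x^2}\,\phi(x)\;\le\;\bar{\Phi}(x)\;\le\;\frac{\phi(x)}{x},\qquad \phi(x)=\frac{e^{-x^2/2}}{\sqrt{2\pi}}, $$
which implies $\bar{\Phi}^{-1}(u)=\sqrt{2\log(1/u)}\,(1+o(1))$ as $u\to 0^+$; in particular, $\bar{\Phi}^{-1}(1/n)\sim\sqrt{2\log n}$.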