
I am wondering if someone could help me shed light on why the following bound doesn't work.

Suppose that $X_1, \ldots, X_n \sim N(0,1)$ are independent random variables. I am interested in finding a constant $C$ that satisfies:

$$ E\left[\max_{1\leq i\leq n}|X_i|\right] \leq C \sqrt{\log n} $$

My Method:

Let $Y = \max_{1\leq i\leq n}|X_i|$

$$ \begin{align} E\left[Y\right] & = \int_{0}^{\infty}P(Y >y)\,dy \\ &\leq\int_{0}^{\infty}\sum_{i=1}^{n}P(|X_i|>y)\,dy \\ &\leq\int_{0}^{\infty}\frac{1}{y^2}\sum_{i=1}^{n}\operatorname{Var}(X_i)\, dy \\ & = \int_{0}^{\infty}\frac{n}{y^2}\,dy \end{align} $$

I used a union bound followed by a Kolmogorov/Chebyshev-type inequality in the steps above, but I don't know exactly why this fails, and I am unsure why the final integral doesn't converge. Is there a specific reason this bounding doesn't work? Am I failing to take the tail behavior into account? Thanks.
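As a quick sanity check (my own addition, not part of the derivation above), a Monte Carlo estimate suggests the claimed bound is plausible: the empirical mean of $\max_i |X_i|$ grows roughly like $\sqrt{2\log n}$, so the ratio to $\sqrt{\log n}$ stays near a small constant.

```python
# Monte Carlo sketch: estimate E[max_i |X_i|] for i.i.d. N(0,1) samples
# and compare with sqrt(log n). (Illustrative only; `mean_max_abs` is a
# hypothetical helper name, not from the original post.)
import math
import random

def mean_max_abs(n, trials=2000, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += max(abs(rng.gauss(0.0, 1.0)) for _ in range(n))
    return total / trials

for n in (10, 100, 1000):
    est = mean_max_abs(n)
    # The ratio est / sqrt(log n) should stay bounded (near sqrt(2))
    print(n, round(est, 3), round(est / math.sqrt(math.log(n)), 3))
```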

user321627
  • It's even worse, because you left out a factor $n$ from the last line. If you take into account the fact that $P(Y>y) \le 1$, you can get a bound of $2 \sqrt{n}$ from this inequality. It seems you need something stronger than Kolmogorov to get $C \sqrt{\log n}$. – Robert Israel Sep 19 '16 at 22:29
  • Can you elaborate how you got the bound of $2 \sqrt{n}$? thanks! – user321627 Sep 19 '16 at 22:37
  • I don't see a problem with your inequalities, though they are not as tight as desired. Filling in the gaps it looks like the reasoning is (for $y>0$): $P[\max_i |X_i|>y] = P[\max_i X_i^2 > y^2] \leq P[\sum_i X_i^2 > y^2] \leq E[\sum_i X_i^2]/y^2 = n/y^2$. You can make your integral converge (and hence be $O(n)$) by writing $E[Y]=\underbrace{\int_0^1 P[Y>y]dy}_{\leq 1}+ \int_1^{\infty} \underbrace{P[Y>y]}_{\leq n/y^2} dy$. Notice that this argument does not use independence anywhere. – Michael Sep 19 '16 at 23:06
  • With independence, note that $P[Y>y] = 1-F_X(y)^n$ where $F_X(y) = P[X\leq y]$. – Michael Sep 19 '16 at 23:09
  • @Michael Thank you! Is there an explanation as to why my inequalities are not as tight? I am trying to see why sometimes an inequality works like a charm but other times it too wide. Thanks! – user321627 Sep 19 '16 at 23:35
  • For $y < \sqrt{n}$, use $P(Y>y) \le 1$. For $y \ge \sqrt{n}$, use Kolmogorov. Resulting bound $\sqrt{n} + \int_{\sqrt{n}}^\infty \frac{n}{y^2}\,dy = 2 \sqrt{n}$. – Robert Israel Sep 20 '16 at 00:50
  • The $O(\sqrt{n})$ bound is the best possible over all random variables (possibly dependent) with second moment 1, since you can consider ${X_1, ..., X_n}$ where one is randomly selected to be $\sqrt{n}$ (uniformly over all indices ${1, ..., n}$) and the rest are 0. – Michael Sep 20 '16 at 00:56
  • @Michael Thanks, is there a difference between what I have above (with the absolute value signs around each $X_i$ vs. if I didn't? Thanks! – user321627 Sep 20 '16 at 00:57
  • The bound is the same: for $y>0$ we get $P[\max_i X_i >y] \leq P[\max_i |X_i|>y] \leq n/y^2$. – Michael Sep 20 '16 at 01:01
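Putting the comments together numerically (a sketch of my own, not from the thread): with independence, $P[Y>y] = 1 - \operatorname{erf}(y/\sqrt{2})^n$, since $P(|X|\le y) = \operatorname{erf}(y/\sqrt{2})$ for $X \sim N(0,1)$. Integrating this tail shows $E[Y]$ sits well below both $2\sqrt{\log n}$ and the $2\sqrt{n}$ bound from the Kolmogorov-style argument.

```python
# Compute E[Y] = int_0^inf P[Y > y] dy exactly (up to quadrature error),
# using P[Y > y] = 1 - erf(y/sqrt(2))^n for the max of n i.i.d. |N(0,1)|.
# (`expected_max_abs` is a hypothetical helper name for illustration.)
import math

def expected_max_abs(n, upper=20.0, steps=20000):
    # Midpoint rule on [0, upper]; the tail beyond y = 20 is negligible.
    h = upper / steps
    return sum((1.0 - math.erf((k + 0.5) * h / math.sqrt(2.0)) ** n) * h
               for k in range(steps))

for n in (10, 100, 1000):
    ey = expected_max_abs(n)
    print(n, round(ey, 3),
          "<=", round(2 * math.sqrt(math.log(n)), 3),  # C*sqrt(log n), C = 2
          "<=", round(2 * math.sqrt(n), 3))            # the 2*sqrt(n) bound
```

The gap between the $\sqrt{\log n}$ growth of the exact value and the $\sqrt{n}$ bound illustrates why a second-moment (Chebyshev/Kolmogorov) tail estimate is too weak here: Gaussian tails decay like $e^{-y^2/2}$, much faster than the $1/y^2$ decay the variance bound captures.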

0 Answers