
I'm trying to answer the following question from the book high-dimensional probability:

Let $X_1,X_2,\dots$ be a sequence of sub-gaussian random variables, which are not necessarily independent. Show that

$E\bigg[ \max_i \frac{|X_i|}{\sqrt{1 + \log i}} \bigg] \le CK$,

where $K = \max_i \|X_i\|_{\psi_2}$. Deduce that for every $N \ge 2$ we have

$E\bigg[ \max_{i \le N} |X_i| \bigg] \le CK \sqrt{\log N}$.

I've tried to work out the distribution of the maximum of Gaussians, but I'm only reaching inequalities that don't help me answer the question.

I've also seen a similar question here.

Does anyone have a clue or something to start with in order to answer this question?

Thanks!

imu96

3 Answers


You can use this idea as a start (it is actually more than a start!). Without loss of generality, assume that $K^2 = c$, where $c$ is the absolute constant in the exponent of the sub-gaussian tail bound, so that $c/K^2 = 1$.

\begin{eqnarray} \mathbb{E}\max_i \frac{|X_i|}{\sqrt{1+\log i}} &=& \int_0^\infty \mathbb{P}\left(\max_i \frac{|X_i|}{\sqrt{1+\log i}} > t \right) dt\\ &\leq& \int_0^2 \mathbb{P}\left(\max_i \frac{|X_i|}{\sqrt{1+\log i}} > t \right) dt + \int_2^\infty \mathbb{P}\left(\max_i \frac{|X_i|}{\sqrt{1+\log i}} > t \right) dt \\&\leq& 2 + \int_2^\infty \sum_{i\ge 1}\mathbb{P}\left( \frac{|X_i|}{\sqrt{1+\log i}} > t \right) dt \\ &\leq& 2 + \int_2^\infty \sum_{i\ge 1} 2 \exp\left(-\frac{ct^2(1+\log i)}{K^2} \right) dt\\ &=& 2 + 2\sum_{i\ge 1} \int_2^\infty e^{-t^2}\; i^{-t^2}\, dt \\ &\leq& 2 + 2\left(\sum_{i\ge 1} i^{-4}\right) \int_2^\infty e^{-t^2}\, dt < \infty, \end{eqnarray} where the second-to-last step uses the normalization $c/K^2 = 1$ and the last step uses $i^{-t^2} \le i^{-4}$ for $t \ge 2$. The sum $\sum_i \frac{1}{i^4}$ is convergent, so the whole bound is a finite absolute constant.

I chose $2$ as the point at which to split the integral so as to make the sum convergent (other split points would also work).
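For completeness, here is one way the second inequality in the question follows from the first (a sketch; $C, C'$ denote absolute constants). For $i \le N$ we have $\sqrt{1+\log i} \le \sqrt{1+\log N}$, hence

$$\max_{i\le N}|X_i| \le \sqrt{1+\log N}\,\max_{i\le N}\frac{|X_i|}{\sqrt{1+\log i}} \le \sqrt{1+\log N}\,\max_{i}\frac{|X_i|}{\sqrt{1+\log i}}.$$

Taking expectations and using the first bound,

$$\mathbb{E}\max_{i\le N}|X_i| \le CK\sqrt{1+\log N} \le C'K\sqrt{\log N},$$

since $1+\log N \le \left(1+\frac{1}{\log 2}\right)\log N$ for every $N \ge 2$.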

Daniel Li

The answer above does not seem correct to me (although there are some interesting ideas in it). Here is how I solved it.

Consider $Z_i=\frac{|X_i|}{K\sqrt{1+\log{i}}}, i=1,2,...$. We want to show $\mathbb{E}[\max_i Z_i]<C$ for some $C$.

Then we look at the event $\Omega_i:=\{Z_i\ge a\}$. Show that $\mathbb{P}(\Omega_i)\le 2\,i^{-a^2}$, using that $|X_i|$ is sub-gaussian and that $K$ is the largest sub-gaussian norm.

Then, choosing $a$ large enough that $\sum_i 2\,i^{-a^2}$ is summable, we see from Borel–Cantelli (which does NOT require the events $\{\Omega_i\}$ to be independent) that $\mathbb{P}(\limsup \Omega_i)=0$. This means that, with probability 1, there exists $N$ such that $Z_i<a$ for all $i>N$. Then $\mathbb{E}[\max_i Z_i]\le \mathbb{E}[\max_{i\le N} Z_i]+\mathbb{E}[\max_{i> N} Z_i]\le \mathbb{E}[\sum_{i=1}^N Z_i]+a= \sum_{i=1}^N\mathbb{E}[Z_i]+a\le N\cdot \max_{i\le N}\mathbb{E}[Z_i]+a<\infty.$
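As a quick numerical illustration (not a proof), here is a sanity check of the quantity being bounded, for the special case of i.i.d. standard Gaussians, whose $\psi_2$-norm is an absolute constant: the empirical value of $\max_{i \le n} |X_i|/\sqrt{1+\log i}$ stays bounded as $n$ grows, consistent with $\mathbb{E}[\max_i Z_i] \le C$.

```python
import numpy as np

rng = np.random.default_rng(0)

def max_ratio(n, rng):
    """Empirical max over i <= n of |X_i| / sqrt(1 + log i),
    for i.i.d. standard Gaussian X_i (a sub-gaussian sequence)."""
    i = np.arange(1, n + 1)
    g = np.abs(rng.standard_normal(n))
    return np.max(g / np.sqrt(1.0 + np.log(i)))

# Average over a few trials; the estimate should stay O(1) as n grows,
# even though max_i |X_i| itself grows like sqrt(2 log n).
for n in (10**3, 10**4, 10**5):
    est = float(np.mean([max_ratio(n, rng) for _ in range(20)]))
    print(n, round(est, 3))
```

Of course this only illustrates the independent Gaussian case; the point of the exercise is that the bound needs no independence at all.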

Daniel Li

The answer using Borel–Cantelli does not seem correct: $\mathbb{P}(\limsup \Omega_i) = 0$ only gives a random $N = N(\omega)$ beyond which $Z_i < a$, not a single deterministic $N$ that works with probability 1, so the expectation cannot be split as above.

Actually, we can adapt the solution by Behrad Moniri and split the integral at the point $\frac{2}{\sqrt{c}}K$. We then bound the Gaussian tail integral $\int_{\frac{2}{\sqrt{c}}K}^\infty\exp\left(-\frac{c(1+\log i)t^2}{K^2}\right)dt$ by a change of variables together with the standard normal tail bound $1-\Phi(z) \leq \frac{1}{z\sqrt{2\pi}}\exp(-z^2/2)$.

Finally, using the convergence of $\sum_i \frac{1}{i^2}$, we can show that the expectation is bounded by $CK$.
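In more detail, here is a sketch of that computation with the same notation (the exponent on $i$ depends on the split point; any exponent greater than $1$ suffices, and this particular split gives $i^{-4}$). Substituting $z = \frac{t\sqrt{2c(1+\log i)}}{K}$,

$$\int_{\frac{2}{\sqrt{c}}K}^\infty \exp\left(-\frac{c(1+\log i)t^2}{K^2}\right)dt = \frac{K}{\sqrt{2c(1+\log i)}}\int_{z_0}^\infty e^{-z^2/2}\,dz, \qquad z_0 = 2\sqrt{2(1+\log i)},$$

and the tail bound gives

$$\int_{z_0}^\infty e^{-z^2/2}\,dz \le \frac{1}{z_0}e^{-z_0^2/2} = \frac{e^{-4}}{z_0}\,i^{-4}.$$

Hence the $i$-th tail term is at most $C\frac{K}{\sqrt{c}}\,i^{-4}$, which is summable over $i$; adding the contribution $\frac{2}{\sqrt{c}}K$ from the integral over $\big[0, \frac{2}{\sqrt{c}}K\big]$ yields $\mathbb{E}\max_i \frac{|X_i|}{\sqrt{1+\log i}} \le CK$, since $c$ is an absolute constant.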