
The following is Exercise 2.5.10 from the book High Dimensional Probability by Vershynin:

Let $X_1, X_2,\dots$ be an infinite sequence of sub-gaussian random variables which are not necessarily independent. Show that $$ \mathbb{E}\max_{i}\frac{|X_i|}{\sqrt{1+\log i}}\leq CK, $$ where $K=\max_{i}\|X_i\|_{\psi_{2}}$. Deduce that for every $N\geq 2$ we have $$ \mathbb{E}\max_{i\leq N}|X_i|\leq CK \sqrt{\log N}. $$
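(For completeness, the deduction of the second bound from the first is the straightforward part: since $\sqrt{1+\log i}\leq\sqrt{1+\log N}$ for $i\leq N$, $$ \mathbb{E}\max_{i\leq N}|X_i| \leq \sqrt{1+\log N}\;\mathbb{E}\max_{i\leq N}\frac{|X_i|}{\sqrt{1+\log i}} \leq \sqrt{1+\log N}\;\mathbb{E}\max_{i}\frac{|X_i|}{\sqrt{1+\log i}} \leq CK\sqrt{1+\log N}, $$ and $1+\log N\leq\left(1+\tfrac{1}{\log 2}\right)\log N$ for $N\geq 2$, so the extra constant can be absorbed into $C$. My difficulty is with the first bound.)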

This question has been asked before here, and the first answer is pretty much what I came up with myself (with the only difference being the way I split the integral). However, after trying to validate my solution, I found the following issue: Let $c_i$ be such that $$\mathbb{P}(|X_i|\geq t) \leq 2\exp(-c_i t^2 / \|X_i\|_{\psi_2}^2) \leq 2\exp(-c_i t^2 / K^2).$$ When applying the union bound (inside the integral) we have $$\mathbb{P}\left(\max_{i}\frac{|X_i|}{\sqrt{1+\log i}} \geq t \right) \leq \sum_{i=1}^\infty \mathbb{P}(|X_i|\geq t\sqrt{1+\log i}) \leq \sum_{i=1}^\infty 2\exp(-c_i t^2 (1+\log i)/K^2).$$ To get a uniform bound (like in the other answer) we can take $c=\inf_{i}c_i$. If $c>0$ then everything works out, but what happens when $c=0$? How can this case be handled? Or is there a way to argue that $c>0$?
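For reference, here is a sketch of the computation I have in mind (following the linked answer), assuming the uniform constant $c=\inf_i c_i$ is strictly positive, which is exactly the assumption I do not know how to justify. With $A:=K\sqrt{2/c}$, $$ \mathbb{E}\max_{i}\frac{|X_i|}{\sqrt{1+\log i}} = \int_0^\infty \mathbb{P}\left(\max_{i}\frac{|X_i|}{\sqrt{1+\log i}}\geq t\right)dt \leq A + \int_A^\infty \sum_{i=1}^\infty 2\exp\left(-\frac{c t^2(1+\log i)}{K^2}\right)dt $$ $$ = A + \int_A^\infty 2\exp\left(-\frac{c t^2}{K^2}\right)\sum_{i=1}^\infty i^{-c t^2/K^2}\,dt \leq A + \frac{\pi^2}{3}\int_A^\infty \exp\left(-\frac{c t^2}{K^2}\right)dt \leq \frac{CK}{\sqrt{c}}, $$ using that $c t^2/K^2\geq 2$ on the domain of integration, so $\sum_i i^{-c t^2/K^2}\leq\sum_i i^{-2}=\pi^2/6$, and that $\int_0^\infty e^{-c t^2/K^2}dt = \tfrac{K}{2}\sqrt{\pi/c}$. The final bound is of the form $CK$ only if $c$ is bounded away from $0$, hence my question.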
