
Let $b_1,\dots,b_n$ be real numbers and $\varepsilon_1,\dots,\varepsilon_n$ be independent Rademacher random variables. Khintchine's inequality states that $$\mathrm{E}\left [ \left ( \sum_{i=1}^{n} b_i\varepsilon_i \right )^{2p}\right ]\leqslant \frac{\left ( 2p \right )!}{2^pp!}\left ( \sum_{i=1}^{n}b_i^2 \right )^p$$ for every integer $p \geqslant 1$.

I'm trying to prove that the constant $\frac{\left ( 2p \right )!}{2^pp!}$ is optimal, in the sense that no strictly smaller constant independent of the dimension $n$ can work for every Rademacher sum.

Since $\frac{\left ( 2p \right )!}{2^pp!}$ is the $2p$-th moment of a standard normal variable, my idea was to approximate a well-chosen Rademacher sum by a standard normal variable to obtain optimality.

Let $b_1=\dots=b_n=1$. The central limit theorem ensures that $Z_n=\frac{1}{\sqrt{n}}\sum_{i=1}^{n}\varepsilon_i$ converges in distribution to a random variable $X$ with distribution $\mathcal{N}(0,1)$.

If that implied that $$\lim_{n\rightarrow\infty}\mathrm{E}[Z_n^{2p}] = \mathrm{E}[X^{2p}]$$ then we would have $$\lim_{n\rightarrow\infty}\frac{1}{n^p}\mathrm{E}\left [ \left ( \sum_{i=1}^{n}\varepsilon_i \right )^{2p}\right ] = \frac{\left ( 2p \right )!}{2^pp!}$$ which would prove the optimality.

So my question really is: is it true that $\lim_{n\rightarrow\infty}\mathrm{E}[Z_n^{2p}] = \mathrm{E}[X^{2p}]$? I don't think the dominated convergence theorem applies here, since $Z_n$ is not bounded.
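As a numerical sanity check (a sketch, not part of a proof; the function names are mine), $\mathrm{E}[Z_n^{2p}]$ can be computed exactly, since $\sum_{i=1}^n \varepsilon_i = 2B - n$ with $B\sim\mathrm{Binomial}(n,\frac12)$, and compared with the Gaussian moment $\frac{(2p)!}{2^pp!}$:

```python
from fractions import Fraction
from math import comb, factorial

def exact_moment(n, p):
    """Exact E[Z_n^(2p)], where Z_n = (1/sqrt(n)) * sum of n Rademacher signs.
    The sum equals 2k - n with probability C(n, k) / 2^n."""
    total = sum(comb(n, k) * (2 * k - n) ** (2 * p) for k in range(n + 1))
    return float(Fraction(total, 2 ** n * n ** p))

def gaussian_moment(p):
    """E[X^(2p)] = (2p)! / (2^p p!) for X ~ N(0, 1); always an integer."""
    return factorial(2 * p) // (2 ** p * factorial(p))

for n in (10, 100, 1000):
    print(n, exact_moment(n, 2), gaussian_moment(2))  # approaches 3 from below
```

For $p=2$ one can check by hand that $\mathrm{E}[Z_n^4] = 3 - \frac{2}{n}$, consistent with the output and with the claimed limit $\frac{4!}{2^2\cdot 2!}=3$.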

The interpretation of convergence in distribution in terms of pointwise convergence of characteristic functions yields $\forall t \in \mathbb{R},\ \lim_{n\rightarrow\infty} \cos\left(\frac{t}{\sqrt{n}}\right)^n=e^{-\frac{t^2}{2}}$. Could that be of any use?

1 Answer


Indeed, the dominated convergence theorem does not work directly, since $\sup_n \lvert Z_n\rvert$ is not integrable.

However, we can obtain the desired convergence if we show that for each $p\geqslant 1$, the sequence $\left(Z_n^{2p}\right)$ is uniformly integrable; see here for the details.

If we show that $\sup_{n\geqslant 1}\mathbb E\left[Z_n^{2p}\right]$ is finite for each $p\geqslant 1$, we get uniform integrability: to show that $\left(Z_n^{2p_0}\right)$ is uniformly integrable for some $p_0$, we use the fact that $\left(Z_n^{2(p_0+1)}\right)$ is bounded in $\mathbb L^1$.

Now $\mathbb E\left[Z_n^{2p}\right]$ can be estimated by Khintchine's inequality, which bounds it independently of $n$.

Alternatively, one can start from $$ \left\lvert\mathbb E\left[Z_n^{2p}\right]-\mathbb E\left[X^{2p}\right]\right\rvert =2p\left\lvert \int_0^\infty t^{2p-1}\left( \mathbb P\left(\left\lvert Z_n\right\rvert >t\right)-\mathbb P\left(\left\lvert X\right\rvert>t\right)\right)dt\right\rvert, $$ and split the integral into two parts: the part on $(0,A)$ can be bounded using the uniform convergence of the c.d.f. of $Z_n$ to that of $X$; for the integral on $(A,+\infty)$, one can use Hoeffding's inequality to show that its contribution vanishes as $A$ goes to infinity, uniformly in $n$.
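The tail bound behind the second part can also be checked numerically: with all coefficients equal, Hoeffding's inequality gives $\mathbb P\left(\left\lvert Z_n\right\rvert>t\right)\leqslant 2e^{-t^2/2}$, with a constant independent of $n$. A small Python sketch (the function name is mine) verifying this against the exact tail probability:

```python
import math
from math import comb

def tail_prob(n, t):
    """Exact P(|Z_n| > t), where Z_n = (2B - n) / sqrt(n), B ~ Binomial(n, 1/2)."""
    hits = sum(comb(n, k) for k in range(n + 1)
               if abs(2 * k - n) > t * math.sqrt(n))
    return hits / 2 ** n

# Hoeffding's bound holds for every n, which is what makes the
# tail integral on (A, +infinity) small uniformly in n.
for n in (10, 50, 200):
    for t in (0.5, 1.0, 2.0, 3.0):
        assert tail_prob(n, t) <= 2 * math.exp(-t ** 2 / 2)
```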

Davide Giraudo
  • Got it - do you think there is a more elementary way of getting the convergence of moments in our specific case? I don't believe I've seen this result before, and it looks like it may not be that easy to prove at first glance – backahast Nov 27 '20 at 14:36
  • @backahast I have added an "alternative" proof, at least without dealing with uniform integrability. – Davide Giraudo Nov 28 '20 at 11:19
  • Just a little detail: it should be $ \mathbb P\left(|Z_n|>t\right)-\mathbb P\left(|X|>t\right)$ instead of $ \mathbb P\left(Z_n>t\right)-\mathbb P\left(X>t\right)$, shouldn't it? It works the same way anyway – backahast Dec 09 '20 at 17:51
  • @backahast You are right, fixed now. – Davide Giraudo May 25 '22 at 09:46