
Assume that $X_{1,1}, \dots , X_{1,n}, X_{2,1},\dots, X_{2,n}, \dots ,X_{n,1}, \dots , X_{n,n}$ are i.i.d. random variables, and that $\mathbb EX_{i,j}$ exists and is finite. From the strong law of large numbers we have $$\max_i\left\{\lim_{n\to\infty}\left\{\frac{1}{n}\sum_{j=1}^{n}X_{i,j}\right\}\right\} \overset{a.s.}{=} \mathbb EX_{i,j} ,$$ because each inner limit equals $\mathbb EX_{i,j}$ almost surely, so $\max_i$ is redundant. However, do we have $$\lim_{n\to\infty}\left\{\max_{1\le i\le n}\left\{\frac{1}{n}\sum_{j=1}^{n}X_{i,j}\right\}\right\} \overset{a.s.}{=} \mathbb EX_{i,j} ? $$ That is, can we change the order of $\max$ and $\lim$ without affecting the result?
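A quick Monte Carlo sketch (not a proof, and using an arbitrary choice of Exponential(1) for the $X_{i,j}$, so that $\mathbb EX_{i,j}=1$) suggests what to expect:

```python
# Monte Carlo sketch: draw an n x n array of i.i.d. Exponential(1)
# variables (mean 1) and compute max_i (1/n) sum_j X_ij. If max and
# lim can be interchanged, this should approach 1 as n grows.
import numpy as np

rng = np.random.default_rng(0)

for n in [10, 100, 1000]:
    X = rng.exponential(scale=1.0, size=(n, n))  # X[i, j] = X_{i,j}
    row_means = X.mean(axis=1)                   # (1/n) * sum_j X_{i,j}
    print(n, row_means.max())
```

The maximum row mean drifts down toward $1$ as $n$ grows, consistent with an affirmative answer.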

  • And your thoughts on this are? – Did Jul 13 '15 at 11:53
  • I'm not sure why 3 have voted to close. Regarding my thoughts, I think the answer is yes, but I have no idea how to prove it. Perhaps it is just obvious, I'm not sure. – David Simmons Jul 13 '15 at 17:07
  • @David K I'm not following your logic. You see that the right hand side is just the left hand side with the limit and maximum interchanged, right? – David Simmons Jul 13 '15 at 22:08
  • I understand what the expectation operator is - if $X_{i,j}$ has density function $f_X(x)$, $\mathbb E [X_{i,j}] = \int xf_{X}(x)dx$. – David Simmons Jul 14 '15 at 07:27
  • I also understand that the strong law of large numbers states that $\lim_{n\to\infty}1/n(X_{i,1}+\cdots +X_{i,n}) \overset{a.s.}{=}\mathbb E [X_{i,j}]$. If we change the order of the $\lim$ and $\max$ operators in the LHS of my equation in the question, the law of large numbers can be applied immediately, which means that the $\max$ operator is redundant. Like I state in the question, my question can be rephrased as, can we change the order of $\lim$ and $\max$? – David Simmons Jul 14 '15 at 07:32
  • I did mean that, yes. – David Simmons Jul 14 '15 at 07:32
  • @David K On reanalysis of my question, I realise that I did not make it clear that the i.i.d. nature of the RVs applies over all $i$ also. This has been edited. – David Simmons Jul 14 '15 at 07:37
  • OK, now I get it (I think). You mean $\stackrel{a.s.}=$ where you write $=$. So I feel a little dense now. Good question. (Might be worth editing the $\stackrel{a.s.}=$ and a bit of your comment about the strong LOLN into the question itself, just to make it a bit more obvious.) – David K Jul 14 '15 at 12:41
  • @David K, thanks for your input, I've made the changes. – David Simmons Jul 14 '15 at 13:27
  • If with $\underset{i}{\max}$ you mean to take the max for all $i\in\mathbb{N}$, I'd guess no. The probability of an $i\in\mathbb{N}$ existing such that $$\sum_{j\in\mathbb{N}}X_{i,j} = n\cdot(\mu+\epsilon)$$ for some $\epsilon>0$ is 1. – Nearoo Dec 10 '20 at 17:48
  • Sorry, I meant $j\in[n]$. – Nearoo Dec 10 '20 at 17:56
  • In general, you can't swap a maximum with a limit. For the equality to hold, you need an estimate of $\max_{i\leq n} \frac{1}{n}\sum_{j=1}^n X_{i,j} - \mathbb{E}[X_{1,1}]$. Then you need to see if this goes to $0$ with $n$. Assuming $X_{i,j}$ have finite variance, you can apply the central limit theorem and the second answer to this question: https://math.stackexchange.com/questions/89030/expectation-of-the-maximum-of-gaussian-random-variables, to estimate that the expression does converge to 0. Sorry this isn't a full answer, I don't have time to write it out. – forgottenarrow Dec 11 '20 at 02:14

1 Answer


If all moments exist, then the answer is as follows.

Without loss of generality (after an affine transformation $X\rightarrow aX+b$), I assume $$\mathbb{E}[X]=0; \hspace{3mm}\mathbb{E}[X^2]=1.$$ Then the cumulant generating function $$g(s)\equiv \log\mathbb{E}[e^{sX}]$$ satisfies $$g(0)=0; \hspace{3mm}g^\prime(0)=0; \hspace{3mm}g^{\prime\prime}(0)=1.$$ For each $i$, the variate $Y_i\equiv\frac1n\sum_jX_{ij}$ has $$f(s)\equiv\log\mathbb{E}[e^{sY}]=ng(s/n)=\frac{s^2}{2n}+\mathcal{O}\!\left(\frac{1}{n^2}\right),$$ so the density of $Y$ is $$p(y)=\sqrt{\frac{n}{2\pi}}e^{-\frac{ny^2}{2}}+\frac{e_n(y)}{n^2}$$ for some bounded function $e_n(y)$. Now the question is: what is the asymptotic behavior of $$Z\equiv\max_{i\in[n]}Y_i?$$ Let's start with the cumulative distribution function. Since the $Y_i$ are independent, $$F_Z(z)=F_Y^n(z)=\Big(F_G(z\sqrt n)+\frac{e_n(z)}{n^2}\Big)^n$$ $$= F_G^n(z\sqrt n)+\frac{e_n(z)}{n}+\mathcal{O}\!\left(\frac{1}{n^2}\right),$$ where $F_G$ is the cumulative distribution function of a standard gaussian. This approximation holds for all real $z$ except when $|z|\ll \frac{1}{\sqrt{n}}$.

Next, let us estimate the moments of $Z$ using the known results about the maximum of a set of independent gaussian variables: the expected maximum of $n$ standard gaussians grows like $\sqrt{2\log n}$, and here each $Y_i$ has standard deviation $\approx 1/\sqrt n$. Hence $$\mathbb{E}[Z]=\int z F_Z^\prime(z)\,dz\sim\sqrt{\frac{2\log n}{n}}$$ $$\mathbb{E}[Z^2]=\int z^2 F_Z^\prime(z)\,dz\sim\frac{2\log n}{n}$$ Both vanish as $n\to\infty$, which gives $$\boxed{Z\xrightarrow{a.s.}0},$$ so, undoing the affine transformation, $\max_i\frac1n\sum_j X_{i,j}\to\mathbb E X_{1,1}$. There are (small) gaps in my solution but the bounty is due, I had to hurry!
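The gaussian-maximum scaling used above can be checked numerically; here is a small sketch under the approximation $Y_i \approx \mathcal N(0, 1/n)$:

```python
# Numerical sketch: if the row means Y_i are approximately N(0, 1/n),
# then Z = max_i Y_i should concentrate near sqrt(2 log n / n), the
# classical growth rate sqrt(2 log n) for the maximum of n independent
# standard gaussians, scaled by the standard deviation 1/sqrt(n).
import numpy as np

rng = np.random.default_rng(1)

for n in [100, 1000, 10000]:
    Y = rng.normal(scale=1 / np.sqrt(n), size=(200, n))  # 200 trials of n row means
    Z = Y.max(axis=1)                                    # Z = max_i Y_i per trial
    print(n, Z.mean(), np.sqrt(2 * np.log(n) / n))
```

Both columns shrink together as $n$ grows, consistent with $\mathbb E[Z]\to 0$.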

K. Sadri