
$$\lim_{n\to\infty}\int_{0}^{1}\int_{0}^{1}\cdots\int_{0}^{1}\left(\frac{x_{1}+x_{2}+\cdots+x_{n}}{n}\right)^{2}dx_{1}\,dx_{2}\cdots dx_{n}$$

My first thought was to use the Lebesgue monotone convergence theorem, but that led to a dead end. Is there a shortcut for solving this problem?
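For intuition, a quick Monte Carlo experiment (an illustrative sketch only; the helper name `mc_integral` is my own) suggests the limit should be $1/4$:

```python
# Monte Carlo sanity check of the n-dimensional integral (not a proof).
import numpy as np

rng = np.random.default_rng(0)

def mc_integral(n, samples=10_000):
    """Estimate the integral of ((x_1+...+x_n)/n)^2 over [0,1]^n."""
    x = rng.random((samples, n))         # uniform sample points in [0,1]^n
    return np.mean(x.mean(axis=1) ** 2)  # average of the integrand values

for n in (1, 2, 10, 100, 1000):
    print(n, mc_integral(n))
# The estimates start near 1/3 (n = 1) and drift toward 0.25 as n grows.
```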

3 Answers


The integral can be rewritten as $$ E\left[\left(\frac{1}{n}\sum_{i=1}^n U_i\right)^2\right] $$ where $U_1,\dots,U_n$ are independent random variables, each uniformly distributed on $[0,1]$. This gives $$\begin{eqnarray} E\left[\left(\frac{1}{n}\sum_{i=1}^n U_i\right)^2\right]&=&\frac{1}{n^2}\sum_{i,j=1}^n E\left[U_iU_j\right]\\ &=&\frac{1}{n^2}\sum_{i=1}^n E\left[U_i^2\right] +\frac{1}{n^2}\sum_{i\ne j}E\left[U_iU_j\right]\\ &=&\frac{n}{n^2}\cdot\frac{1}{3}+\frac{n(n-1)}{n^2}\cdot\frac{1}{4}\to \frac{1}{4}, \end{eqnarray}$$ since $E[U_i]=\int_0^1 x\,dx=\frac{1}{2}$, $E[U_i^2]=\int_0^1 x^2\,dx =\frac{1}{3}$, and, by independence, $E[U_iU_j]=E[U_i]\,E[U_j]=\frac{1}{2}\cdot\frac{1}{2}=\frac{1}{4}$ for all $i\ne j$.
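As a quick cross-check of this closed form, the following sketch (illustrative only; the helper name is mine) evaluates $\frac{1}{n^2}\left(n\cdot\frac13+n(n-1)\cdot\frac14\right)$ with exact rational arithmetic:

```python
# Exact evaluation of E[((U_1+...+U_n)/n)^2] = (n/3 + n(n-1)/4) / n^2.
from fractions import Fraction

def second_moment_of_mean(n):
    return (n * Fraction(1, 3) + n * (n - 1) * Fraction(1, 4)) / n**2

for n in (1, 2, 10, 100, 10_000):
    value = second_moment_of_mean(n)
    print(n, value, float(value))
# Prints 1/3, 7/24, ..., with the values approaching 1/4 from above.
```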

Myunghyun Song

We have $$(x_1+x_2+\cdots +x_n)^2=\sum_{i} x_i^2+\sum_{i\ne j} x_ix_j.$$ First note that $$\int_{0}^{1}\int_{0}^{1}\cdots \int_{0}^{1}x_i^{2}\,dx_{1}\,dx_{2}\cdots dx_{n}=\int_0^1 x_i^2\,dx_i={1\over 3}$$ and, for $i\ne j$, $$\int_{0}^{1}\int_{0}^{1}\cdots \int_{0}^{1}x_ix_j\,dx_{1}\,dx_{2}\cdots dx_{n}={1\over 4}.$$ Therefore $$\int_{0}^{1}\int_{0}^{1}\cdots \int_{0}^{1}\left( x_{1}+x_{2}+\cdots+x_{n} \right)^{2}dx_{1}\,dx_{2}\cdots dx_{n}=n\cdot {1\over 3}+(n^2-n)\cdot{1\over 4},$$ and dividing by $n^2$ we obtain $$\int_{0}^{1}\int_{0}^{1}\cdots \int_{0}^{1}\left( \frac{x_{1}+x_{2}+\cdots+x_{n}}{n} \right)^{2}dx_{1}\,dx_{2}\cdots dx_{n}={1\over 3n}+{n-1\over 4n},$$ which shows that the limit is $1\over 4$.
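To double-check the intermediate identity $\int_{[0,1]^n}(x_1+\cdots+x_n)^2\,dx=n\cdot\frac13+(n^2-n)\cdot\frac14$, one can compare it against a direct symbolic integration for a small $n$ (a sketch using SymPy, assuming it is installed):

```python
# Symbolic cross-check of the identity for n = 3 using SymPy.
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')
n = 3
direct = sp.integrate((x1 + x2 + x3)**2, (x1, 0, 1), (x2, 0, 1), (x3, 0, 1))
formula = sp.Rational(n, 3) + sp.Rational(n**2 - n, 4)
print(direct, formula)  # both print 5/2
```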

Mostafa Ayaz
  • Sorry, the first line is not clear to me. If we say $(x+y+z)^2=x^2+y^2+z^2+2(xy+xz+yz)$, then by induction you get the following result: ${\left( \sum_{i=1}^{n}x_{i} \right)}^{2}=\sum_{i=1}^{n}x_{i}^{2}+2\sum_{i\ne j}^{n}x_{i}x_{j}$ – Ramez Hindi Jan 13 '19 at 21:34
  • In fact you get $(\sum_i x_i)^2=x_1^2+\cdots + x_n^2+2\sum_{i<j}x_ix_j$. The multiplicity has been considered in my equation since $2x_ix_j=x_ix_j+x_jx_i$ whenever $i\ne j$ – Mostafa Ayaz Jan 13 '19 at 21:38

Suppose $X_1,X_2,\ldots$ are independent random variables, each uniformly distributed on $[0,1]$. Then their common expectation exists and equals $\mu=1/2$.

Define $$\overline X_n=\frac{1}{n}\sum_{k=1}^n X_k$$

By Khintchine's weak law of large numbers, $$\overline X_n\stackrel{P}{\longrightarrow}\mu\quad\text{ as }\quad n\to\infty$$

And by the continuous mapping theorem, $$\overline X_n^2\stackrel{P}{\longrightarrow}\mu^2\quad\text{ as }\quad n\to\infty\tag{1}$$

Moreover, $$0\le X_1,\ldots,X_n\le 1\implies 0\le \overline X_n\le 1\implies 0\le \overline X_n^2\le 1\tag{2}$$

$(1)$ and $(2)$ together imply, by the dominated convergence theorem (which also holds under convergence in probability), $$\int_{[0,1]^n}\left(\frac{x_1+\cdots+x_n}{n}\right)^2\mathrm{d}x_1\ldots\mathrm{d}x_n = E\left(\overline X_n^2\right)\stackrel{n\to\infty}{\longrightarrow}\frac{1}{4}$$
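A small simulation illustrates both ingredients of this argument: $\overline X_n$ concentrates around $\mu=1/2$, and since $\overline X_n^2$ is bounded by $1$, $E\left(\overline X_n^2\right)$ approaches $1/4$. (Sketch only; the parameter choices are mine.)

```python
# Simulated concentration of the sample mean of uniforms (weak LLN),
# together with the resulting convergence E[Xbar_n^2] -> 1/4.
import numpy as np

rng = np.random.default_rng(42)
eps = 0.05

for n in (10, 100, 1000):
    xbar = rng.random((10_000, n)).mean(axis=1)  # 10000 draws of Xbar_n
    p_far = np.mean(np.abs(xbar - 0.5) > eps)    # P(|Xbar_n - 1/2| > eps) -> 0
    print(n, p_far, np.mean(xbar ** 2))          # E[Xbar_n^2] -> 0.25
```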

StubbornAtom
  • It’s always nice to see an easygoing probability solution to what appears to be a monstrous analytic problem. +1 – Nap D. Lover Jan 13 '19 at 20:08
  • There must be some confusion. I upvoted both yours and Song’s answer for the use of probability. – Nap D. Lover Jan 13 '19 at 20:33
  • $X_n\to X$ in probability isn't enough to conclude that $\mathbb{E}[X_n]\to\mathbb{E}[X]$. It would be better to use the strong law of large numbers, plus dominated convergence. – carmichael561 Jan 15 '19 at 00:27
  • @carmichael561 Thank you. What if I add the fact that $|X_n|<1$ to my answer? I think that salvages the argument. – StubbornAtom Jan 15 '19 at 04:59
  • Why not change weak law of large numbers and convergence in probability to strong law of large numbers and almost sure convergence and make your life easy? It certainly applies here. – carmichael561 Jan 16 '19 at 23:35