Suppose $X_1,X_2,\ldots$ are independent random variables, each uniformly distributed on $[0,1]$. Then these variables have a common expectation, which exists and equals $\mu=E(X_1)=\int_0^1 x\,\mathrm{d}x=\frac{1}{2}$.
Define $$\overline X_n=\frac{1}{n}\sum_{k=1}^n X_k$$
By Khintchine's weak law of large numbers, $$\overline X_n\stackrel{P}{\longrightarrow}\mu\quad\text{ as }\quad n\to\infty$$
By the continuous mapping theorem (applied to the continuous function $x\mapsto x^2$), $$\overline X_n^2\stackrel{P}{\longrightarrow}\mu^2=\frac14\quad\text{ as }\quad n\to\infty\tag{1}$$
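(Not part of the argument, just an illustration of $(1)$: a minimal Monte Carlo sketch in Python/NumPy, estimating $P\left(\left|\overline X_n^2-\tfrac14\right|>\varepsilon\right)$ by simulation. The replication count $m$, the tolerance $\varepsilon$, and the seed are arbitrary choices.)

```python
import numpy as np

rng = np.random.default_rng(0)
eps = 0.01   # tolerance in the convergence-in-probability statement (1); arbitrary
m = 2_000    # number of independent replications of (X_1, ..., X_n); arbitrary

for n in (10, 100, 1_000, 10_000):
    # Each row is one realisation of (X_1, ..., X_n); take the row means to get Xbar_n.
    xbar = rng.uniform(0.0, 1.0, size=(m, n)).mean(axis=1)
    # Empirical estimate of P(|Xbar_n^2 - 1/4| > eps); it shrinks toward 0 as n grows.
    print(n, np.mean(np.abs(xbar**2 - 0.25) > eps))
```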
Moreover, $$0\le X_1,\ldots,X_n\le 1\implies 0\le \overline X_n\le 1\implies 0\le \overline X_n^2\le 1\tag{2}$$
Since convergence in probability together with a uniform bound implies convergence of expectations (bounded convergence theorem), $(1)$ and $(2)$ give $E\left(\overline X_n^2\right)\to\frac14$. Finally, $(X_1,\ldots,X_n)$ is uniformly distributed on $[0,1]^n$ with joint density $1$, so $$\int_{[0,1]^n}\left(\frac{x_1+\cdots+x_n}{n}\right)^2\mathrm{d}x_1\cdots\mathrm{d}x_n = E\left(\overline X_n^2\right)\stackrel{n\to\infty}{\longrightarrow}\frac{1}{4}$$
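As a sanity check (not needed for the argument above), the expectation can also be computed exactly: since the $X_k$ are independent with $E(X_k)=\frac12$ and $\operatorname{Var}(X_k)=\frac1{12}$, $$E\left(\overline X_n^2\right)=\operatorname{Var}\left(\overline X_n\right)+\left(E\,\overline X_n\right)^2=\frac{1}{12n}+\frac14\;\stackrel{n\to\infty}{\longrightarrow}\;\frac14$$ which agrees with the limit obtained from $(1)$ and $(2)$.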