I've been reading a text on probability theory that thoroughly covered the distribution of the sum of i.i.d. random variables, but on the last page it made a comment about the distribution of the product of i.i.d. random variables that confused me:
They said that if you had i.i.d. random variables $Y_1, Y_2, \ldots$ that are uniformly distributed on $(0,1)$, then the distribution of $Y_1 Y_2 \cdots Y_n$ is approximately that of $e^X$, where $X$ is normal with mean $n\mu$ and standard deviation $\sigma\sqrt{n}$ (for some $\sigma > 0$ and real $\mu$). Can someone please explain why this is?
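To see the claim numerically, here is a quick simulation sketch (using NumPy; the seed, $n$, and trial count are just values I picked) that samples $\log(Y_1 \cdots Y_n)$ and checks its mean and standard deviation:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 50, 100_000

# Each row is one realization of Y_1, ..., Y_n ~ Uniform(0,1);
# log of the product is the sum of the logs.
log_products = np.log(rng.uniform(size=(trials, n))).sum(axis=1)

# For Uniform(0,1): E[log Y] = -1 and Var[log Y] = 1, so the book's claim
# would predict mean ≈ -n and std ≈ sqrt(n).
print(log_products.mean(), log_products.std())  # ≈ -50, ≈ 7.07
```

The output does come out close to $-n$ and $\sqrt{n}$, which suggests $\mu = -1$ and $\sigma = 1$ in this case, but I still don't see how to justify the approximately normal shape itself.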