I only understand Monte Carlo methods from a non-rigorous point of view, because unfortunately I didn't study mathematics formally.
For example, to find the minimum value of a function $f(x)$ on a given interval $[a,b]$:
- Generate $N$ uniform random numbers $x_i$ in the interval, $a\le x_i\lt b$, for $i=1,\cdots,N$.
- Evaluate $f(x_i)$ for $i=1,\cdots,N$.
- The approximate minimum value $m$ of $f(x)$ on the given interval is then $m= \min(f(x_1),f(x_2), \cdots, f(x_N))$; see the sketch after this list.
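In case it is useful, here is a minimal sketch of the procedure above (assuming NumPy; the function name `mc_minimum` is just my label, not part of any library):

```
import numpy as np

def mc_minimum(f, a, b, N, rng=None):
    """Estimate the minimum of f on [a, b) from N uniform random samples."""
    rng = np.random.default_rng() if rng is None else rng
    x = rng.uniform(a, b, size=N)   # step 1: N uniform points in [a, b)
    y = f(x)                        # step 2: evaluate f at each point
    return np.min(y)                # step 3: take the smallest value as m
```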
For $N$ sufficiently large, this gives a good approximation of the true minimum. If we fix $N$ and run the algorithm $n$ times, the resulting values $m_k$, for $k = 1, \cdots, n$, tend to follow a normal distribution.
But in the following example I get only half of a normal distribution. For $$f(x) = (x-3)^2+5$$ on the interval $[1, 8]$, with $N=1000$ and $n=10000$, I get something like this:

[histogram of $m_k$]
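For concreteness, the $m_k$ values and the histogram can be produced along these lines (a minimal sketch using NumPy and Matplotlib; not necessarily the exact code I used):

```
import numpy as np
import matplotlib.pyplot as plt

f = lambda x: (x - 3)**2 + 5          # the example function
a, b, N, n = 1, 8, 1000, 10000

rng = np.random.default_rng()
# run the algorithm n times, each run using N fresh uniform samples
m = np.array([np.min(f(rng.uniform(a, b, size=N))) for _ in range(n)])

plt.hist(m, bins=50)                  # histogram of the m_k values
plt.xlabel(r"$m_k$")
plt.ylabel("count")
plt.show()
```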
Could you tell me, mathematically, in which cases we do not get a normal distribution of the $m_k$ values but only the half-normal distribution shown in the histogram above? (By the halves, I mean the two parts of the histogram/normal distribution graph that are symmetric about the mean value.)
Edit: Please note that $n \neq N$: $N$ is the number of random numbers, while $n$ is the number of times the algorithm is run for a fixed $N$.
Thank you