Suppose I toss a coin until I get the first heads, and I repeat that experiment $n$ times. I then take the maximum number of tosses that were needed to get heads (out of the $n$ results obtained). I expected that value to be $\log_2 n$ on average, but I ran a simulation in MATLAB and the results didn't quite match.
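To state the setup precisely (assuming a fair coin and independent runs): each run gives a variable $X_i$ with $P(X_i = k) = 2^{-k}$ for $k \ge 1$, and I'm looking at $M_n = \max(X_1, \dots, X_n)$, so $P(M_n \le k) = \left(1 - 2^{-k}\right)^n$.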
The results were always around 1.3 greater than that, even with sample sizes in the millions. The weird thing is that the histogram of the number of tosses needed to get heads for the first time is spot on, so I know the problem isn't in my code but in my math. The linked plots show what I got from running the experiment 10000 times, each time recording $2^{14}$ values and taking the maximum number of tosses.
I expected the mean of the maximum to be 14, but it averaged 15.3 with a standard deviation of 1.89. My question is: how can I derive this mean and standard deviation mathematically, and how can I find the expression for the distribution so I can plot that curve?
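For reference, here is a minimal sketch of the kind of simulation I ran (illustrative only, not my actual script; it samples the geometric variable by inverse-CDF so it needs no toolbox):

```matlab
n    = 2^14;       % coin-toss experiments per run
reps = 10000;      % number of runs of the whole experiment
M    = zeros(reps, 1);

for r = 1:reps
    % Number of tosses until the first heads: inverse-CDF sample of a
    % Geometric(1/2) variable on {1, 2, 3, ...}
    X    = ceil(log(rand(1, n)) / log(0.5));
    M(r) = max(X);
end

fprintf('mean of max = %.3f, std of max = %.3f\n', mean(M), std(M));
histogram(M)       % empirical distribution of the maximum
```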
Also, please correct me if I'm wrong, but is the reason this doesn't follow a normal distribution that the result is restricted to $[1, \infty)$, i.e. it is truncated?