Can you please help verify whether what I have done is correct for the question below? I applied Chebyshev's theorem in the first step, but I am worried that there are mathematical errors or misapplied theorems in my solution. Thanks for the help.
- $\ni^2$ is definitely a new one for me... – cardinal Oct 17 '11 at 23:36
- All of the $\ni$'s should probably have been $\varepsilon$. – hmakholm left over Monica Oct 17 '11 at 23:45
- I don't know why you have $-1$ in the numerator after $\operatorname{var}(S_n)$. All you really need to know about $\operatorname{var}(S_n)$ is that it is finite. That can be deduced from the fact that $E(X^4)<\infty$. You seem to have done everything else. – Michael Hardy Oct 18 '11 at 00:48
- This application of Chebyshev doesn't quite work, since the mean of $S_n/\sigma$ is not equal to 1. You should follow Mike's lead below and consider $S_n^2/\sigma^2$ instead. Then you can use the result in http://math.stackexchange.com/questions/72975/variance-of-sample-variance/73080#73080 to conclude. – Oct 18 '11 at 01:33
- Is $\operatorname{var}(S_n) = \sigma^2/\sqrt{n}$? Otherwise I don't know how to make the limit go to 0. – MathMan Oct 18 '11 at 01:35
- @Byron Why does the mean of $S_n/\sigma$ not equal 1? Isn't $E(S_n) = \sigma$? – MathMan Oct 18 '11 at 01:40
- $S_n^2$ is an unbiased estimator for $\sigma^2$; $S_n$ is not an unbiased estimator for $\sigma$. By the way, you make the variance go to zero using http://math.stackexchange.com/questions/72975/variance-of-sample-variance/73080#73080 – Oct 18 '11 at 02:00
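For reference, the corrected Chebyshev step suggested in the comments can be sketched as follows (using the variance-of-sample-variance formula from the linked answer, with $\mu_4 = E\big[(X-\mu)^4\big]$, which is finite since $E(X^4)<\infty$). Because $E(S_n^2)=\sigma^2$, the ratio $S_n^2/\sigma^2$ has mean exactly 1, so Chebyshev applies directly:

$$
P\!\left(\left|\frac{S_n^2}{\sigma^2} - 1\right| \ge \varepsilon\right)
\;\le\; \frac{\operatorname{var}(S_n^2)}{\sigma^4 \varepsilon^2}
\;=\; \frac{1}{\sigma^4 \varepsilon^2}\left(\frac{\mu_4}{n} - \frac{(n-3)\,\sigma^4}{n(n-1)}\right)
\;\longrightarrow\; 0 \quad \text{as } n \to \infty,
$$

which gives $S_n^2/\sigma^2 \to 1$ in probability.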
1 Answer
One way to do this is to show that $\frac{S_n^2}{\sigma^2}$ converges in probability to 1 using the same method you tried (invoking Chebyshev's Inequality), and then noting (or proving, if you didn't already know it) that if $X_1, X_2, \ldots$ converge in probability to $X$ and $g$ is a continuous function, then $g(X_{1}), g(X_{2}), \ldots$ converge in probability to $g(X)$.
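The two-step argument in this answer can be illustrated numerically. The sketch below (my own illustration, not part of the original answer; it assumes normal data with $\sigma = 2$, though the argument only needs $E(X^4)<\infty$) computes the unbiased sample variance $S_n^2$ for increasing $n$ and then applies the continuous map $g(x)=\sqrt{x}$, showing $S_n$ settling near $\sigma$:

```python
# Simulation sketch: S_n^2 -> sigma^2 in probability (Chebyshev), and since
# g(x) = sqrt(x) is continuous, S_n = g(S_n^2) -> sigma in probability.
# Assumptions for illustration only: normal samples, sigma = 2.
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0

for n in [100, 10_000, 1_000_000]:
    x = rng.normal(0.0, sigma, size=n)
    s2 = x.var(ddof=1)   # unbiased sample variance S_n^2 (divides by n - 1)
    s = np.sqrt(s2)      # S_n = g(S_n^2), g continuous
    print(f"n = {n:>9}: |S_n - sigma| = {abs(s - sigma):.5f}")
```

The deviation $|S_n - \sigma|$ shrinks as $n$ grows, consistent with convergence in probability (any single run can fluctuate, which is why the statement is probabilistic rather than pointwise).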