
MLEs are pretty useful for estimating parameters of probability distributions when they are consistent and asymptotically normal. But I suck too much at math and proofs to prove that the relevant regularity conditions hold. For example, one condition for consistency is that the parameter space be compact. The parameter space for the normal distribution's mean and variance is neither bounded nor closed, yet you can still use MLE for the normal distribution. Is there a comprehensive list of all the distributions for which MLEs have been proven to be consistent and asymptotically normal, just for dummies like me? I can't for the life of me find one.

2 Answers


Fundamentals of Statistical Signal Processing: Estimation Theory by Steven M. Kay:

Page $167$ Theorem $7.1$: Maximum likelihood estimators satisfying some regularity conditions are asymptotically normally distributed,

$$\hat\theta\sim\mathcal{N}(\theta,I^{-1}(\theta))$$

asymptotically unbiased, and they asymptotically attain the CRLB (the Cramér-Rao lower bound).

Hence, MLEs are asymptotically efficient and optimal. Since they are asymptotically unbiased and their variance shrinks to zero as $n\to\infty$ (it approaches the CRLB, which goes to $0$), they are also consistent.
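
To see what the theorem is claiming in the simplest case, here is a minimal simulation sketch (not from the book; plain numpy, with arbitrary numbers and seed) for the mean of $\mathcal{N}(\theta,\sigma^2)$ with $\sigma$ known: the MLE $\bar X$ has variance close to the CRLB $\sigma^2/n$, and the standardized estimates look approximately standard normal.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, sigma, n, reps = 2.0, 3.0, 200, 5000

# For N(theta, sigma^2) with sigma known, the MLE of theta is the sample mean.
mles = np.array([rng.normal(theta, sigma, n).mean() for _ in range(reps)])

crlb = sigma**2 / n  # inverse Fisher information I^{-1}(theta) for a sample of size n
print("empirical variance of the MLE:", mles.var())
print("CRLB (1/I(theta)):            ", crlb)

# Standardized MLEs should be approximately N(0, 1).
z = (mles - theta) / np.sqrt(crlb)
print("mean / std of standardized MLEs:", z.mean(), z.std())
```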

Regularity conditions:

$1.$ The derivatives of the log-likelihood function exist

$2.$ The Fisher information is non-zero (a quick numerical check of both conditions for the normal mean is sketched just below)
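
For the normal-mean example, both conditions are easy to verify by hand: the score is $\partial_\theta \log f(x;\theta) = (x-\theta)/\sigma^2$, which exists everywhere, and $I(\theta) = 1/\sigma^2 > 0$. Here is a rough numerical check (mine, not from the book), estimating the Fisher information as the sample variance of the score:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, sigma = 2.0, 3.0
x = rng.normal(theta, sigma, 100_000)

# Condition 1: the score (derivative of the per-observation log-likelihood)
# exists for every x and theta; for the normal mean it is (x - theta) / sigma^2.
score = (x - theta) / sigma**2

# Condition 2: the Fisher information I(theta) = Var(score) must be nonzero.
print("estimated I(theta):", score.var())   # should be close to 1 / sigma^2
print("1 / sigma^2:       ", 1 / sigma**2)
```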

You can also find the same information in the Wikipedia article on maximum likelihood estimation (see the section "Properties").

The compactness assumption matters because without it the MLE may fail to exist: the likelihood may have no maximizer. In your example $\sigma\in\mathbb{R}^+$, and $\mathbb{R}^+$ is indeed not compact with respect to the standard topology. But we know in advance that $\sigma<\infty$, and as $n\to\infty$ the maximizer cannot run off to infinity, so in practice this is not a problem for the normal distribution.
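
To make the compactness point concrete, here is a small sketch (mine, with arbitrary numbers): even though $\sigma$ ranges over the non-compact set $(0,\infty)$, for a normal sample the log-likelihood in $\sigma$ still has an interior maximizer, which matches the closed-form MLE $\hat\sigma=\sqrt{\frac{1}{n}\sum_i(x_i-\bar x)^2}$; the maximizer does not drift toward $0$ or $\infty$.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(0.0, 2.5, 500)

def loglik_sigma(sigma, x):
    """Normal log-likelihood in sigma (up to a constant), with the mean profiled out."""
    n = len(x)
    return -n * np.log(sigma) - np.sum((x - x.mean())**2) / (2 * sigma**2)

# Evaluate on a grid inside (0, inf): the maximizer sits well in the interior,
# nowhere near the "missing" boundary points 0 and infinity.
grid = np.linspace(0.01, 20.0, 20_000)
vals = np.array([loglik_sigma(s, x) for s in grid])
print("grid maximizer: ", grid[vals.argmax()])
print("closed-form MLE:", np.sqrt(np.mean((x - x.mean())**2)))
```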

  • I kinda see the reasoning, but I still suck too much at math to really understand it, let alone make any use of the regularity conditions to tell if an MLE is consistent and asymptotically normal, lol. – BatWannaBe Jan 07 '18 at 01:08
  • Those conditions are needed for the existence of the estimator; nothing too special. Just check the proofs in the book link I gave you. And don't forget, if you like the answer, you may think about upvoting or accepting. – Seyhmus Güngören Jan 07 '18 at 08:51
  • You did name a textbook, which according to another answer usually lists commonly encountered distributions for which MLE can be used to bound parameters, so have my upvote and accept. My poor wallet... – BatWannaBe Jan 08 '18 at 03:31
  • lol I just clicked the link, and it was a pdf for the book =P – BatWannaBe Jan 08 '18 at 03:34
  • @BatWannaBe Thank you. Accepting answers will motivate others to write answers to your future questions. The book is a masterpiece. I am sure it will help you a lot. Good luck! – Seyhmus Güngören Jan 08 '18 at 09:20

I think MLEs are always consistent and asymptotically normal in all finite dimensional exponential families in the iid sampling regime, when the true parameter lies in the interior of the natural parameter space. This will cover many examples in the textbook (including the Gaussian location and scale family) but not (say) the Cauchy location problem, even though consistency and asymptotic normality hold there, too.
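
For example, here is a quick simulation sketch of the Cauchy location case (mine, not a proof; the scale is fixed at $1$ and the sample sizes are arbitrary): the numerically computed MLE concentrates around the true location, and $n\cdot\mathrm{Var}(\hat\theta)$ comes out near $I^{-1}(\theta)=2$, consistent with asymptotic normality even outside the exponential-family setting.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)
theta, n, reps = 1.0, 400, 2000

def cauchy_location_mle(x):
    # No closed form: maximize the standard-scale Cauchy log-likelihood numerically.
    neg_loglik = lambda t: np.sum(np.log1p((x - t)**2))
    return minimize_scalar(neg_loglik, bounds=(-20.0, 20.0), method="bounded").x

mles = np.array([cauchy_location_mle(theta + rng.standard_cauchy(n))
                 for _ in range(reps)])

print("mean of the MLEs:    ", mles.mean())     # close to theta: consistency
print("n * variance of MLEs:", n * mles.var())  # close to 1 / I(theta) = 2
```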

Often conditions (such as the compactness condition you complain about) are framed for mathematical convenience or for expositional convenience and not because they exhaust the range of what's actually true.

– kimchi lover