
The Fisher information is defined as $\mathbb{E}\Bigg[\Bigg( \frac{d \log f(p,x)}{dp} \Bigg)^2\Bigg]$, where $f(p,x)={{n}\choose{x}} p^x (1-p)^{n-x}$ for a Binomial distribution. The derivative of the log-likelihood function is $L'(p,x) = \frac{x}{p} - \frac{n-x}{1-p}$. Now, to get the Fisher information, we need to square it and take the expectation.
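For completeness, this score comes directly from differentiating the log of the pmf:

$$\log f(p,x) = \log{{n}\choose{x}} + x\log p + (n-x)\log(1-p), \qquad \frac{d \log f(p,x)}{dp} = \frac{x}{p} - \frac{n-x}{1-p}.$$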

First, we know that $\mathbb{E}X^2$ for $X \sim Bin(n,p)$ is $n^2p^2 + np(1-p)$. Let's first focus on the content of the parentheses.

$$ \begin{align} \Bigg( \frac{x}{p} - \frac{n-x}{1-p} \Bigg)^2&=\frac{x^2-2nxp+n^2p^2}{p^2(1-p)^2} \end{align} $$

No mistake so far (I hope!).
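Indeed, combining the two fractions over a common denominator confirms this step and gives a more compact form of the score:

$$\frac{x}{p} - \frac{n-x}{1-p} = \frac{x(1-p) - p(n-x)}{p(1-p)} = \frac{x-np}{p(1-p)}, \qquad \text{so} \qquad \Bigg( \frac{x}{p} - \frac{n-x}{1-p} \Bigg)^2 = \frac{(x-np)^2}{p^2(1-p)^2},$$

and expanding $(x-np)^2$ recovers the numerator $x^2-2nxp+n^2p^2$ above.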

$$ \begin{align} \mathbb{E}\Bigg[\Bigg( \frac{x}{p} - \frac{n-x}{1-p} \Bigg)^2\Bigg] &= \sum_{x=0}^n \Bigg( \frac{x}{p} - \frac{n-x}{1-p} \Bigg)^2 {{n}\choose{x}} p^x (1-p)^{n-x} \\ &=\sum_{x=0}^n \Bigg( \frac{x^2-2nxp+n^2p^2}{p^2(1-p)^2} \Bigg) {{n}\choose{x}} p^x (1-p)^{n-x} \\ &= \frac{n^2p^2+np(1-p)-2n^2p^2+n^2p^2}{p^2(1-p)^2}\\ &=\frac{n}{p(1-p)} \end{align} $$
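The same value can be reached without the term-by-term sum: using the common-denominator form of the score, the numerator's expectation is just the variance of $X$,

$$\mathbb{E}\Bigg[\Bigg(\frac{X-np}{p(1-p)}\Bigg)^2\Bigg] = \frac{\mathbb{E}\big[(X-np)^2\big]}{p^2(1-p)^2} = \frac{\operatorname{Var}(X)}{p^2(1-p)^2} = \frac{np(1-p)}{p^2(1-p)^2} = \frac{n}{p(1-p)},$$

which agrees with the sum above.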

The result should be $\frac{1}{p(1-p)}$, but I've been staring at this for a few hours and cannot get a different answer. Please let me know whether I'm making any arithmetic mistakes.

shimee
  • Who told you the result does not depend on $n$? This is absurd. – Did May 21 '13 at 06:39
  • Actually, the problem was dealing with the limiting distribution of a $Bernoulli(p)$ random sample. $\sqrt{n}(\frac{1}{n}\sum X_i - p) \sim \mathcal{N}(0,p(1-p))$ – shimee May 21 '13 at 07:01
  • Since I had previously studied that the limiting distributions are $ \sim \mathcal{N}(0,\frac{1}{J(p)} )$, where $J(p)$ is the Fisher info, I thought that (since sum of Bernoulli $\sim $ Binomial) I could compute the FI of Bin. But apparently I would need to multiply it by $n$ to get the correct result. Does my reasoning make sense? – shimee May 21 '13 at 07:04
  • You might be overlooking the fact that if $X$ is Bin$(n,p)$, then $X$ DOES NOT converge to a Gaussian, rather $(X-np)/\sqrt{n}$ does--hence there is a normalizing factor $1/\sqrt{n}$. – Did May 21 '13 at 12:47
  • In the case of a Bernoulli, which is Binomial$(1,p)$, just replace $n=1$; job done. – conjectures Aug 26 '13 at 07:38

5 Answers


So, you have $X \sim \text{Binomial}(n, p)$, with pmf $f(x)$:

$$f(x) = {{n}\choose{x}} p^x (1-p)^{n-x}, \qquad x = 0, 1, \ldots, n$$

You seek the Fisher Information on parameter $p$. Here is a quick check using mathStatica's FisherInformation function:

$$\frac{n}{p(1-p)}$$

which is what you got :)
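For readers without mathStatica, a rough equivalent of this check can be run in Python with SymPy (a sketch only; the concrete value $n=10$ is an arbitrary illustrative choice, and the finite sum over the support plays the role of the expectation):

```python
import sympy as sp

p = sp.symbols('p', positive=True)
n = 10  # illustrative fixed sample size; any concrete n behaves the same way

x = sp.symbols('x', integer=True, nonnegative=True)
f = sp.binomial(n, x) * p**x * (1 - p)**(n - x)   # Binomial(n, p) pmf
score = sp.diff(sp.log(f), p)                      # d/dp log f(p, x)

# Fisher information E[(d/dp log f)^2]: sum score^2 * pmf over the support x = 0..n
info = sp.simplify(sum((score**2 * f).subs(x, k) for k in range(n + 1)))

print(info)                                   # an expression equal to 10/(p*(1 - p))
print(sp.simplify(info - n / (p * (1 - p))))  # 0, confirming info = n/(p*(1-p)) for this n
```

Summing the exact pmf over $x=0,\dots,n$ for a fixed $n$ avoids asking SymPy for a symbolic expectation over a symbolic $n$, which it does not always simplify.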

wolfies

Fisher information satisfies $I_n(p) = nI(p)$, where $I(p)=-\mathbb{E}_p\Bigg( \frac{\partial^2 \log f(p,x)}{\partial p^2} \Bigg)$ and $f(p,x)={{1}\choose{x}} p^x (1-p)^{1-x}$ is the pmf of a single Bernoulli trial (the Binomial with $n=1$). We start with this single trial to calculate $I(p)$, then get $I_n(p)$.

$\log f(p,x) = x \log p + (1-x) \log (1-p)$

$\frac {\partial \log f(p,X)}{\partial p} = \frac {X}{p} - \frac {1- X}{1 - p}$

$\frac {\partial^2 \log f(p,X)}{\partial p^2} = -\frac {X}{p^2} - \frac {1- X}{(1 - p)^2}$

$I(p) = -\mathbb{E}_p\Bigg( \frac{\partial^2 \log f(p,X)}{\partial p^2} \Bigg) = -\mathbb{E}_p\Bigg(-\frac {X}{p^2} - \frac {1- X}{(1 - p)^2}\Bigg) = \frac {p}{p^2} + \frac {1-p}{(1-p)^2} = \frac {1}{p} + \frac {1}{1-p} = \frac {1}{p(1-p)} $

As a result, $I_n(p) = n I(p) = \frac {n}{p(1-p)} $
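The first step, $I_n(p) = n I(p)$, uses the fact that Fisher information is additive over independent observations: for $n$ i.i.d. Bernoulli trials the log-likelihood is a sum, so

$$\log f(p, x_1,\dots,x_n) = \sum_{i=1}^{n} \log f(p, x_i) \quad\Longrightarrow\quad I_n(p) = -\mathbb{E}_p\Bigg(\sum_{i=1}^{n}\frac{\partial^2 \log f(p,X_i)}{\partial p^2}\Bigg) = n\,I(p).$$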

ngovanmao

The result does not depend on $n$ in the asymptotic information matrix. \begin{align*} \mathcal{I}\left(p\right)&=\underset{n\to\infty}{\mathrm{plim}}\dfrac{1}{n}\dfrac{n}{p\left(1-p\right)}\\&=\dfrac{1}{p\left(1-p\right)} \end{align*}

Ryan

I know this is well past the OP's time frame, but I ran into an analogous issue today and I would like to point out the source of confusion.

The Fisher information for a single Bernoulli trial is $\frac{1}{p(1-p)}$. When you have $n$ trials, the asymptotic variance indeed becomes $\frac{p(1-p)}{n}$.

When you consider the Binomial resulting from the sum of the $n$ Bernoulli trials, you have the Fisher information that (as the OP shows) is $\frac{n}{p(1-p)}$. The point is that when you consider your variable as a Binomial, you only have a sample of size 1, since you observed only one Binomial outcome. So when you apply the classic result about the asymptotic distribution of the MLE, the variance is simply the inverse of the Fisher information: $\frac{p(1-p)}{n}$.

Therefore, the asymptotic variances coincide from both perspectives.
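Written out, with $I_{\text{Bern}}(p)=\frac{1}{p(1-p)}$ for one Bernoulli trial and $I_{\text{Bin}}(p)=\frac{n}{p(1-p)}$ for the single Binomial observation, the two routes give the same asymptotic variance for $\hat p$:

$$\frac{1}{n\,I_{\text{Bern}}(p)} = \frac{p(1-p)}{n} = \frac{1}{I_{\text{Bin}}(p)}.$$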

non87

You are correct.

In this case it is easier to find FI as $-\mathbb{E}\Bigg(\frac{\partial^2 \log f(x\mid p)}{\partial p^2}\Bigg)$.
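This works because, under the usual regularity conditions, the two definitions of Fisher information coincide:

$$\mathbb{E}\Bigg[\Bigg(\frac{\partial \log f(X\mid p)}{\partial p}\Bigg)^2\Bigg] = -\mathbb{E}\Bigg[\frac{\partial^2 \log f(X\mid p)}{\partial p^2}\Bigg],$$

which follows from differentiating the identity $\sum_x f(x\mid p) = 1$ twice with respect to $p$.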

Brani
  • Welcome to Math.SE, Brani! Your Answer would be more useful with a little expansion on the development of that formula to the value in the Question. For some information about the MathJax mechanism used here to write formulas with LaTeX, see here. – hardmath Jan 27 '14 at 03:19