I have this very simple homework problem:
Let $X_1,\ldots,X_n$ be a random sample from the $\operatorname{Bernoulli}(p)$ distribution. Find the Fisher information $I(p)$.
I freely admit that my problem is that I don't really understand what Fisher information is. (Incidentally, is there a difference between the Fisher information and the Fisher information matrix?) This is what I have so far.
Since a Bernoulli distribution is a binomial distribution with a single trial, and the $X_i$ form a random sample, the observations are independent and identically distributed. Since the likelihood function is $\mathbb{L}(\theta) = \prod_{i=1}^n f(X_i \mid \theta)$ and the log-likelihood is $\ell(\theta) = \log \mathbb{L}(\theta)$, this yields $I(\theta) = \mathbb{E}\!\left[\ell'(\theta)\,\ell'(\theta)^T\right] = -\mathbb{E}\!\left[\ell''(\theta)\right]$.
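If I'm setting this up correctly for the Bernoulli case, each $X_i$ has pmf $f(x \mid p) = p^x (1-p)^{1-x}$ for $x \in \{0,1\}$, so the likelihood and log-likelihood would be
$$\mathbb{L}(p) = \prod_{i=1}^n p^{X_i}(1-p)^{1-X_i}, \qquad \ell(p) = \sum_{i=1}^n \bigl[X_i \log p + (1 - X_i)\log(1 - p)\bigr],$$
and I assume the next step is to differentiate $\ell(p)$ twice with respect to $p$ and take the expectation.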
However, I do not know if I'm on the right path. I'd appreciate insight.
Thanks,
Andy