
[Image of the two problems: a binomial hypothesis-testing question with $H_0:\theta=1/2$ versus $H_1:\theta=3/4$, and a second part asking for the posterior odds of $H_0$ relative to $H_1$ when $\theta$ has a uniform prior on $(0,1)$.]

I've been really struggling with these two questions and was wondering if anyone could give me any help or advice?

For the first one I've tried some calculations using the law of total probability but don't seem to be getting anywhere.

For the second one, if the prior is uniformly distributed, does that mean the prior odds are 1:1?

Similarly, I am unsure of how to calculate the posterior odds and the rest of the question.

Any advice or tips would be greatly appreciated.

Thank you so much!

J. Bant
  • For problem 1: Have you used Bayes' rule to compute $P[H_0|X=x]$? – Michael Mar 25 '16 at 22:07
  • This question just asks to find when $P[H_0|X=x]>P[H_0]$. The interesting aspect of the question is that this can be done without knowing $P[H_0]$, it only assumes $P[H_0] \in (0,1)$. – Michael Mar 25 '16 at 22:24
  • I get how we could use the binomial pmf to find P[H0|X=x] = (P[x|H0]*P[H0])/P[x]. But if we don't know P[H0], how can we find the denominator, P[x]? – J. Bant Mar 25 '16 at 22:44
  • Just keep going and you get some nice cancellations. For example, you need to set up the desired inequality. In other words, you can answer the question that was asked, without explicitly computing $P[H_0|X=x]$. – Michael Mar 25 '16 at 22:57
  • OK, on the LHS I have P[H0|X=x]/P[H0] and want that to be > 1

    I can expand the numerator as in the above comment, and cancel the P[H0]. Then do I divide the top and bottom by P[H0|X=x] to find the denominator is P[H0] + 1/(Bayes factor) * P[H1], then substitute in the formula for the Bayes factor and proceed? Is this the correct approach?

    – J. Bant Mar 25 '16 at 23:09
  • Only issue is I don't see where the numbers come in by using that method. – J. Bant Mar 25 '16 at 23:11
  • I do not see the point of dividing top and bottom by $P[H_0|X=x]$, as that factor has already gone away after using Bayes' rule. It may be useful to divide top and bottom by $P[X=x|H_0]$ and then do some further manipulations. – Michael Mar 26 '16 at 02:31
  • Sorry to come back to this, but I'm still not really seeing how to do this question. How do I evaluate P[X=x|H0] again? I'm seeing the formula in terms of an integral, but this is a simple hypothesis with a fixed theta? – J. Bant Mar 29 '16 at 14:04
  • For integers $x$, $P[X=x|H_0]$ is the probability of having $x$ successes from a binomial distribution with parameters $n$ and success prob $\theta=1/2$. – Michael Mar 29 '16 at 20:57

1 Answer


Let $p=\Pr(H_0)$ and $q = 1-p = \Pr(H_1)$ be the prior probabilities. The likelihood functions are
\begin{align} L(H_0\mid X=x) & = \binom n x \frac 1 {2^n}, \\[12pt] L(H_1\mid X=x) & = \binom n x \left(\frac 3 4\right)^x \left(\frac 1 4 \right)^{n-x}. \end{align}
By Bayes' rule, the posterior odds are the prior odds times the likelihood ratio:
$$ \frac{\Pr(H_0\mid X=x)}{\Pr(H_1\mid X=x)} = \frac{\Pr(H_0)}{\Pr(H_1)} \cdot \frac{L(H_0\mid X=x)}{L(H_1\mid X=x)} = \frac p q \cdot \frac{\dbinom n x \dfrac 1{2^n}}{\dbinom n x \left(\dfrac 3 4 \right)^x \left( \dfrac 1 4 \right)^{n-x}} = \frac p q \cdot \frac{2^n}{3^x}. $$
Then we have $\Pr(H_0\mid X=x)>\Pr(H_1\mid X=x)$ precisely if $\dfrac{2^n p}{3^x q} > 1$.

(The above is all done using odds rather than probabilities. It's a bit simpler that way. With probabilities, we would need $\Pr(H_0\mid X=x)+\Pr(H_1\mid X=x)=1$, so we'd need the normalizing constant $c$ for which $c(2^n p + 3^x q) = 1$.)
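
For a quick sanity check, here is a minimal numerical sketch of the calculation above; the values $n=10$, $x=6$ and the prior $p=0.3$ are illustrative assumptions, not taken from the question.

```python
# Minimal numerical check of the posterior-odds formula (p/q) * 2^n / 3^x.
# The values of n, x and the prior p are illustrative assumptions.
from scipy.stats import binom

n, x = 10, 6        # sample size and observed number of successes (assumed)
p = 0.3             # prior Pr(H0); any value in (0, 1) will do
q = 1 - p           # prior Pr(H1)

# Likelihoods under the two simple hypotheses.
like_H0 = binom.pmf(x, n, 0.5)     # theta = 1/2
like_H1 = binom.pmf(x, n, 0.75)    # theta = 3/4

# Posterior probabilities via Bayes' rule, normalized so they sum to 1.
marginal = p * like_H0 + q * like_H1
post_H0 = p * like_H0 / marginal
post_H1 = q * like_H1 / marginal

print(post_H0 / post_H1)           # posterior odds computed directly
print((p / q) * 2**n / 3**x)       # same value from the closed form, ~0.602
```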

If $\theta$ is uniformly distributed on $(0,1)$, then $\Pr(H_0) = \Pr(\theta \le 1/2) = 1/2$.

Denote the prior distribution of $\theta$ by $\displaystyle \Pr(\theta\in A) = \int_A 1\,dt$ for $A\subseteq (0,1)$; that is, the prior measure on $(0,1)$ is $1\,dt$. The likelihood is
$$ L(t\mid X=x) = \binom n x t^x (1-t)^{n-x}. $$
Hence the posterior distribution of $\theta$ is
$$ c\, t^x (1-t)^{n-x}\cdot 1\,dt, $$
where the normalizing constant $c$ is chosen so that $\displaystyle\int_0^1 c\, t^x (1-t)^{n-x}\cdot 1\,dt = 1$. Integrating, we get
$$ \int_0^1 t^x(1-t)^{n-x}\cdot 1\,dt = \frac 1 {(n+1)\dbinom n x}. $$
For a probabilistic method of evaluating this integral, see this answer.
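
If you want to double-check that integral identity numerically, here is a minimal sketch, again with the assumed illustrative values $n=10$, $x=6$.

```python
# Numerical check of the identity
#   integral_0^1 t^x (1-t)^(n-x) dt = 1 / ((n+1) * C(n, x)),
# with illustrative (assumed) values of n and x.
from math import comb
from scipy.integrate import quad

n, x = 10, 6

integral, _ = quad(lambda t: t**x * (1 - t)**(n - x), 0, 1)
closed_form = 1 / ((n + 1) * comb(n, x))

print(integral)      # ~0.00043290 (= 1/2310 for these values)
print(closed_form)   # 1 / (11 * 210) = 1/2310
```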

So we have
$$ \Pr(H_0\mid X=x) = (n+1)\binom n x \int_0^{1/2} t^x(1-t)^{n-x}\, dt, $$
and the posterior odds are
$$ \frac{\Pr(H_0\mid X=x)}{\Pr(H_1\mid X=x)} = \frac{(n+1)\binom n x \int_0^{1/2} t^x(1-t)^{n-x}\, dt}{(n+1)\binom n x \int_{1/2}^1 t^x(1-t)^{n-x}\, dt} = \frac{\int_0^{1/2} t^x(1-t)^{n-x}\, dt}{\int_{1/2}^1 t^x(1-t)^{n-x}\, dt}. $$
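
Since the posterior density $c\, t^x(1-t)^{n-x}$ on $(0,1)$ is that of a $\mathrm{Beta}(x+1,\,n-x+1)$ distribution, the posterior odds can also be evaluated from its CDF. A minimal sketch, once more with the assumed values $n=10$, $x=6$:

```python
# Posterior odds of H0 (theta <= 1/2) vs H1 (theta > 1/2) under the uniform
# prior, using the Beta(x+1, n-x+1) posterior. n and x are assumed values.
from scipy.stats import beta

n, x = 10, 6
posterior = beta(x + 1, n - x + 1)   # posterior distribution of theta

p_H0 = posterior.cdf(0.5)            # Pr(theta <= 1/2 | X = x)
p_H1 = 1 - p_H0                      # Pr(theta >  1/2 | X = x)

print(p_H0 / p_H1)                   # posterior odds, ~0.378 for these values
```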

  • Thank you for the answer, but is this not comparing the posterior of H0 vs the posterior of H1? Is the question just worded incorrectly? – J. Bant Mar 25 '16 at 22:22
  • The question is not worded incorrectly. This answer just addresses a different aspect of the same problem; I think this was intended so that you could solve the problem on your own using similar techniques (via Bayes' rule, to start). – Michael Mar 25 '16 at 22:31
  • It says "find an expression for the posterior odds of $H_0$ relative to $H_1$." That is done in this answer. $\qquad$ – Michael Hardy Mar 26 '16 at 01:04