
I'm currently taking a probability course, and in lecture my professor went over an example which he called Laplace's rule of succession. Basically, there are $n+1$ cards, some of which are successes, and the number of successes is uniformly distributed. The cards are shuffled and you see $n$ of them, $k$ of which turn out to be successes. The problem is to determine the probability that the next card turns up a success. Here's an account of how he explained the problem:

There are two possibilities: either there are $k$ successes among the $n+1$ cards, or $k+1$ successes. The number of successes is uniformly distributed, so we will set $P(k \text{ successes})=P(k+1 \text{ successes})=\frac{1}{2}$. We can then write the desired probability as $P(1\ |\ k \text{ successes in }n\text{ trials})$ (where $1$ denotes a success on the next draw and $0$ a failure). The definition of conditional probability gives \begin{equation} P(1\ |\ k \text{ successes in }n\text{ trials})=\frac{P(1 \text{ and }k\text{ successes in }n\text{ trials})}{P(k\text{ successes in }n\text{ trials})}. \end{equation} Using the definition of conditional probability again, together with Bayes' rule, this equals \begin{equation} \frac{P(1)\,P(k\text{ successes in }n\text{ trials}\ |\ 1)}{P(1)\,P(k\text{ successes in }n\text{ trials}\ |\ 1)+P(0)\,P(k\text{ successes in }n\text{ trials}\ |\ 0)}. \end{equation}

Now, he said that this expression is: \begin{equation} \frac{\frac{1}{2}\frac{k+1}{n+1}}{\frac{1}{2}\frac{k+1}{n+1}+\frac{1}{2}\frac{n+1-k}{n+1}}=\frac{k+1}{n+2} \end{equation}

My question is: how did he get $\frac{k+1}{n+1}$ and $\frac{n+1-k}{n+1}$ as the values of the above conditional probabilities? They seem only to denote the chance of drawing a success or a failure from $n+1$ cards.
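For what it's worth, a quick Monte Carlo sketch of the setup (the function name and the choice of $n$ and $k$ below are just illustrative) does seem to agree with the claimed value $\frac{k+1}{n+2}$:

```python
import random

# A quick sanity check of the setup as I understand it: n+1 cards, a uniformly
# random number of them successes; shuffle, reveal n, and condition on having
# seen exactly k successes.
def next_success_prob(n, k, trials=200_000):
    hits = matches = 0
    for _ in range(trials):
        s = random.randint(0, n + 1)        # number of successes, uniform on {0, ..., n+1}
        deck = [1] * s + [0] * (n + 1 - s)
        random.shuffle(deck)
        if sum(deck[:n]) == k:              # observed exactly k successes among the n revealed cards
            matches += 1
            hits += deck[n]                 # is the one remaining card a success?
    return hits / matches if matches else float("nan")

print(next_success_prob(10, 4))   # ≈ 5/12 ≈ 0.4167, i.e. (k+1)/(n+2)
```

So the final answer checks out numerically; it's the two conditional probabilities I don't follow.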

Furthermore, today in class, he proved that the number of black balls drawn in the first $n$ draws from Polya's urn (starting with one black and one white ball, sampling with double replacement) is uniformly distributed, and he said this had a connection to this problem, but did not elaborate. How is that?

  • Did you type the TeX code in your posting yourself, or was it somehow computer-generated? If the latter, can you tell me what software you used? – Michael Hardy Jan 25 '12 at 20:22
  • You must have meant $\frac{\frac{1}{2} \frac{k+1}{n+1}}{\frac{1}{2}\frac{k+1}{n+1}+\frac{1}{2}\frac{n+1-k}{n+1}}=\frac{k+1}{n+2}$. – Michael Hardy Jan 25 '12 at 20:26
  • I did type it myself, and yes, that is what I meant. Thanks for pointing that out! – John Lee Jan 25 '12 at 23:21

1 Answer


I think the example being discussed can be restated as follows: to start with, there are $n+1$ cards, $S$ of which are successes, where $S$ is chosen uniformly at random in $\{0,\ldots,n+1\}$. We then shuffle and draw $n$ cards, $k$ of which are successes. What is the probability that the next card is a success?

Let $O$ be our observation that $k$ successes were drawn out of $n$. Given $O$, we must have $S\in\{k,k+1\}$, and the next card will be a success iff $S=k+1$. Then by Bayes' Theorem, $$ {\Bbb P}(S=k+1|O)=\frac{{\Bbb P}(S=k+1\ \hbox{and}\ O)}{{\Bbb P}(O)}$$ $$ =\frac{{\Bbb P}(O|S=k+1){\Bbb P}(S=k+1)}{{\Bbb P}(O|S=k+1){\Bbb P}(S=k+1)+{\Bbb P}(O|S=k){\Bbb P}(S=k)}. $$ Then, since ${\Bbb P}(S=k)={\Bbb P}(S=k+1)=\frac{1}{n+2}$, $$ {\Bbb P}(S=k+1|O)=\frac{{\Bbb P}(O|S=k+1)}{{\Bbb P}(O|S=k+1)+{\Bbb P}(O|S=k)}.\qquad(1) $$

Now, if $S=k$, we will draw $k$ success cards out of the first $n$ draws just when the last, undrawn, card is a failure. This will happen with probability $\frac{n+1-k}{n+1}$. Therefore, $$ {\Bbb P}(O|S=k)=\frac{n+1-k}{n+1},\qquad(2) $$ and similarly, if $S=k+1$, we will draw $k$ success cards out of the first $n$ draws just when the last, undrawn, card is a success, so $$ {\Bbb P}(O|S=k+1)=\frac{k+1}{n+1}.\qquad(3) $$

Substituting (2) and (3) into (1) gives $$ {\Bbb P}(S=k+1|O)=\frac{k+1}{n+2}. $$
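If it helps, here is a small, purely illustrative check of (1)–(3) (the helper name and the values of $n$ and $k$ are mine, not part of the argument above): ${\Bbb P}(O\,|\,S=s)$ is hypergeometric, since we just choose which of the $s$ successes and which of the $n+1-s$ failures land among the $n$ revealed cards.

```python
from math import comb

# P(O | S = s): of the s success cards, k must be among the n revealed ones,
# and of the n+1-s failure cards, n-k must be among them.
def p_obs_given_s(n, k, s):
    return comb(s, k) * comb(n + 1 - s, n - k) / comb(n + 1, n)

n, k = 10, 4
p_k  = p_obs_given_s(n, k, k)       # equation (2): (n+1-k)/(n+1) = 7/11
p_k1 = p_obs_given_s(n, k, k + 1)   # equation (3): (k+1)/(n+1)   = 5/11
print(p_k, (n + 1 - k) / (n + 1))
print(p_k1, (k + 1) / (n + 1))
print(p_k1 / (p_k1 + p_k), (k + 1) / (n + 2))   # via (1): 5/12
```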

For the second question, consider the following two processes:

  • Process 1: (Polya's urn) Start with an urn with 1 white and 1 black ball. Repeatedly draw a ball from the urn and return it to the urn, together with another ball of the same color.
  • Process 2: Let $U$ be a random variable which is uniformly distributed in $[0,1]$, and start by picking a random probability $p\sim U$. Then repeatedly flip a coin whose probability of heads is $p$, and at each flip, draw a white ball if the coin is heads, and a black ball if it is tails.

After any given sequence $T$ of balls has been drawn, we can compute the probability, $p_T$, of drawing another white ball. In Polya's urn, this probability obviously depends only on the number of white and black balls in $T$; if there are $k$ white and $n-k$ black balls in $T$, then the urn will now contain $k+1$ white and $n-k+1$ black balls, so $$p_T=\frac{k+1}{n+2}.$$

In Process 2, we can again use Bayes' Theorem. Letting $W$ be the event that the next ball drawn is white, $$ p_T={\Bbb P}(W|T)=\frac{{\Bbb P}(W\ \hbox{and}\ T)}{{\Bbb P}(T)}= \frac{{\Bbb E}({\Bbb P}(W\ \hbox{and}\ T|U))}{{\Bbb E}({\Bbb P}(T|U))}.\qquad(4)$$ However, given that $U=p$, the probability that we draw $T$ is just $p^k (1-p)^{n-k}$, and the probability that we draw $T$ followed by another white ball is $p^{k+1} (1-p)^{n-k}$. Substituting this into (4), $$ p_T={\Bbb P}(W|T)=\frac{ \int_{0\le p\le 1} p^{k+1} (1-p)^{n-k} \, dp } { \int_{0\le p\le 1} p^k (1-p)^{n-k} \, dp }=\frac{k+1}{n+2}. $$

This is another form of Laplace's rule of succession: given an event with unknown probability $p$, if we assume a uniform prior and $k$ successes out of $n$ independent trials, then the mean posterior value of $p$ is $\frac{k+1}{n+2}$.
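As a small numerical aside (not part of the argument above; the variable names and the choice of $n$ and $k$ are illustrative), the ratio of integrals in (4) can be checked with the Beta function, $B(a,b)=\int_0^1 p^{a-1}(1-p)^{b-1}\,dp=\frac{\Gamma(a)\Gamma(b)}{\Gamma(a+b)}$:

```python
from math import gamma

# B(a, b) = Gamma(a) * Gamma(b) / Gamma(a + b)
def beta(a, b):
    return gamma(a) * gamma(b) / gamma(a + b)

n, k = 10, 4
numerator   = beta(k + 2, n - k + 1)   # integral of p^(k+1) * (1-p)^(n-k) over [0, 1]
denominator = beta(k + 1, n - k + 1)   # integral of p^k     * (1-p)^(n-k) over [0, 1]
print(numerator / denominator)         # ≈ 0.4167, i.e. (k+1)/(n+2) = 5/12
```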

Notice that both processes give the same value of $p_T$ for every finite sequence $T$; since the probability of drawing any particular sequence is the product of these one-step conditional probabilities, the two processes produce every sequence of observations with the same probability, i.e., they are statistically indistinguishable. But in Process 2, by the SLLN (applied conditionally on $U=p$), the proportion of white balls drawn approaches $p$ a.s., and $p\sim U$. Therefore, the same is true for Process 1: in Polya's urn, the proportion of white balls drawn approaches a limit a.s., and this limit is uniformly distributed on $[0,1]$.
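To make this concrete, here is a short simulation sketch of the last claim (the parameters are arbitrary): the long-run fraction of white balls in Polya's urn should look approximately uniform on $[0,1]$.

```python
import random

def polya_fraction(steps=2000):
    # Start with 1 white and 1 black ball; each draw adds a ball of the drawn color.
    white, black = 1, 1
    for _ in range(steps):
        if random.random() < white / (white + black):
            white += 1
        else:
            black += 1
    return white / (white + black)

# The empirical deciles of the limiting fractions should sit near 0.1, 0.2, ..., 0.9.
fractions = sorted(polya_fraction() for _ in range(5000))
print([round(fractions[i * len(fractions) // 10], 2) for i in range(1, 10)])
```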

David Moews
  • Thank you so much for the clear explanation! The latter explanation is a bit over my head right now, but I'll keep it in mind as I keep studying. – John Lee Jan 25 '12 at 23:24