The approach to and interpretation of probability associated with Bayes' theorem, usually contrasted with the frequentist approach. It can be seen as an extension of logic that enables reasoning with propositions whose truth or falsity is uncertain. A Bayesian probabilist starts with some prior probability and evaluates the evidence in favour of a hypothesis by combining that prior with the likelihood function of the observed data.
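As a sketch of that prior-times-likelihood recipe, in standard notation not tied to any one question below: for a hypothesis $H$ and data $D$,
$$p(H \mid D) = \frac{p(D \mid H)\,p(H)}{p(D)} \propto p(D \mid H)\,p(H),$$
where $p(H)$ is the prior, $p(D \mid H)$ the likelihood, and $p(H \mid D)$ the posterior.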
Questions tagged [bayesian]
2030 questions
43
votes
3 answers
Bayes rule with multiple conditions
I am wondering how I would apply Bayes rule to expand an expression with multiple variables on either side of the conditioning bar.
In another forum post, for example, I read that you could expand $P(a,z \mid b)$ using Bayes rule like this
(see…

maogenc
- 1,165
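One standard way to expand such an expression, shown here as a sketch using only the chain rule and Bayes' rule (the specific expansion in the linked post is not reproduced here):
$$P(a, z \mid b) = P(a \mid z, b)\,P(z \mid b) = \frac{P(b \mid a, z)\,P(a, z)}{P(b)}.$$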
6
votes
3 answers
What is the extension of Bayesian networks to cyclic graphs?
The Wikipedia page on Bayesian networks says
"Formally, Bayesian networks are directed acyclic graphs whose nodes represent random variables in the Bayesian sense"
But in the model I need to build, a cyclic constraint structure is necessary. For example,…

Matt
- 171
5
votes
2 answers
Does the Bayesian prior represent a hypothesis with no data, or with all data?
I have a conceptual question about Bayes' theorem: in
$$p(z|x) = \frac{p(x|z)p(z)}{p(x)},$$
the term $p(z)$ is usually interpreted as the prior probability distribution of a hypothesis $z$ before observing any data $x$.
However, if we write…

Jacob
- 53
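For reference, one identity that often comes up in this discussion (a sketch, not the specific continuation elided by "if we write…"): the prior is recovered from the joint distribution by marginalizing over the data,
$$p(z) = \int p(z \mid x)\,p(x)\,dx.$$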
4
votes
2 answers
Variable Selection for Bayesian Linear Model
Consider the Bayesian linear model $y_i\sim N(x_i\beta,\sigma^2 ), i=1,\ldots,n$ where $$\sum_{i=1}^n x_i=0, \sum_{i=1}^n x_i^2 =1, \sum_{i=1}^n x_i y_i=\gamma $$ The prior for $\beta$ and the dummy variable $z$ is given by $$\pi (\beta \mid…

Eric
- 41
4
votes
1 answer
What's the difference between Maximum a posteriori and Bayes' rule?
What's the difference between Maximum a posteriori and Bayes' rule? They look similar, except that I do understand Bayes' rule and I don't understand MAP. The people I asked - who work in math and computer science - have never heard of Maximum a…

user68610
- 343
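A short sketch of the distinction, in standard notation (not taken from any particular answer): Bayes' rule gives the full posterior distribution, while MAP (maximum a posteriori) picks the single parameter value that maximizes it,
$$\hat{\theta}_{\mathrm{MAP}} = \arg\max_{\theta}\, p(\theta \mid x) = \arg\max_{\theta}\, p(x \mid \theta)\,p(\theta),$$
since the evidence $p(x)$ does not depend on $\theta$.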
4
votes
2 answers
Application of Bayes' Theorem
I am reading Nate Silver's book "The Signal and the Noise" and have a question about Bayes' theorem. I've created my own example and am trying to wrap my mind around the conclusion.
Let's say, before any information, I think there is a 5% chance…

jim_shook
- 281
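With only the 5% prior stated in the excerpt, a symbolic sketch of the update (the likelihoods are left unspecified because the excerpt is truncated):
$$P(H \mid E) = \frac{P(E \mid H)\cdot 0.05}{P(E \mid H)\cdot 0.05 + P(E \mid \neg H)\cdot 0.95}.$$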
4
votes
3 answers
Simple Bayes Theorem question
You know there are 3 boys and an unknown number of girls in a nursery at a hospital. Then a woman gives birth to a baby, but you do not know its gender, and it is placed in the nursery. Then a nurse comes in and picks up a baby, and it is a boy. Given…

Emir
- 2,213
3
votes
1 answer
Bayesian updating game
We have two urns $A$ and $B$. Urn $A$ contains $2$ white balls and $1$ black ball, while urn $B$ contains $2$ black balls and $1$ white ball. The urns are called states of nature, and each occurs with probability $0.5$.
An urn is selected randomly and…

Vytas
- 31
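A worked update for this setup, assuming for illustration (since the excerpt is truncated) that a single ball is drawn from the selected urn and turns out to be white:
$$P(A \mid \text{white}) = \frac{\tfrac{2}{3}\cdot 0.5}{\tfrac{2}{3}\cdot 0.5 + \tfrac{1}{3}\cdot 0.5} = \frac{2}{3}.$$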
2
votes
1 answer
Finding a posterior distribution of an exponential distribution parameter theta
Suppose that $X_1, ... , X_n$ each have an exponential distribution with parameter $\theta$, and suppose that the prior for $\theta$ is an exponential distribution with parameter $\lambda$. Find the posterior distribution of $\theta$.
I have to…

Tom
- 63
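A sketch of the standard conjugate calculation for this question, interpreting both parameters as rates: the likelihood is $\theta^n e^{-\theta \sum_i x_i}$ and the prior is $\lambda e^{-\lambda \theta}$, so
$$\pi(\theta \mid x_1,\ldots,x_n) \propto \theta^{n}\, e^{-\theta\left(\lambda + \sum_{i=1}^n x_i\right)},$$
which is a $\mathrm{Gamma}\!\left(n+1,\ \lambda + \sum_{i=1}^n x_i\right)$ distribution in the shape–rate parametrization.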
2
votes
1 answer
Posterior distribution with a standard exponential prior (mean 1) and Poisson data
So I have the likelihood:
$\prod_{i=1}^{n}\frac{\lambda^{x_i}e^{-\lambda}}{x_i!}$
which is proportional to
$\lambda^{\sum_{i=1}^{n}x_{i}}e^{-n\lambda}.$
The prior is a standard exponential, $e^{-\lambda}$.
So the posterior is…

Gamecocks99
- 1,023
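Continuing the calculation in the excerpt as a sketch: multiplying the likelihood kernel by the standard exponential prior gives
$$\pi(\lambda \mid x_1,\ldots,x_n) \propto \lambda^{\sum_{i=1}^{n} x_i}\, e^{-(n+1)\lambda},$$
i.e. a $\mathrm{Gamma}\!\left(\sum_{i=1}^n x_i + 1,\ n+1\right)$ posterior in the shape–rate parametrization.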
2
votes
0 answers
Laplace approximation of the likelihood Bayesian
I need help with the following question:
Consider $m$ observations $(y_1, n_1), \ldots, (y_m, n_m)$, where $y_i \sim \mathrm{Bin}(n_i, \theta_i)$ are binomial variables.
Assume that $\theta_i \sim w_1\,\mathrm{Beta}(\alpha_1, \beta_1) + w_2\,\mathrm{Beta}(\alpha_2, \beta_2)$ is a mixture of two Beta…

Statstudent
- 21
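For reference, a generic statement of the Laplace approximation being asked about (not the solution to this specific mixture problem): if $\ell(\theta) = \log p(y \mid \theta) + \log p(\theta)$ has mode $\hat{\theta}$ with negative Hessian $H = -\nabla^2 \ell(\hat{\theta})$, then
$$\int e^{\ell(\theta)}\,d\theta \;\approx\; e^{\ell(\hat{\theta})}\,(2\pi)^{d/2}\,|H|^{-1/2},$$
where $d$ is the dimension of $\theta$.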
2
votes
1 answer
Why is the prior distribution of unknown probabilities uniform?
Specifically, the prior for the probability of an unknown binary variable, such as in the context of the rule of succession.
Every proof I've seen of the rule of succession starts with the assumption of a uniform prior on the probability; i.e.…

zbw
- 131
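For context, a sketch of where that uniform prior leads: with a $\mathrm{Beta}(1,1)$ (uniform) prior and $s$ successes in $n$ trials, the posterior is $\mathrm{Beta}(s+1,\ n-s+1)$, and the predictive probability of success on the next trial is
$$P(\text{success on trial } n+1 \mid s \text{ successes in } n) = \frac{s+1}{n+2},$$
which is Laplace's rule of succession; the question asks why the uniform prior is the right starting point, which this calculation by itself does not justify.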
2
votes
1 answer
Applying Bayes' theorem
a) A research institute associated with the Olympics claims that its drug test will detect steroid use (that is, show a positive result for an athlete who uses steroids) 95% of the time. Your friend on Canada’s hockey team has just tested positive.…

MathGeek
- 886
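The excerpt states only the sensitivity (95%); a symbolic sketch of the required calculation, with the base rate $p$ of steroid use and the false-positive rate $f$ left as unspecified placeholders since they are not in the truncated excerpt:
$$P(\text{user} \mid +) = \frac{0.95\,p}{0.95\,p + f\,(1-p)}.$$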
2
votes
2 answers
How To Interpret Two Bayesian Credible Intervals
I come from a frequentist mindset by training, unfortunately. As such, I'm conditioned to interpret experimental results as either a) rejecting some null hypothesis or b) failing to reject it, all based on a 95% level of confidence. I wish to understand…
2
votes
1 answer
Proving stochastic boundedness in rate of contraction posterior distribution
Consider a family of probability distributions $P_\theta$ indexed by $\theta \in \Theta$. The parameter space is endowed with some metric $d$. We assume that there is a true parameter $\theta_0$, and we are interested in the convergence of the…

Scipio
- 380