Bayesian statistics is an approach to inference that treats probability as a degree of belief, updated in light of new evidence. While the frequentist approach (covered in earlier lessons) interprets probability as a long-run frequency, the Bayesian approach offers a principled way to combine prior knowledge with data.
At the heart of Bayesian statistics is Bayes' Theorem:
P(A | B) = P(B | A) × P(A) / P(B)
In the context of statistical inference:
P(θ | data) = P(data | θ) × P(θ) / P(data)
| Term | Name | Meaning |
|---|---|---|
| P(θ \| data) | Posterior | Updated belief about parameter θ after seeing data |
| P(data \| θ) | Likelihood | How probable the data is, given θ |
| P(θ) | Prior | Belief about θ before seeing data |
| P(data) | Marginal likelihood (Evidence) | Total probability of the data across all possible θ values |
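These four terms can be computed directly on a grid of candidate θ values. As a minimal sketch (the coin-flip data here is an illustrative assumption, not from the lesson), this estimates a coin's heads probability θ from 7 heads in 10 flips with a uniform prior:

```python
import numpy as np

# Hypothetical example: infer a coin's heads probability θ
# from 7 heads in 10 flips, using a uniform prior on a grid.
theta = np.linspace(0, 1, 1001)           # candidate θ values
prior = np.ones_like(theta)               # P(θ): uniform prior
prior /= prior.sum()

heads, flips = 7, 10
likelihood = theta**heads * (1 - theta)**(flips - heads)  # P(data | θ)

unnormalized = likelihood * prior
evidence = unnormalized.sum()             # P(data): total over all θ
posterior = unnormalized / evidence       # P(θ | data)

print(theta[np.argmax(posterior)])        # posterior mode: 0.7
```

Note how the evidence term P(data) acts purely as a normalizing constant: it guarantees the posterior sums to 1 over all candidate θ values.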
A disease affects 1% of the population. A test has:
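The lesson's worked example is cut off at this point, but the calculation it sets up can be sketched with assumed test characteristics (the 95% sensitivity and 90% specificity below are placeholders, not the lesson's actual numbers):

```python
# Hypothetical numbers: only the 1% prevalence comes from the text;
# sensitivity and specificity are assumed for illustration.
prevalence = 0.01                 # P(disease): 1% of the population
sensitivity = 0.95                # P(positive | disease) -- assumed
specificity = 0.90                # P(negative | no disease) -- assumed

# Bayes' theorem: P(disease | positive)
p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
posterior = sensitivity * prevalence / p_pos
print(round(posterior, 3))        # 0.088
```

Under these assumptions, a positive result raises the probability of disease from 1% to under 9%: with a rare condition, false positives from the healthy 99% of the population outnumber true positives, which is exactly the intuition Bayes' theorem formalizes.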