Can someone explain Bayesian statistics in simple terms?
#1
I'm trying to wrap my head around Bayesian statistics, but the math keeps getting in the way. The basic idea seems to be about updating beliefs based on new evidence, which makes intuitive sense.

But then I read about priors, likelihoods, and posteriors and my eyes glaze over. How would you explain Bayesian thinking to someone without a strong math background?

What insights from Bayesian statistics have changed how you think about probability or decision making?
#2
I like to explain Bayesian statistics with the medical-test example. Say there's a disease that affects 1% of people, and a test that detects it 95% of the time but also has a 5% false-positive rate.

If you test positive, what's the chance you actually have the disease? Most people say 95%, but it's actually much lower because the disease is rare.

Bayesian thinking is about starting with your prior belief (1% chance of disease), then updating it with new evidence (positive test). The math gives you about 16% chance of actually having the disease.

The key insight is that you can't just look at the test's accuracy. You have to combine it with how common the disease is in the first place (the base rate).
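To make the arithmetic concrete, here's a minimal Python sketch of that calculation, using the numbers from this example:

```python
# Bayes' theorem for the medical-test example:
# P(disease | positive) = P(positive | disease) * P(disease) / P(positive)

prior = 0.01           # 1% of people have the disease
sensitivity = 0.95     # P(positive test | disease)
false_positive = 0.05  # P(positive test | no disease)

# Total probability of testing positive (true positives + false positives)
p_positive = sensitivity * prior + false_positive * (1 - prior)

posterior = sensitivity * prior / p_positive
print(f"P(disease | positive test) = {posterior:.1%}")  # roughly 16%
```

Even though the test is "95% accurate", the 99% of healthy people generate so many false positives that they swamp the true positives.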
#3
Another way to think about it: frequentist statistics asks "Given this hypothesis is true, what's the probability of seeing this data?" Bayesian statistics asks "Given this data, what's the probability this hypothesis is true?"

The Bayesian approach feels more natural for many real-world problems. Like in spam filtering: starting with a prior belief about what spam looks like, then updating as you see more emails.

What changed my thinking was realizing that all reasoning is essentially Bayesian. We all have prior beliefs that we update with new evidence. Bayesian statistics just makes this process explicit and quantitative.
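As a rough sketch of the spam-filter idea (the word frequencies below are made up purely for illustration), each word acts as evidence that updates the prior probability of spam:

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return the posterior P(H | evidence) via Bayes' theorem."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / denominator

# Illustrative numbers, not from any real corpus:
p_spam = 0.5  # prior: assume half of incoming mail is spam

# Suppose "free" appears in 60% of spam but only 5% of legitimate mail
p_spam = bayes_update(p_spam, 0.60, 0.05)

# Suppose "meeting" appears in 2% of spam but 30% of legitimate mail
p_spam = bayes_update(p_spam, 0.02, 0.30)

print(f"P(spam) after both words: {p_spam:.2f}")
```

The nice property is that the posterior after one update becomes the prior for the next, so evidence accumulates one piece at a time.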
#4
I use the "three coins" example to explain Bayesian updating. Imagine you have three coins: one always lands heads, one always lands tails, and one is fair.

You pick a coin at random and flip it. It comes up heads. What's the probability it's the always-heads coin?

Initially, each coin has 1/3 probability. After seeing heads, the always-tails coin is ruled out. The always-heads coin produces heads with probability 1, while the fair coin does so with probability 1/2, so the evidence favors the always-heads coin 2 to 1. Bayes' theorem gives a 2/3 chance it's the always-heads coin and 1/3 that it's the fair one.

This shows how Bayesian updating revises beliefs rationally: each new piece of evidence shifts the probabilities in a mathematically sound way.
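The same update can be written out in a few lines of Python:

```python
# Three coins: always-heads, fair, always-tails; one picked uniformly at random.
coins = {"always_heads": 1.0, "fair": 0.5, "always_tails": 0.0}
prior = {name: 1 / 3 for name in coins}

# Observe one flip landing heads; update each coin's probability.
p_heads = sum(prior[name] * coins[name] for name in coins)  # = 1/2
posterior = {name: prior[name] * coins[name] / p_heads for name in coins}

print(posterior["always_heads"])  # 2/3
print(posterior["fair"])          # 1/3
print(posterior["always_tails"])  # 0
```

Note the denominator (the overall chance of seeing heads at all) is what makes the three posteriors sum to 1.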