Bayesian Coin Flip

Update beliefs with evidence

This simulation demonstrates Bayesian inference in action. Start with a prior belief about whether a coin is fair or biased, then update your belief as you observe coin flips. Watch how the posterior probability changes with each new piece of evidence!

[Interactive simulation: a coin you can flip, live readouts for P(Fair | Data) and P(Biased | Data) (both starting at 50.0%), heads and tails counters (both starting at 0), and a flip history that begins empty.]

Understanding Bayesian Inference

Bayesian inference is a way of updating the probability of a hypothesis as more evidence becomes available. It combines three ingredients:

Prior: Your initial belief before seeing any data (e.g., "I think there's a 50% chance this coin is fair")

Likelihood: The probability of observing the data given each hypothesis (e.g., "If the coin is fair, the probability of heads is 0.5")

Posterior: Your updated belief after seeing the data, calculated using Bayes' theorem:

P(Hypothesis | Data) = P(Data | Hypothesis) × P(Hypothesis) / P(Data)

Here P(Data) is the normalizing constant: the sum of P(Data | Hypothesis) × P(Hypothesis) over all hypotheses (in this simulation, just Fair and Biased), which guarantees the posteriors add up to 1.
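The sequential update the simulation performs can be written in a few lines. The sketch below is a minimal illustration rather than the page's actual source: it assumes the "biased" hypothesis means a coin that lands heads 75% of the time (the exact bias used by the simulation isn't stated), starts from the 50/50 prior shown above, and applies Bayes' theorem once per observed flip.

```typescript
// Minimal sketch of the simulation's update step.
// Assumption: the "biased" coin lands heads with probability 0.75
// (illustrative only; the page does not specify the bias).

type Flip = "H" | "T";

interface Beliefs {
  fair: number;   // P(Fair | data seen so far)
  biased: number; // P(Biased | data seen so far)
}

const P_HEADS_IF_FAIR = 0.5;
const P_HEADS_IF_BIASED = 0.75; // assumed bias

// One application of Bayes' theorem: multiply each prior by the likelihood
// of the observed flip, then divide by P(Data) so the posteriors sum to 1.
function update(prior: Beliefs, flip: Flip): Beliefs {
  const likelihoodFair = flip === "H" ? P_HEADS_IF_FAIR : 1 - P_HEADS_IF_FAIR;
  const likelihoodBiased = flip === "H" ? P_HEADS_IF_BIASED : 1 - P_HEADS_IF_BIASED;

  const unnormFair = likelihoodFair * prior.fair;
  const unnormBiased = likelihoodBiased * prior.biased;
  const pData = unnormFair + unnormBiased; // normalizing constant P(Data)

  return { fair: unnormFair / pData, biased: unnormBiased / pData };
}

// Start from the 50/50 prior and update after each flip, as the simulation does.
let beliefs: Beliefs = { fair: 0.5, biased: 0.5 };
for (const flip of ["H", "H", "T", "H"] as Flip[]) {
  beliefs = update(beliefs, flip);
  console.log(`after ${flip}: P(Fair | Data) = ${(beliefs.fair * 100).toFixed(1)}%`);
}
```

With these assumed numbers, a single heads would move P(Fair | Data) from 50.0% to 0.5 × 0.5 / (0.5 × 0.5 + 0.75 × 0.5) = 40.0%; each further flip repeats the same multiply-and-normalize step, using the previous posterior as the new prior.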