StatSquid

Squid-Speed Stats: The Good, the Bad, and the Gaussian

Bayes’ Theorem: Your Brain’s Hidden Update Button

Posted on September 9, 2025 by squid_admin

Our natural way of thinking was around long before math formulas showed up (no chicken-and-egg dilemma here). The fun part is that what we do instinctively while making decisions can actually be captured and explained with an elegant formula.

Research suggests the average adult makes around 35,000 decisions a day (source). Most of these decisions aren’t the result of deep analysis—they’re quick guesses based on whatever information we have at the moment. Naturally, when new information comes in, we adjust those guesses.

Take football, for example. Before kickoff, you might say your favorite team has an 80% chance of winning—based on their form, home advantage, and the fact that their star striker is on the pitch. But 30 minutes in, the striker gets a red card. Now the team is down to 10 players. Suddenly, that 80% feels way too optimistic, and maybe you revise it down to 69%.

This is exactly what Bayes’ theorem formalizes: updating your beliefs when reality throws new evidence at you. The first guess (80%) is called the prior probability. After the striker’s red card, the revised number (69%) becomes the posterior probability—your updated belief.

Let’s take a fun example. If you’ve seen Ocean’s 13, you’ll remember how Danny Ocean’s crew manipulates the dice rolls with magnetically rigged dice. When triggered with Zippo lighters that emit a magnetic pulse, the dice obediently land on snake eyes (a single dot on each die). Now, suppose hitting snake eyes pays out $30 in winnings plus your $1 bet back—so $31 total.

But why such a generous 30-to-1 payout? Simple: because snake eyes (1,1) is a rare event. The casino knows the odds are stacked heavily against you, so the payout should be enticing enough to tempt you into betting.

Now, suppose we roll two fair dice together. What’s the chance of landing a single dot on both—aka snake eyes (1,1)?
There’s 1 favorable outcome out of 36 total outcomes, so:

P(snake eyes | fair dice) = 1/36 ≈ 2.78%

Think of this as the ideal baseline.
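If you'd rather trust a computer than a counting argument, a quick Monte Carlo sketch in Python (not from the post; the variable names are mine) confirms the 1/36 baseline:

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# Simulate many rolls of two fair dice and count snake eyes (1, 1).
trials = 100_000
snake_eyes = sum(
    1 for _ in range(trials)
    if random.randint(1, 6) == 1 and random.randint(1, 6) == 1
)

print(f"Simulated P(snake eyes | fair dice): {snake_eyes / trials:.4f}")
print(f"Exact value 1/36:                    {1 / 36:.4f}")
```

With 100,000 rolls the simulated frequency should land within a fraction of a percent of 2.78%.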

Now add a twist straight out of Ocean’s 13: Danny Ocean’s crew has loaded the dice so that snake eyes show up 30% of the time. In that case:

P(snake eyes | loaded dice) = 30%

Now, imagine you’re the owner of the casino (yes, even casino owners need to study probability—otherwise they end up like the clueless ones in the movies). You know snake eyes is a rare event, but here’s the real business-saving question:

“If snake eyes show up, what’s the probability the dice are loaded?”

That’s where Bayes’ theorem comes in.

We start with the setup: the dice could be either fair or loaded, and we assume both are equally likely.

  • P(fair dice) = 50%
  • P(loaded dice) = 50%

For the fair dice, the math is simple: out of 36 possible outcomes, only one gives snake eyes.

P(snake eyes ∣ fair dice) = 1/36 ≈ 2.78%

Now let’s spice it up with Danny Ocean’s crew. They’ve tinkered with the dice so that snake eyes lands 30% of the time (about 11 out of 36).

P(snake eyes ∣ loaded dice) = 30%

Now let's compute the marginal probability of getting snake eyes, which is the sum of the probabilities of all the events through which snake eyes can occur. Here there are two such events: 1) the fair scenario: we get snake eyes given the dice are fair; 2) the loaded scenario: we get snake eyes given the dice are loaded.

I can rewrite the fair scenario as getting snake eyes and having fair dice, which is a joint probability of snake eyes and fair dice under statistical dependence:

P(snake-eyes and fair-dice) = P(snake-eyes ∣ fair-dice) ⋅ P(fair-dice)

Likewise, I can rewrite the loaded scenario as getting snake eyes and having loaded dice, a joint probability of snake eyes and loaded dice under statistical dependence:

P(snake-eyes and loaded-dice) = P(snake-eyes ∣ loaded-dice) ⋅ P(loaded-dice)

Marginal Probability of getting a snake-eyes:

P(snake-eyes) =  P(snake-eyes and fair-dice) + P(snake-eyes and loaded-dice)

= P(snake-eyes ∣ fair-dice) ⋅ P(fair-dice) + P(snake-eyes ∣ loaded-dice) ⋅ P(loaded-dice)

= (0.0278 × 0.5) + (0.3 × 0.5)

= 0.1639
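The same weighted sum takes only a few lines of Python (a sketch; the variable names are my own):

```python
# Priors: no reason to favor either hypothesis before the roll.
p_fair = 0.5
p_loaded = 0.5

# Likelihood of snake eyes under each hypothesis.
p_se_given_fair = 1 / 36      # ≈ 0.0278
p_se_given_loaded = 0.30

# Marginal (total) probability of snake eyes: sum over both worlds.
p_se = p_se_given_fair * p_fair + p_se_given_loaded * p_loaded
print(f"P(snake eyes) = {p_se:.4f}")  # ≈ 0.1639
```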

Bayes’ theorem formula:

Posterior = (Likelihood × Prior) / Evidence

“Given that I saw snake eyes, what’s the probability the dice are loaded?”

  • Prior: How likely I thought the dice were loaded before the roll
  • Likelihood: How likely snake eyes are if the dice are loaded
  • Evidence: How likely snake eyes are overall (loaded + fair)
  • Posterior: How likely the dice are loaded now, after seeing snake eyes

P(loaded-dice ∣ snake-eyes) = P(snake-eyes ∣ loaded-dice) ⋅ P(loaded-dice) / P(snake-eyes)

P(loaded-dice ∣ snake-eyes) = (0.3 × 0.5) / 0.1639

≈ 0.915

There’s about a 91.5% chance the dice were loaded, given that snake eyes occurred.
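Putting the whole chain together, here is a minimal Python sketch of Bayes' theorem for this example (the function and variable names are my own, not from the post):

```python
def posterior(prior: float, likelihood: float, evidence: float) -> float:
    """Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E)."""
    return likelihood * prior / evidence

p_loaded = 0.5                # prior: 50-50 before the roll
p_se_given_loaded = 0.30      # likelihood under the loaded hypothesis
p_se_given_fair = 1 / 36      # likelihood under the fair hypothesis

# Evidence: marginal probability of snake eyes across both hypotheses.
p_se = p_se_given_loaded * p_loaded + p_se_given_fair * (1 - p_loaded)

p_loaded_given_se = posterior(p_loaded, p_se_given_loaded, p_se)
print(f"P(loaded | snake eyes) ≈ {p_loaded_given_se:.3f}")  # ≈ 0.915
```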

So here’s the fun twist: Bayes’ theorem is just your natural way of guessing, written down as math.

Casino floor. Dice roll across the table.

  • Prior (the hunch): “Okay, before the roll, I’m 50–50. Could be fair dice, could be loaded. No reason to lean either way yet.”
  • Likelihood (the twist): Snake eyes appear. “Hmm… if these dice were loaded, that’s not surprising at all (30% chance). But if they were fair? That’s a miracle—only 1 in 36.”
  • Evidence (the big picture): “Alright, let’s weigh the two worlds together: fair + loaded. What’s the overall chance of seeing snake eyes in this casino?”
  • Posterior (the update): “Given what I just saw, my hunch flips. There’s now about a 91.5% chance these dice are loaded. Danny Ocean’s crew is at it again.”
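One nice consequence: Bayes' theorem chains. Today's posterior becomes tomorrow's prior, so if snake eyes comes up a second time, suspicion climbs even higher. A short Python sketch (my own extension of the example, not from the post):

```python
prior = 0.5                          # start 50-50, as before
p_se_loaded, p_se_fair = 0.30, 1 / 36

for roll in (1, 2):
    # Evidence: marginal probability of snake eyes given the current prior.
    evidence = p_se_loaded * prior + p_se_fair * (1 - prior)
    # The posterior after this roll becomes the prior for the next one.
    prior = p_se_loaded * prior / evidence
    print(f"After snake eyes roll {roll}: P(loaded) ≈ {prior:.3f}")
```

After one roll the probability is about 0.915; after a second consecutive snake eyes it jumps to roughly 0.99.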

Tags: bayes theorem, conditional probability, data science, machine learning, posteriors, priors, probability, statistics, uncertainty, decision making
