Mental models in DFS: Part 5, Bayesian Updating

Original post on RotoGrinders

Quick pointer: I found this one slightly harder to write & wrap my head around, so any comments guiding me towards a more accurate conclusion would be appreciated!

What is Bayesian Updating?

“A statistics professor who travels a lot was concerned about the possibility of a bomb onboard his plane. He determined the probability of this and found it to be low, but not low enough. So now he always travels with a bomb in his suitcase. He reasons that the probability of two bombs being onboard would be infinitesimal.”

This is a classic joke among mathematicians to explain Bayesian Updating. In short, the theory of Bayesian Updating concerns how we should adjust probabilities when we encounter new data.

It takes its roots from an essay published in the 18th century by Thomas Bayes, an English minister. The essay, called “An Essay towards Solving a Problem in the Doctrine of Chances”, only surfaced two years after the minister’s death, but it gave birth to the theory.

That theory rests on a basic premise, namely: take the odds of an event happening & adjust for new information. The stronger your prior knowledge about an event, the more accurate the probability you produce after Bayesian Updating is likely to be.

Think of it like this:
Initial Beliefs + Recent Objective Data = A New and Improved Belief

 

Although Bayesian Updating, or Bayesian “Inference”, is now commonly used in statistics & engineering, you don’t necessarily have to take a maths-based approach to situations to get the most out of this mental model.

Using Bayesian Updating in sports betting

If you do want to take a methodical approach to Bayesian Updating when making smart decisions, consider the following formula, put forward by Pierre-Simon Laplace, a French mathematician & astronomer, which boils Bayes’ theorem down to:

P(A|B) = P(B|A) x P(A) / P(B)

If you want to know the probability of A when you know that B is also present (given), you can get the answer by multiplying your prior estimation of A (Probability of A) by how much more likely B is when A is present (i.e. P(B|A)/P(B)).

Let’s take a simple football match betting example to start. Let’s say it’s Chelsea v Man Utd. Chelsea have an overall head-to-head win percentage of 31% against Man Utd. We also know that when Chelsea win against Utd it rains 11% of the time, compared to the usual likelihood of rain in a Chelsea match of 10%. So:

  • P(A) = probability that Chelsea beats Man Utd = 31%
  • P(B) = Probability of rain in a Chelsea match = 10%
  • P(B|A) = Probability of rain in a football game given that Chelsea beat Utd = 11%

Let’s now imagine it’s game day & the weather forecast is rain. A simple Bayesian update will show you that P(A|B)=P(A)*P(B|A)/P(B)= 31%*11%/10%= 34.1%.

So, given the rainy forecast, Chelsea’s chance of winning updates from 31% to 34.1%.
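The arithmetic above is simple enough to sketch in a few lines of Python. (The figures are the hypothetical ones from the example, not real match data.)

```python
def bayes_update(prior_a, prob_b, prob_b_given_a):
    """Return P(A|B) = P(B|A) * P(A) / P(B)."""
    return prob_b_given_a * prior_a / prob_b

# Hypothetical figures from the Chelsea v Man Utd example
p_win = 0.31             # P(A): Chelsea beat Man Utd
p_rain = 0.10            # P(B): rain in a Chelsea match
p_rain_given_win = 0.11  # P(B|A): rain when Chelsea beat Utd

posterior = bayes_update(p_win, p_rain, p_rain_given_win)
print(f"{posterior:.1%}")  # → 34.1%
```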

Using Bayesian Updating in DFS

You don’t have to go through all this when playing DFS. All that’s important is understanding the underlying logic that when new information is presented, you should update your existing assumptions.

It can be as simple as noticing a last-minute formation change by a manager from 4-4-2 to 3-5-2 & asking yourself whether the midfielder you’ve initially drafted will be as effective in this formation.
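That "posterior becomes the new prior" chain can also be expressed in code. The sketch below uses the same Bayes formula as the betting example; all the numbers (the 40% baseline, the formation-change and rain rates) are invented purely for illustration, not real DFS data:

```python
def bayes_update(prior, likelihood, evidence):
    """Return P(A|B) = P(B|A) * P(A) / P(B)."""
    return likelihood * prior / evidence

# Entirely hypothetical: your midfielder's chance of returning value,
# updated as news arrives before lock.
p_value = 0.40  # initial belief

# News 1: manager switches from 4-4-2 to 3-5-2. Suppose such a switch
# occurs in 30% of his matches overall, but in 36% of the matches
# where he returns value.
p_value = bayes_update(p_value, 0.36, 0.30)

# News 2: forecast turns to heavy rain. Suppose heavy rain hits 10%
# of matches, but only 8% of his value games came in heavy rain.
p_value = bayes_update(p_value, 0.08, 0.10)

print(f"{p_value:.1%}")  # → 38.4%
```

Each update feeds its output back in as the next prior, which is exactly the "new information arrives, assumptions get revised" loop described above.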

However, Bayesian Updating does have its limitations. Just because the volume of information increases every day doesn’t make the future any more predictable.

Shane Parrish recalls the example of a turkey who is fed every day. With each feeding, the turkey grows more confident that life is a breeze and that the friendly humans will feed him every day. From the outside, we know the real reason: he’s being prepared for slaughter.