The Bayesian Approach to Data Analysis and Prediction: Implications for the Super Bowl
From Pure Logic to Common Sense
Howard Rankin PhD, Science Director, IntualityAI
In today’s data-driven world, it is easy to be impressed by an apparently accurate statistic from one study or another. For example, in OJ Simpson’s murder trial, Simpson’s blood matched blood found at the crime scene, but those blood characteristics were shared by 1 in 400 people. Simpson’s defense lawyers argued that 1 in 400 was a misleading statistic: it implied that thousands of people shared the same characteristics, and therefore the match should not be considered evidence that he committed the crime.
In the same trial, the prosecution argued that Simpson had been violent towards his wife. The defense countered that only one woman was killed for every 2,500 who were victims of spousal abuse, and that the argument was therefore unreasonable.
Using Bayesian logic, Gerd Gigerenzer, a leader in the field of cognitive bias, suggested that more context was needed for an accurate picture. Gigerenzer pointed out that “Simpson's wife had not only been subjected to domestic violence, but rather subjected to domestic violence (by Simpson) and killed (by someone).”
Gigerenzer concluded, “the chances that a batterer actually murdered his partner, given that she has been killed, is about 8 in 9 or approximately 90%.”
Figure: Frequency tree of 100,000 battered American women, showing the base-rate fallacy made by the defense in the OJ Simpson trial.
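Gigerenzer’s frequency-tree argument is easy to reproduce. The counts below are illustrative round numbers in the spirit of his tree, assumed for this sketch (the published figures vary slightly by source):

```python
# Illustrative Gigerenzer-style frequency tree (round numbers assumed).
battered = 100_000           # battered American women in a given year
killed_by_partner = 40       # roughly 1 in 2,500: the defense's own base rate
killed_by_other = 5          # murdered that year, but not by the batterer

# The defense asked: what fraction of battered women are killed by their
# partner? (40 out of 100,000: tiny.)
# Gigerenzer asked the Bayesian question: GIVEN that a battered woman was
# killed, how likely is it that the batterer did it?
p = killed_by_partner / (killed_by_partner + killed_by_other)
print(f"P(partner did it | battered and killed) = {p:.0%}")  # about 8 in 9
```

Conditioning on the right event (she *was* killed) flips a tiny base rate into an overwhelming probability, which is exactly the fallacy the tree exposes.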
Gigerenzer was using a Bayesian approach; the defense, a frequentist one.
What’s the difference?
There is information, and there is more information. When do you stop looking for more information and greater context?
Explaining Bayesian vs. Frequentist Approaches
In simple language, the Bayesian approach to data analysis treats probability as how confident you are about something based on prior knowledge and new evidence. It allows you to update your beliefs as new data comes in.
The frequentist approach, on the other hand, sees probability as the long-term frequency of events. It does not incorporate prior beliefs; it focuses on the data from the current study and does not update as new data arrives.
Imagine you're trying to guess the outcome of a coin flip. Here’s a simple way to understand the two approaches:
Bayesian Approach
You start with a prior belief about the coin (say, that it is probably fair) and revise that belief after every flip. If heads keeps turning up, your confidence that the coin is biased grows with each new result.

Frequentist Approach
You flip the coin many times and take the long-run proportion of heads as the probability. No prior belief enters the calculation; the answer comes entirely from the data collected in that experiment.
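The coin-flip contrast can be put side by side in a few lines. The data (10 flips, 7 heads) and the Beta(5, 5) prior are hypothetical choices for this sketch; the Beta prior is one standard way to encode a belief about a coin, not the only one:

```python
# Hypothetical data: 10 flips of a coin, 7 of them heads.
flips, heads = 10, 7

# Frequentist estimate: the observed proportion in this sample.
freq_estimate = heads / flips                      # 0.7

# Bayesian estimate: encode a prior belief that the coin is probably
# fair as a Beta(5, 5) distribution over the heads probability, then
# update with the data. A Beta(a, b) prior plus the data yields a
# Beta(a + heads, b + tails) posterior, whose mean is the estimate.
a, b = 5, 5
bayes_estimate = (a + heads) / (a + b + flips)     # (5 + 7) / 20 = 0.6

print(freq_estimate, bayes_estimate)
```

With more flips, the two estimates converge; the prior matters most when evidence is scarce.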
In a Nutshell
Both methods help us make sense of data, but they do it in different ways: one by incorporating what you already know (Bayesian), and the other by focusing only on the data you collect (frequentist).
IntualityAI uses the Bayesian method in its analysis. This allows for constant updating of information and provides a more realistic view than information derived from one source at one particular time.
A simple example would be predicting a football game, like the Super Bowl.
Using a frequentist approach, an announcer will say something like “The Chiefs haven’t lost to the Eagles in ten years,” and leave it at that.
A Bayesian announcer would instead say, “While the Chiefs haven’t lost to the Eagles in ten years, the teams have changed a lot over the decade, and the Chiefs have several major injuries tonight that alter their prospects.” And that prediction will keep changing as new information comes in: further injury reports and other variables that could influence the outcome.
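That kind of running revision is just Bayes’ rule applied repeatedly. A minimal sketch, with a made-up prior and made-up likelihood ratios for the incoming news (this is purely illustrative, not IntualityAI’s actual model):

```python
def bayes_update(prob, likelihood_ratio):
    """Update a probability via Bayes' rule in odds form:
    posterior odds = prior odds * likelihood ratio."""
    odds = prob / (1 - prob) * likelihood_ratio
    return odds / (1 + odds)

p = 0.65                   # prior: Chiefs favored on the ten-year record
p = bayes_update(p, 0.5)   # injury report halves the odds of a Chiefs win
p = bayes_update(p, 1.3)   # a later favorable update nudges the odds back up
print(round(p, 3))         # the prediction has drifted below the naive 0.65
```

Each piece of news multiplies the current odds, so the prediction is never frozen at the ten-year record; it tracks the evidence.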
(Incidentally, IntualityAI has the best record of NFL predictions over the past decade, at around 60%. You can check it out here: https://intualityai.com/sports-betting/)
Clearly, the Bayesian approach offers a much more nuanced view of any situation, going beyond data accumulated at just one point in time.
In a foundational paper on the distinction between Bayesian and frequentist approaches, Oaksford and Chater (2009) conclude:
“Bayesian Rationality is part of a larger movement across the brain and cognitive sciences – a movement which sees cognition as centrally concerned with uncertainty; and views Bayesian probability as the appropriate machinery for dealing with uncertainty. Probabilistic ideas have become central to theories of elementary neural function, motor control, perception, language processing and high-level cognition.”
“In arguing that commonsense reasoning should be understood in terms of probability, we are merely recasting Laplace’s (1814/1951) classic dictum concerning the nature of probability theory: ‘The theory of probabilities is at bottom nothing but common sense reduced to calculus.’”
Reference
Oaksford, M., & Chater, N. (2009). Précis of Bayesian Rationality: The Probabilistic Approach to Human Reasoning. Behavioral and Brain Sciences, 32, 69–120.