October 22, 2009

# What’s Wrong with Probability Notation?

Sometimes I wonder why so many humans (me included) have trouble understanding probability. In cognitive science, probabilistic models are taking over most areas, and yet most people struggle with them. Could it be that the notation itself is just hard to swallow? *What’s Wrong with Probability Notation?* is a magnificent post that gives some basic reasons:

> The first two issues arise in the usual expression of the first step of Bayes’s rule,
>
> $p(x|y) = p(y|x)p(x) / p(y)$,
>
> where each of the four uses of $p()$ corresponds to a different probability function! In computer science, we’re used to using names to distinguish functions. So $f(x)$ and $f(y)$ are the same function $f$ applied to different arguments. In probability notation, $p(x)$ and $p(y)$ are different probability functions, picked out by their arguments.
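The overloading becomes obvious if you write Bayes’s rule the way a programmer would, giving each probability function its own name. A minimal sketch in Python, using a hypothetical disease-test example (the prevalence and accuracy numbers are made up for illustration):

```python
def p_x(x):
    # prior: p(disease) -- hypothetical prevalence of 1%
    return {True: 0.01, False: 0.99}[x]

def p_y_given_x(y, x):
    # likelihood: p(test result | disease status)
    table = {(True, True): 0.90, (False, True): 0.10,
             (True, False): 0.05, (False, False): 0.95}
    return table[(y, x)]

def p_y(y):
    # marginal: p(test result), summing the joint over x
    return sum(p_y_given_x(y, x) * p_x(x) for x in (True, False))

def p_x_given_y(x, y):
    # posterior via Bayes's rule: p(x|y) = p(y|x) p(x) / p(y)
    return p_y_given_x(y, x) * p_x(x) / p_y(y)
```

Here the four distinct functions hiding behind the single symbol $p$ get four distinct names, which is exactly the convention the quoted post says probability notation abandons.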

This is one clear communication problem. Ideally we want more people to be able to follow probabilistic reasoning. Doctors, judges, and others all show significant struggles when given probabilities (see, e.g., *Helping Doctors and Patients Make Sense of Health Statistics*).

But how do we tackle this problem? Changing notation is easier said than done. In fact, anyone departing from traditional notation will have to convince reviewers that their notation is better… and risk making a less-than-ideal impression.

Any ideas?


### 3 Responses to “What’s Wrong with Probability Notation?”

1. Andre Ariew says:

Fortunately, there is an easy answer to your question. In a phrase: “natural frequencies”. Gerd Gigerenzer wrote a book about them and about how to work through Bayesian problems very easily. I teach a large-section critical thinking course (based in philosophy) at the University of Missouri, and this is one of the things I teach students because, after all, every one of them will need Bayes to interpret some sort of medical emergency.
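The natural-frequencies idea the comment mentions can be sketched by restating a Bayesian problem as counts in an imagined population instead of conditional probabilities. A minimal example, with hypothetical prevalence and test-accuracy numbers chosen only for illustration:

```python
# Restate "1% prevalence, 90% sensitivity, 5% false-positive rate"
# as whole-person counts in a population of 10,000.
population = 10_000
sick = round(population * 0.01)           # people with the condition
healthy = population - sick               # people without it

sick_positive = round(sick * 0.90)        # true positives
healthy_positive = round(healthy * 0.05)  # false positives

# "Of everyone who tests positive, how many are actually sick?"
posterior = sick_positive / (sick_positive + healthy_positive)
print(f"{sick_positive} of {sick_positive + healthy_positive} "
      f"positive tests come from sick people ({posterior:.1%})")
```

The arithmetic is the same as Bayes’s rule, but the question becomes a count comparison, which is the form Gigerenzer argues people reason about far more reliably than conditional-probability notation.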