When I was on holiday, I started reading Daniel Kahneman's bestselling book "Thinking, Fast and Slow". What struck me while reading it, having just started the statistics-heavy population medicine block, was how much weight we place on biases, and how difficult it is to think in terms of statistics.
The book starts out by discussing what he and his late colleague Amos Tversky called "expert intuition": how we can make good judgments in split seconds, based on cues that give us access to information stored in memory. In familiar situations, our instincts guide us to act in certain ways, usually to good effect. Sometimes, however, our intuitions can be way off.
This is where statistics come into play.
In one of last week's lectures, our professor taught us something quite thought-provoking. Say you have a population with a 30% prevalence of mastitis, an inflammation of the breast or udder tissue often found in cows. Now test 100 cows with a diagnostic test that has 90% sensitivity (it correctly flags 90% of the diseased animals) and 80% specificity (it correctly clears 80% of the healthy ones). Of the 70 healthy cows, 14 would get a false positive result, and of the 30 cows with mastitis, 3 would get a false negative.
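The lecture's arithmetic can be sketched in a few lines. This is a minimal illustration using the figures above (100 cows, 30% prevalence, 90% sensitivity, 80% specificity); the variable names are mine, not from the lecture:

```python
# Diagnostic test arithmetic for the mastitis example.
# Assumed values from the lecture: 100 cows, 30% prevalence,
# 90% sensitivity, 80% specificity.
population = 100
prevalence = 0.30
sensitivity = 0.90  # P(test positive | diseased)
specificity = 0.80  # P(test negative | healthy)

diseased = population * prevalence           # 30 cows with mastitis
healthy = population - diseased              # 70 healthy cows

true_positives = diseased * sensitivity      # 27 correctly flagged
false_negatives = diseased - true_positives  # 3 missed cases
true_negatives = healthy * specificity       # 56 correctly cleared
false_positives = healthy - true_negatives   # 14 healthy cows flagged

# Predictive values: what a positive or negative result actually means.
ppv = true_positives / (true_positives + false_positives)
npv = true_negatives / (true_negatives + false_negatives)

print(f"False positives: {false_positives:.0f}")  # 14
print(f"False negatives: {false_negatives:.0f}")  # 3
print(f"PPV: {ppv:.1%}")  # ~65.9%
print(f"NPV: {npv:.1%}")  # ~94.9%
```

The striking part is the positive predictive value: even with a fairly accurate test, roughly one in three positive results in this herd would be wrong, because the 14 false positives from the large healthy group sit alongside only 27 true positives.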
By taking the initial test results at face value, you'd run the risk of misdiagnosing many individuals. When our professor demonstrated this, I realized why understanding probability and statistics is a crucial part of medicine. Although I've spent all day banging my head against the wall trying to make sense of all these tables and diagrams, I can appreciate how instrumental this will be for me in the future as an aid to critical thinking.