One of the most well-worn examples in introductions to Bayesian reasoning is testing for rare diseases: if the prior probability that a patient has a disease is sufficiently low, the probability that the patient has the disease conditional on a positive diagnostic test result may also be low, even for very accurate tests. One might hope that every epidemiologist would be familiar with this textbook problem, but this New York Times story suggests otherwise:
For months, nearly everyone involved thought the medical center had had a huge whooping cough outbreak, with extensive ramifications. […]
Then, about eight months later, health care workers were dumbfounded to receive an e-mail message from the hospital administration informing them that the whole thing was a false alarm.
Now, as they look back on the episode, epidemiologists and infectious disease specialists say the problem was that they placed too much faith in a quick and highly sensitive molecular test that led them astray.
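To see how a "highly sensitive" test can still mislead, here is a minimal Bayes'-theorem sketch in Python. The prevalence, sensitivity, and specificity figures are illustrative assumptions for the textbook version of the problem, not numbers from the story:

```python
def posterior_given_positive(prior, sensitivity, specificity):
    """P(disease | positive test) via Bayes' theorem."""
    p_pos_given_disease = sensitivity            # true positive rate
    p_pos_given_healthy = 1.0 - specificity      # false positive rate
    # Total probability of a positive result across sick and healthy patients:
    p_positive = prior * p_pos_given_disease + (1.0 - prior) * p_pos_given_healthy
    return prior * p_pos_given_disease / p_positive

# Hypothetical figures: 1-in-1,000 prevalence, a test with 99%
# sensitivity and 95% specificity.
print(posterior_given_positive(prior=0.001, sensitivity=0.99, specificity=0.95))
# ~0.019: even after a positive result, the patient almost certainly
# does not have the disease, because false positives among the many
# healthy patients swamp the true positives among the few sick ones.
```

Note that the test's impressive sensitivity does almost no work here; with a low prior, the posterior is dominated by the base rate and the false positive rate.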
While medical professionals can modestly improve their performance on inventories of cognitive bias when coached, we should not overestimate the extent to which formal instruction, such as statistics or epidemiology classes, improves actual behavior in the field.
Steve Sailer claims that he almost died because doctors thought whooping cough was extinct, when they were actually thinking of whooping cranes. He suspects the same confusion between similar-sounding words is at work with Iran/Iraq and Sunnis/Shiites.
Institutional incentives would certainly help avoid pseudo-epidemics, both by motivating doctors to expend mental effort searching for reasons not to proceed and by allocating decision-making power to better Bayesians.
However, the example remains troubling in a world where constructing such incentives is costly, and it is very difficult to incentivize careful thinking across all domains. Doctors unable to generalize from their (mandatory!) study of statistics to an almost identical real-life situation, like the scientist who is a mystic outside the laboratory (http://www.overcomingbias.c...), cannot be fully trusted in any but the most carefully structured and incentivized transactions, which are rare in medicine because of information asymmetries.