Seth Roberts writes this morning:
How can you tell when an expert is exaggerating? His lips move.
Some people will misunderstand his point. Roberts is not saying experts exaggerate their conclusions per se. He’s saying experts exaggerate their confidence in their conclusions.
If an expert says that playing a harmonica decreases your risk of influenza by 10%, she’s probably not making that figure up out of thin air (though I am). There probably was some data that implied the 10% figure. It’s not that the data suggested 5% and the scientist said “Let’s call it 10%.” But the quality and quantity of the data may not justify rushing out to buy a harmonica.
One reason experts exaggerate their confidence is that they may be at a loss for words to explain their degree of uncertainty to a popular audience. Journalists can understand “Harmonica playing is good for you” though they probably cannot understand confidence intervals, Bayes factors, or the differences between retrospective and prospective experiments. (The experts may not really understand these things either.)
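To make the confidence-interval point concrete, here is a minimal sketch with entirely made-up numbers (continuing the fictitious harmonica example above, not data from any real study): a trial in which 18 of 100 harmonica players and 20 of 100 controls catch the flu yields the headline 10% risk reduction, but the interval around it is enormous.

```python
# Hypothetical numbers, not from any real study: 100 harmonica players
# (18 got the flu) versus 100 controls (20 got the flu).
from math import exp, log, sqrt

def relative_risk_ci(a, n1, b, n2, z=1.96):
    """Relative risk with a 95% Wald confidence interval on the log scale."""
    rr = (a / n1) / (b / n2)
    se = sqrt(1/a - 1/n1 + 1/b - 1/n2)
    return rr, exp(log(rr) - z * se), exp(log(rr) + z * se)

rr, lo, hi = relative_risk_ci(18, 100, 20, 100)
print(f"RR = {rr:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
# The point estimate is a 10% risk reduction (RR = 0.90), but the interval
# runs from roughly a 49% reduction to a 60% increase.
```

An honest summary of such data would be “we can’t tell whether harmonicas help or hurt,” which is exactly the kind of sentence that never makes it into a headline.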
Another reason for exaggeration is that you don’t get the attention of the press by making tentative claims. This creates an incentive to suppress uncertainty. But even if experts were transparent regarding their uncertainty, there would still be a selection bias: experts who are sincerely more confident are more likely to be heard.
I liked the first comment on Roberts’ post:
I tended to rate my colleagues partly by how often the words “I don’t know” passed their lips. Often = good.