“There is no point in being precise when you don’t know what you’re talking about.” — John Tukey
It’s a familiar trope in science fiction that the smartest character will answer questions with excess precision. On Star Trek, Scotty might give a number to one significant figure and Spock will correct him, giving the same result to four significant figures.
The trope works on two levels. The innumerate viewer will think “Wow, the smart guy is really smart! He knows a lot more than the other guy.” The mathematically savvy viewer will see it as a kind of joke, intentional or unintentional. In the Star Trek series, I assume the writers are winking at the audience when precision is excessive. If Scotty says the ship will blow up in 20 seconds, there’s no point in Spock replying 19.81 seconds, because it would take more than 0.19 seconds for him to state his correction.
Excessive precision is not the mark of the expert. Nor is it the mark of the layman. It’s the mark of the intern.
If you ask for the circumference of a circle that is about a mile across, the expert and the layman will both say about three miles. The intern will pull out a calculator and say 3.14159265 miles.
When finding a circumference from a diameter, it’s obvious that the relative error in each is the same; multiplying by a constant like π doesn’t change the relative error. But it’s usually harder to tell how input precision and output precision are related. Assessing the accuracy of an answer is often a more sophisticated problem than coming up with an answer.
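The invariance of relative error under multiplication by a constant can be sketched numerically. A hypothetical example, assuming a true diameter of 1 mile measured as 1.05 miles (a 5% error):

```python
import math

# Hypothetical true and measured diameters, in miles
true_d, measured_d = 1.0, 1.05
true_c, measured_c = math.pi * true_d, math.pi * measured_d

rel_err_d = abs(measured_d - true_d) / true_d
rel_err_c = abs(measured_c - true_c) / true_c

# Multiplying by the constant pi leaves the relative error unchanged
# (up to floating-point rounding): both are 5%.
print(rel_err_d, rel_err_c)
```

The cancellation is exact in the algebra: |πd̂ − πd| / (πd) = |d̂ − d| / d, since π factors out of numerator and denominator.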
7 thoughts on “Excessive precision”
Do you want to be approximately correct or precisely wrong?
Back in my Stat arb days, we used to say: “You’re applying a degree of specificity that the problem doesn’t warrant.”
The annoying intern doesn’t pull out a calculator, but instead recites pi to 50 places.
I think one issue is that our usual means of specifying precision are themselves imprecise. What does it mean when we say a distance is “about one mile”? The answer, of course, depends on the context. I would guess that some people saying that mean something like “between 0.5 and 2 miles”, whereas others mean “between 0.9 and 1.1 miles”. As I understand the rules for significant figures, some uncertainty in the last digit is allowed, so some people who say “about one mile” could in fact say “about 1.0 miles”.
In many cases, it might be better to communicate precision using intervals, rather than significant figures or relative/absolute error. The average person might not understand “20 with a relative error of 10%”, but probably would understand “between 18 and 22”, and could likely tell you that “between 18 and 22” plus “between 14 and 16” is “between 32 and 38”.
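The interval arithmetic the commenter describes is easy to make concrete. A minimal sketch, assuming intervals are represented as (low, high) pairs; the helper name `add_intervals` is made up for illustration:

```python
def add_intervals(a, b):
    """Add two intervals given as (low, high) tuples.

    The sum of a number between a[0] and a[1] and a number
    between b[0] and b[1] lies between the sums of the endpoints.
    """
    return (a[0] + b[0], a[1] + b[1])

# "Between 18 and 22" plus "between 14 and 16" is "between 32 and 38".
print(add_intervals((18, 22), (14, 16)))  # (32, 38)
```

Note that other operations are less transparent to the average person: for multiplication, the endpoints of the product interval come from the extremes of the four pairwise endpoint products.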
“It is the mark of an educated man to look for precision in each class of things just so far as the nature of the subject admits.” — Aristotle, Nicomachean Ethics (W. D. Ross translation)
@Nathan – as a theoretical chemist, when I say about 1 mile, I mean order of magnitude – anywhere between 1/3 of a mile and 3 miles is “about 1 mile” to me. When I’m hiking, I prefer “about 1 mile” to mean 1.0 miles. :)
A similar anecdote was once relayed to me to illustrate the situation quite clearly even to non-scientists and interns.
A man walked into an antiques store and got interested in an old vase. He asked the dealer how old the vase was. The dealer replied, “2003 years.” When asked how he could possibly have such a precise answer, the dealer replied: “Well, I bought it 3 years ago and was told at the time that it was 2000 years old.”