Scientific American has an article suggesting that our natural sense of numbers may operate on a logarithmic scale rather than a linear scale.

It has long been known that our senses often work on a logarithmic scale. For example, sound intensity is measured in decibels, a logarithmic scale. Pitch is perceived on a logarithmic scale, as the Pythagoreans discovered. When moving up a chromatic scale, it’s not the *differences* in frequencies but the *ratios* of frequencies that are constant. An octave is a ratio of 2 to 1, so a half step is a ratio of 2^{1/12} to 1, since there are 12 half steps in an octave.
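The constant-ratio property of equal temperament is easy to check numerically. Here is a small sketch building a chromatic scale up from A440 (the choice of A440 as a starting pitch is just for illustration):

```python
import math

# Equal temperament: each half step multiplies the frequency by 2**(1/12),
# so 12 half steps give exactly one octave (a ratio of 2).
half_step = 2 ** (1 / 12)

a4 = 440.0  # concert A, in Hz
chromatic = [a4 * half_step ** n for n in range(13)]

# Consecutive *ratios* are constant; consecutive *differences* are not.
ratios = [hi / lo for lo, hi in zip(chromatic, chromatic[1:])]
diffs = [hi - lo for lo, hi in zip(chromatic, chromatic[1:])]

print(round(chromatic[-1], 3))                              # one octave up: 880 Hz
print(all(abs(r - half_step) < 1e-12 for r in ratios))      # True
print(round(diffs[0], 2), round(diffs[-1], 2))              # differences grow up the scale
```

Note that the differences between adjacent notes grow as you go up the scale, even though every pair of adjacent notes sounds the same distance apart.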

The Statistical Modeling, Causal Inference, and Social Science blog gives an interesting example combining linear and logarithmic perceptions. They quote a study suggesting that when deciding whether to walk to a new well based on information regarding arsenic levels, Bangladeshis perceived “distance to nearest safe well” linearly but perceived “arsenic level” logarithmically.
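The study's finding can be caricatured with a toy decision rule. Everything below is a hypothetical illustration, not the study's actual model: the function names and the cost coefficient are made up. The point is that a logarithmic perception of arsenic means the *ratio* of arsenic levels drives the decision, while a linear perception of distance means each extra meter counts the same.

```python
import math

# Toy model (hypothetical coefficients, not from the study): a person
# switches wells when the perceived benefit of lower arsenic outweighs
# the perceived cost of the extra walk.

def perceived_benefit(arsenic_old, arsenic_new):
    # Logarithmic perception: only the ratio of levels matters.
    return math.log(arsenic_old / arsenic_new)

def perceived_cost(distance_m, cost_per_meter=0.005):
    # Linear perception: each extra meter costs the same.
    return distance_m * cost_per_meter

def would_switch(arsenic_old, arsenic_new, distance_m):
    return perceived_benefit(arsenic_old, arsenic_new) > perceived_cost(distance_m)

# Halving arsenic is worth the same perceived benefit whether levels
# drop from 100 to 50 or from 10 to 5.
print(would_switch(100, 50, 100))  # True
print(would_switch(10, 5, 100))    # True: same ratio, same decision
print(would_switch(100, 50, 200))  # False: the walk is too long
```

Under this sketch, doubling the walking distance doubles the perceived cost, but doubling the arsenic level only adds a fixed increment of perceived risk.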

Nice post. I just stumbled on it from your tweet.

I’m no expert in micro-economics or psychophysics, but:

I’d guess that it depends on the relationship between the raw variable and utility.

I can see how distance travelled could be roughly linearly related to time for Bangladeshis, and how the utility of time saved could be roughly linear in the amount of time saved.

Likewise, presumably human perceptual systems are designed to logarithmically transform raw stimuli (e.g., sound waves, light intensity) because the adaptive utility of distinctions operates logarithmically (i.e., small differences matter at low levels; only big differences matter at high levels).

It’s also interesting to think about situations where the transformation is not adaptive.