
What number is halfway between 1 and 9? Is it 5 — or 3?

A new information-theoretic model of human sensory perception and memory sheds light on some peculiarities of the nervous system.
Graphic: Christine Daniloff

Ask adults from the industrialized world what number is halfway between 1 and 9, and most will say 5. But pose the same question to small children, or people living in some traditional societies, and they're likely to answer 3.

Cognitive scientists theorize that that's because it's actually more natural for humans to think logarithmically than linearly: 3⁰ is 1, and 3² is 9, so logarithmically, the number halfway between them is 3¹, or 3. Neural circuits seem to bear out that theory. For instance, psychological experiments suggest that multiplying the intensity of some sensory stimuli causes a linear increase in perceived intensity.
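
For readers who want the arithmetic spelled out, here is a minimal sketch in Python (not drawn from the paper) contrasting the two notions of "halfway." The logarithmic midpoint of two numbers is simply their geometric mean:

```python
import math

def linear_midpoint(a, b):
    # Arithmetic mean: halfway on a linear scale.
    return (a + b) / 2

def log_midpoint(a, b):
    # Midpoint of log(a) and log(b), mapped back to the original scale.
    # This is the geometric mean, sqrt(a * b).
    return math.exp((math.log(a) + math.log(b)) / 2)

print(linear_midpoint(1, 9))  # 5.0
print(log_midpoint(1, 9))     # ≈ 3.0, i.e. 3^1, halfway between 3^0 and 3^2
```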

In a paper that appeared online last week in the Journal of Mathematical Psychology, researchers from MIT's Research Laboratory of Electronics (RLE) use the techniques of information theory to demonstrate that, given certain assumptions about the natural environment and the way neural systems work, representing information logarithmically rather than linearly reduces the risk of error.

The new work was led by John Sun, a graduate student in Vivek Goyal's Signal Transformation and Information Representation (STIR) Group at RLE. Joining Sun and Goyal on the paper are Lav Varshney, a researcher at IBM's Watson Research Center and a former graduate student in Goyal's group, who remains a research affiliate at RLE, and Grace Wang, formerly a neurophysiologist at the Massachusetts Eye and Ear Infirmary.

Usually, STIR members research signal-processing problems in areas such as optical imaging or magnetic resonance imaging. So publishing a paper in a psychology journal might seem a little out of character.

"Although this problem seems very removed from what we do naturally, it's actually not the case," Sun says. "We do a lot of media compression, and media compression, for the most part, is very well-motivated by psychophysical experiments. So when they came up with MP3 compression, when they came up with JPEG, they used a lot of these perceptual things: What do you perceive well, what don't you perceive well?"

One of the researchers' assumptions is that if you were designing a nervous system for humans living in the ancestral environment — with the aim that it accurately represent the world around them — the right type of error to minimize would be relative error, not absolute error. After all, being off by four matters much more if the question is whether there are one or five hungry lions in the tall grass around you than if the question is whether there are 96 or 100 antelope in the herd you've just spotted.
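
A few lines of illustrative Python (again, not taken from the paper) make the distinction concrete: the two mistakes in the example above have the same absolute error but very different relative error.

```python
def absolute_error(true_value, estimate):
    return abs(true_value - estimate)

def relative_error(true_value, estimate):
    return abs(true_value - estimate) / true_value

# Mistaking 5 lions for 1, versus 100 antelope for 96:
print(absolute_error(5, 1), relative_error(5, 1))        # 4, 0.8  (off by 80%)
print(absolute_error(100, 96), relative_error(100, 96))  # 4, 0.04 (off by 4%)
```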

The STIR researchers demonstrate that if you're trying to minimize relative error, using a logarithmic scale is the best approach under two different conditions: One is if you're trying to store your representations of the outside world in memory; the other is if sensory stimuli in the outside world happen to fall into particular statistical patterns.

If you're trying to store data in memory, a logarithmic scale is optimal if there's any chance of error in either storage or retrieval, or if you need to compress the data so that it takes up less space. The researchers believe that one of these conditions probably pertains — there's evidence in the psychological literature for both — but they're not committed to either. They do feel, however, that the pressures of memory storage probably explain the natural human instinct to represent numbers logarithmically.
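
The article doesn't reproduce the paper's analysis, but the intuition is easy to simulate: store a quantity using a fixed number of "memory" levels, once spaced linearly and once spaced logarithmically, and compare the worst-case relative error. The 32-level budget and the [1, 1000] range below are arbitrary choices made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
levels = 32                              # hypothetical memory budget: 32 levels
values = rng.uniform(1, 1000, 10_000)    # stimuli spanning a wide range

def quantize(values, edges):
    # Map each value to the center of the bin it falls into.
    centers = np.sqrt(edges[:-1] * edges[1:])   # geometric bin centers
    idx = np.clip(np.digitize(values, edges) - 1, 0, len(centers) - 1)
    return centers[idx]

lin_edges = np.linspace(1, 1000, levels + 1)   # uniform (linear) spacing
log_edges = np.logspace(0, 3, levels + 1)      # uniform spacing in log space

def max_relative_error(x, recalled):
    return np.max(np.abs(x - recalled) / x)

print("linear quantizer:", max_relative_error(values, quantize(values, lin_edges)))
print("log quantizer:   ", max_relative_error(values, quantize(values, log_edges)))
```

With linear spacing, small values suffer huge relative errors; with logarithmic spacing, the relative error stays roughly constant across the whole range.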

In their paper, the MIT researchers also look at the statistical patterns that describe volume fluctuations in human speech. As it turns out, those fluctuations are well approximated by a normal distribution — a bell curve — but only if they're represented logarithmically. Under such circumstances, the researchers show, logarithmic representation again minimizes the relative error.
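
A quick simulation, using a generic log-normal distribution as a stand-in for real speech measurements, shows the same effect: the raw values are heavily skewed, while their logarithms form a symmetric bell curve.

```python
import numpy as np

rng = np.random.default_rng(1)
# Stand-in for volume fluctuations: a log-normal distribution
# (not the paper's speech data, just an illustrative choice).
amplitudes = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)
log_amplitudes = np.log(amplitudes)

def skewness(x):
    # Near zero for a symmetric (e.g., normal) distribution.
    return float(((x - x.mean()) ** 3).mean() / x.std() ** 3)

print(f"linear scale: mean={amplitudes.mean():.2f}, "
      f"median={np.median(amplitudes):.2f}, skew={skewness(amplitudes):.2f}")
print(f"log scale:    mean={log_amplitudes.mean():.2f}, "
      f"median={np.median(log_amplitudes):.2f}, skew={skewness(log_amplitudes):.2f}")
```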

The STIR researchers' information-theoretic model also fits the empirical psychological data in other ways. One is that it predicts the point at which human sensory discrimination will break down. With sound volume, for instance, experimental subjects can make very fine distinctions within a range of values, "but experimentally, when we get to the edges, there are breakdowns," Sun says.

Similarly, the model does a better job than its predecessors of describing brain plasticity. It provides a framework in which a straightforward application of Bayes' theorem — the cornerstone of much modern statistical analysis — accurately predicts the extent to which predilections hard-wired into the human nervous system can be revised in light of experience.
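
The article doesn't spell out the paper's model, but the general mechanics of such an update, a "hard-wired" prior revised by noisy experience, can be sketched with a standard normal-prior, normal-likelihood application of Bayes' theorem. The prior, noise level, and observations below are invented purely for illustration.

```python
import numpy as np

prior_mean, prior_var = 0.0, 1.0          # "hard-wired" predilection
observations = np.array([2.1, 1.8, 2.4])  # experience
noise_var = 0.5                           # how noisy each observation is

# Conjugate normal-normal update (textbook Bayes' theorem, not the paper's model).
n = len(observations)
posterior_var = 1.0 / (1.0 / prior_var + n / noise_var)
posterior_mean = posterior_var * (prior_mean / prior_var + observations.sum() / noise_var)

print(f"prior:     mean={prior_mean:.2f}, var={prior_var:.2f}")
print(f"posterior: mean={posterior_mean:.2f}, var={posterior_var:.2f}")
# More data, or less noisy data, pulls the posterior further from the prior.
```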

“There’s a whole bunch of different animal species,” says Adam Reeves, a professor of psychology at Northeastern University, “and a whole bunch of different sensory mechanisms, like hearing and vision, and different aspects of all of them, and then taste, and smell, and so on, all of which follow exactly the same law” — a logarithmic relationship between stimulus intensity and perceived intensity. “Biology is very variable, right? So how come all these organisms come up with the same law? And how come the law is so precise? It’s a major philosophical problem, actually.”

In attempting to solve that problem, Reeves says, the MIT researchers offer “a very powerful argument.” Before he’s ready to declare the problem solved, however, Reeves would like to see an examination of the cases in which the law doesn’t hold.

“The real proof would be if you get a different type of statistical distribution of the stimulus information, then you get a different type of a law,” Reeves says. “Night vision, for example, shows some very strong failures of [the] law.”
