Neural network gets an idea of number without counting

By Celeste Biever
AN ARTIFICIAL brain has taught itself to estimate the number of objects in an image without actually counting them, emulating abilities displayed by some animals including lions and fish, as well as humans.
Because the model was not preprogrammed with numerical capabilities, the feat suggests that this skill emerges due to general learning processes rather than number-specific mechanisms. "It answers the question of how numerosity emerges without teaching anything about numbers in the first place," says Marco Zorzi at the University of Padua in Italy, who led the work.
The finding may also help us to understand dyscalculia - where people find it nearly impossible to acquire basic number and arithmetic skills - and enhance robotics and computer vision.
The skill in question is known as approximate number sense. A simple test of ANS involves looking at two groups of dots on a page and intuitively knowing which has more dots, even though you have not counted them. Fish use ANS to pick the larger, and therefore safer, shoal to swim in.
To investigate ANS, Zorzi and colleague Ivilin Stoianov used a computerised neural network that responds to images and generates new "fantasy" ones based on rules that it deduces from the original images. The software models a retina-like layer of neurons that fire in response to the raw pixels, plus two deeper layers that do more sophisticated processing based on signals from layers above.
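To give a flavour of what such a generative network looks like in code, here is a minimal sketch (not the authors' implementation) of a restricted Boltzmann machine trained with one-step contrastive divergence; stacking two such hidden layers on top of a pixel "retina" gives a deep network of the kind described above. The layer sizes, learning rate and use of a single hidden layer here are illustrative assumptions, not details from the study.

```python
# Minimal sketch of a generative "fantasy-producing" network:
# a restricted Boltzmann machine (RBM) trained with contrastive divergence.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    def __init__(self, n_visible, n_hidden, lr=0.05):
        self.W = rng.normal(0, 0.01, (n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible ("retina") biases
        self.b_h = np.zeros(n_hidden)    # hidden-layer biases
        self.lr = lr

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b_v)

    def cd1_update(self, v0):
        """One step of contrastive divergence on a batch of binary images v0."""
        h0 = self.hidden_probs(v0)
        h0_sample = (rng.random(h0.shape) < h0).astype(float)
        v1 = self.visible_probs(h0_sample)          # "fantasy" reconstruction
        h1 = self.hidden_probs(v1)
        # Strengthen connections that explain the data, weaken those that
        # only explain the model's own fantasies.
        self.W += self.lr * (v0.T @ h0 - v1.T @ h1) / len(v0)
        self.b_v += self.lr * (v0 - v1).mean(axis=0)
        self.b_h += self.lr * (h0 - h1).mean(axis=0)
```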
The pair fed the network 51,800 images, each containing up to 32 rectangles of varying sizes. In response to each image, the program strengthened or weakened connections between neurons so that its image generation model was refined by the pattern it had just "seen". Zorzi likens it to "learning how to visualise what it has just experienced".
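Continuing the sketch above (it reuses the RBM class and random generator defined there), the training data can be imitated with randomly placed rectangles. The image resolution, rectangle sizes and number of training steps are assumptions for illustration; only the "up to 32 objects" figure comes from the article, and crucially the count itself is never shown to the network.

```python
def make_image(max_objects=32, side=30):
    """Binary image containing a random number of small rectangles (may overlap)."""
    img = np.zeros((side, side))
    n = rng.integers(1, max_objects + 1)
    for _ in range(n):
        h, w = rng.integers(1, 4, size=2)            # rectangle height, width
        r, c = rng.integers(0, side - 3, size=2)     # top-left corner
        img[r:r + h, c:c + w] = 1.0
    return img.ravel(), n

# Unsupervised training: the network only ever sees pixels, never the count n.
rbm = RBM(n_visible=30 * 30, n_hidden=400)
for step in range(1000):                             # illustrative, not 51,800
    batch = np.stack([make_image()[0] for _ in range(20)])
    rbm.cd1_update(batch)
```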
Infants demonstrate ANS without being taught, so the network was not preprogrammed with the concept of "amount". But when Zorzi and Stoianov looked at the network's behaviour, they discovered a subset of neurons in the deepest layer that fired more often as the number of objects in the image decreased. This suggested that the network had learned to estimate the number of objects in each image as part of its rules for generating images. This behaviour was independent of the total surface area of the objects, emphasising that the neurons were detecting number.
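A rough sketch of how one might hunt for such "number detectors": record each hidden unit's response to many images, then ask whether it tracks the object count but not the total occupied area. The correlation test and thresholds below are simplifications I have assumed, not the published analysis, and the sketch probes the single hidden layer of the RBM above rather than the deepest layer of a two-layer stack.

```python
images, counts, areas = [], [], []
for _ in range(500):
    v, n = make_image()
    images.append(v)
    counts.append(n)
    areas.append(v.sum())                            # total "surface area" in pixels
acts = rbm.hidden_probs(np.stack(images))            # hidden-layer responses
counts, areas = np.array(counts), np.array(areas)

for unit in range(acts.shape[1]):
    if acts[:, unit].std() < 1e-6:                   # skip silent or saturated units
        continue
    r_num = np.corrcoef(acts[:, unit], np.log(counts))[0, 1]
    r_area = np.corrcoef(acts[:, unit], areas)[0, 1]
    if abs(r_num) > 0.5 and abs(r_area) < 0.1:       # thresholds are arbitrary
        print(f"unit {unit}: tracks number (r={r_num:+.2f}) but not area")
```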
What's more, these firing patterns followed the trend shown by neurons inside the parietal cortex of monkeys. This region is involved in knowledge of numbers, suggesting that the model might reflect how real brains work.
To see if these patterns could give rise to ANS, the pair created a second program and fed it the firing patterns of the number-detecting neurons in the first program. They also fed it information on whether the number of objects associated with each firing pattern was bigger or smaller than a reference number. Trained in this way, the model could estimate whether a fresh image contained more or fewer than a given number of objects (Nature Neuroscience, DOI: 10.1038/nn.2996).
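The second stage can be sketched as a simple linear read-out trained on the deep-layer responses to judge "more or fewer than a reference number". The fixed reference of 16 and the use of plain logistic regression are my assumptions for illustration, not the authors' exact choices; the sketch again reuses the RBM and make_image defined above.

```python
REF = 16                                             # assumed reference number
X, y = [], []
for _ in range(2000):
    v, n = make_image()
    X.append(rbm.hidden_probs(v[None, :])[0])        # deep-layer firing pattern
    y.append(1.0 if n > REF else 0.0)                # "more" vs "fewer" label
X, y = np.stack(X), np.array(y)

# Train a logistic-regression read-out by batch gradient descent.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(200):
    p = sigmoid(X @ w + b)
    w -= 0.1 * X.T @ (p - y) / len(y)
    b -= 0.1 * (p - y).mean()

# Judge a fresh image the read-out has never seen.
v_new, n_new = make_image()
more = sigmoid(rbm.hidden_probs(v_new[None, :])[0] @ w + b) > 0.5
print(f"actual count {n_new}: model says {'more' if more else 'fewer'} than {REF}")
```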
Brian Butterworth, who studies mathematical cognition at University College London, says the work breaks new ground. "It gives an explanation for how we estimate number when we can't count."