
A prerequisite for intelligent behavior is to understand how stimuli are related and to generalize this knowledge across contexts. Generalization can be challenging when relational patterns are shared across contexts but exist on different physical scales. Here, we studied neural representations in humans and recurrent neural networks performing a magnitude comparison task, for which it was advantageous to generalize concepts of "more" or "less" between contexts. Using multivariate analysis of human brain signals and of neural network hidden unit activity, we observed that both systems developed parallel neural "number lines" for each context. In both model systems, these number state spaces were aligned in a way that explicitly facilitated generalization of relational concepts (more and less). These findings suggest a previously overlooked role for neural normalization in supporting transfer of a simple form of abstract relational knowledge (magnitude) in humans and machine learning systems.
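The core idea in the abstract — parallel, aligned "number lines" whose shared geometry lets a relational readout transfer between contexts — can be illustrated with a minimal simulation. This is a hedged sketch, not the authors' analysis pipeline: the representations, the divisive normalization step, and the least-squares readout are all illustrative assumptions, chosen only to show how normalization-aligned state spaces let a decoder trained in one context rank magnitudes in another.

```python
import numpy as np

rng = np.random.default_rng(0)

def context_reps(magnitudes, direction, offset, noise=0.05, n_trials=50):
    # Each number is a point along a shared "number line" direction,
    # plus a context-specific offset (parallel lines) and Gaussian noise.
    reps, labels = [], []
    for m in magnitudes:
        pts = m * direction + offset + noise * rng.standard_normal((n_trials, len(direction)))
        reps.append(pts)
        labels.append(np.full(n_trials, m))
    return np.vstack(reps), np.concatenate(labels)

d = 20  # hypothetical dimensionality of the neural state space
direction = rng.standard_normal(d)
direction /= np.linalg.norm(direction)
offset_a = rng.standard_normal(d)
offset_b = rng.standard_normal(d)

# Context A uses small magnitudes, context B large ones; dividing by each
# context's maximum (a toy stand-in for neural normalization) rescales both
# to a common range, aligning the two number lines.
mags_a = np.array([1, 2, 3, 4, 5, 6])
mags_b = np.array([10, 20, 30, 40, 50, 60])
Xa, ya = context_reps(mags_a / mags_a.max(), direction, offset_a)
Xb, yb = context_reps(mags_b / mags_b.max(), direction, offset_b)

# Train a least-squares magnitude readout in context A ...
A = np.column_stack([Xa, np.ones(len(Xa))])
w, *_ = np.linalg.lstsq(A, ya, rcond=None)

# ... and test whether it rank-orders the magnitudes of context B,
# i.e. whether "more" vs. "less" generalizes across contexts.
pred_b = np.column_stack([Xb, np.ones(len(Xb))]) @ w
mean_pred = [pred_b[yb == m].mean() for m in np.unique(yb)]
print(bool(np.all(np.diff(mean_pred) > 0)))  # predictions increase with magnitude
```

Because the magnitude direction is shared between contexts after normalization, the readout's predictions preserve rank order in context B even though it was fit only on context A; without the normalization step (using the raw magnitudes), absolute predictions would be badly miscalibrated across contexts.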

Original publication

DOI: 10.1016/j.neuron.2021.02.004
Type: Journal article
Journal: Neuron
Publication Date: 07/04/2021
Volume: 109
Pages: 1214 - 1226.e8
Keywords: alignment, generalization, magnitude, neural network, normalization, number, parietal cortex, representation, Adult, Algorithms, Brain, Electroencephalography, Female, Generalization, Psychological, Humans, Machine Learning, Male, Models, Neurological, Neural Networks, Computer, Psychomotor Performance, Size Perception, Transfer, Psychology, Young Adult