A prerequisite for intelligent behavior is to understand how stimuli are related and to generalize this knowledge across contexts. Generalization can be challenging when relational patterns are shared across contexts but exist on different physical scales. Here, we studied neural representations in humans and recurrent neural networks performing a magnitude comparison task, for which it was advantageous to generalize concepts of "more" or "less" between contexts. Using multivariate analysis of human brain signals and of neural network hidden unit activity, we observed that both systems developed parallel neural "number lines" for each context. In both model systems, these number state spaces were aligned in a way that explicitly facilitated generalization of relational concepts (more and less). These findings suggest a previously overlooked role for neural normalization in supporting transfer of a simple form of abstract relational knowledge (magnitude) in humans and machine learning systems.
Type: Journal article
Journal: Neuron
Published: 7 April 2021
Volume: 109
Pages: 1214-1226.e8
Keywords: alignment, generalization, magnitude, neural network, normalization, number, parietal cortex, representation
MeSH terms: Adult; Algorithms; Brain; Electroencephalography; Female; Generalization, Psychological; Humans; Machine Learning; Male; Models, Neurological; Neural Networks, Computer; Psychomotor Performance; Size Perception; Transfer, Psychology; Young Adult
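The abstract's central claim, that within-context normalization aligns magnitude representations so that relational concepts ("more" or "less") transfer across physical scales, can be illustrated with a minimal sketch. The toy task below is an assumption for illustration only, not the authors' paradigm or code: a linear readout of "more vs. less than the context average" is trained on magnitudes from one context, then tested on a second context whose magnitudes lie on a 100-fold larger scale. All function names, magnitude ranges, and the range-normalization step are hypothetical choices.

```python
# Minimal sketch (illustrative assumptions, not the authors' implementation):
# a linear "more vs. less" readout trained in one magnitude context transfers
# to a context on a different physical scale only after within-context
# normalization maps both contexts onto a common number line.
import numpy as np

rng = np.random.default_rng(0)

def sample_context(lo, hi, n=2000):
    """Magnitudes from one context, labeled 1 ('more') if above the context mean."""
    x = rng.uniform(lo, hi, n)
    y = (x > (lo + hi) / 2).astype(float)
    return x, y

def fit_linear(x, y):
    """Least-squares linear readout y ~ w*x + b."""
    X = np.stack([x, np.ones_like(x)], axis=1)
    return np.linalg.lstsq(X, y, rcond=None)[0]

def accuracy(w, x, y):
    X = np.stack([x, np.ones_like(x)], axis=1)
    return np.mean((X @ w > 0.5) == y)

def normalize(x, lo, hi):
    """Range normalization: map each context's magnitudes onto [0, 1]."""
    return (x - lo) / (hi - lo)

# Two contexts sharing the relational structure "more/less" on different scales.
xa, ya = sample_context(1.0, 10.0)       # context A: small magnitudes
xb, yb = sample_context(100.0, 1000.0)   # context B: large magnitudes

# Raw magnitudes: every context-B value exceeds the context-A decision
# boundary, so the transferred readout labels everything "more" (~chance).
w_raw = fit_linear(xa, ya)
print("A -> B transfer, raw:       ", accuracy(w_raw, xb, yb))

# Normalized magnitudes: the decision boundary falls at 0.5 on the shared
# number line in both contexts, so the readout transfers (~perfect).
w_norm = fit_linear(normalize(xa, 1.0, 10.0), ya)
print("A -> B transfer, normalized:", accuracy(w_norm, normalize(xb, 100.0, 1000.0), yb))
```

Under these assumptions, the raw readout transfers at roughly chance level while the normalized readout transfers near-perfectly, mirroring in miniature the abstract's point that normalization aligns per-context number lines into a geometry that supports generalization.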