How the brain combines information from different sensory modalities and of differing reliability is an important and still-unanswered question. Using the head direction (HD) system as a model, we explored the resolution of conflicts between landmarks and background cues. Sensory cue integration models predict averaging of the two cues, whereas attractor models predict capture of the signal by the dominant cue. We found that a visual landmark mostly captured the HD signal at low conflicts; however, the cells showed an increasing propensity to integrate the cues as the conflict grew. A large conflict presented to naive rats resulted in greater visual cue capture (less integration) than in experienced rats, revealing an effect of experience. We propose that weighted cue integration in HD cells arises from dynamic plasticity of the feed-forward inputs to the network, causing within-trial spatial redistribution of the visual inputs onto the ring. This suggests that an attractor network can implement decision processes about cue reliability using simple architecture and learning rules, thus providing a potential neural substrate for weighted cue integration.
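The ring-attractor account sketched in the abstract, in which feed-forward drives from two conflicting cues combine on a ring of HD cells and the settled activity bump determines the decoded heading, can be illustrated with a toy simulation. This is a minimal sketch, not the authors' model: the network size, kernel widths, and cue strengths below are arbitrary assumptions chosen for illustration.

```python
import numpy as np

# Toy ring-attractor sketch of HD cue combination (illustrative only;
# all parameters are arbitrary assumptions, not the paper's model).
N = 100
theta = np.linspace(0, 2 * np.pi, N, endpoint=False)  # preferred directions

def cue_input(direction, strength, width=0.5):
    """Bump-shaped feed-forward drive centred on a cue direction."""
    return strength * np.exp((np.cos(theta - direction) - 1.0) / width**2)

def decode(rates):
    """Population-vector estimate of the direction encoded on the ring."""
    return np.angle(np.sum(rates * np.exp(1j * theta)))

# Recurrent weights: local excitation minus uniform inhibition.
W = np.exp((np.cos(theta[:, None] - theta[None, :]) - 1.0) / 0.2) - 0.05

def settle(drive, steps=500, dt=0.1):
    """Relax rate dynamics  dr/dt = -r + [W r / N + drive]_+  to steady state."""
    r = np.maximum(drive, 0.0)
    for _ in range(steps):
        r += dt * (-r + np.maximum(W @ r / N + drive, 0.0))
    return r

# A dominant landmark cue in conflict with a weaker background cue:
landmark, background = 0.5, 0.0  # radians; conflict size = 0.5 rad
rates = settle(cue_input(landmark, 1.0) + cue_input(background, 0.6))
print(f"decoded heading: {decode(rates):.2f} rad")
```

Varying the conflict size and the relative cue strengths lets one probe the two regimes the abstract contrasts: overlapping inputs settle into a single bump between the cues (integration), while with these weak-coupling parameters the decoded estimate simply leans toward the stronger cue; stronger recurrent excitation would push the network toward winner-take-all capture.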

Original publication

DOI: 10.1098/rstb.2012.0512
Type: Journal article
Journal: Philos Trans R Soc Lond B Biol Sci
Publication Date: 05/02/2014
Volume: 369
Keywords: attractor dynamics, head direction cells, path integration, sensory cue integration, vestibular system, vision, Animals, Brain, Cues, Evoked Potentials, Histocytochemistry, Male, Models, Neurological, Motion Perception, Neurons, Rats, Spatial Behavior, Video Recording