
© Oxford University Press 2004. All rights reserved.

Many organisms possess multiple sensory systems, such as vision, hearing, touch, smell, and taste. The possession of such multiple ways of sensing the world offers many benefits. These benefits arise not only because each modality can sense different aspects of the environment, but also because different senses can respond jointly to the same external object or event, thus enriching the overall experience - for example, looking at an individual while listening to them speak. However, combining information from different senses also poses many challenges for the nervous system. In recent years there has been dramatic progress in understanding how information from different sensory modalities gets integrated in order to construct useful representations of external space, and in how such multimodal representations constrain spatial attention. Such progress has involved numerous disciplines, including neurophysiology, experimental psychology, neurological work with brain-damaged patients, neuroimaging studies, and computational modelling. This volume brings together the leading researchers from all these approaches to present an integrative overview of this central topic in cognitive neuroscience.

Original publication

DOI: 10.1093/acprof:oso/9780198524861.001.0001
Type: Book
Publication Date: 22/03/2012
Pages: 1 - 340