Stimuli that elicit a prepotent but incorrect response are typically associated with an enhanced electrophysiological N2 that is thought to index the operation of a control process such as inhibition or conflict detection. However, recent studies reporting the absence of the N2 modulation in go/no-go tasks involving auditory stimuli challenge this view: It is not clear why inhibition or conflict detection should be sensitive to the modality of the stimulus. Here we present electrophysiological data from a go/no-go task suggesting that the relative size of the N2 modulation in visual and auditory tasks depends on the perceptual overlap between the go and no-go stimuli. Stimuli that looked similar but sounded different were associated with a typical visual N2 modulation and the absence of an auditory N2 modulation, consistent with previous findings. However, when we increased the perceptual overlap between the auditory stimuli, a large no-go N2 was observed. These findings are discussed in terms of existing hypotheses of the N2, and clarify why previous studies have not found an N2 modulation in auditory go/no-go tasks.

Original publication

DOI: 10.1046/j.1469-8986.2003.00128.x
Type: Journal article
Journal: Psychophysiology
Publication Date: 01/2004
Volume: 41
Pages: 157-160
Keywords: Adult, Brain Mapping, Cerebral Cortex, Conflict (Psychology), Discrimination Learning, Electroencephalography, Evoked Potentials, Female, Humans, Inhibition (Psychology), Male, Pattern Recognition, Visual, Psychomotor Performance, Reaction Time, Signal Processing, Computer-Assisted, Speech Perception