The technique of 306-channel magnetoencephalography (MEG) was used in eight healthy volunteers to test whether silent lip-reading modulates auditory-cortex processing of phonetic sounds. Auditory test stimuli (either the Finnish vowel /ae/ or /ø/) were preceded, at a 500 ms lag, by either another auditory stimulus (/ae/, /ø/ or the second-formant midpoint between /ae/ and /ø/) or a silent movie of a person articulating /ae/ or /ø/. Compared with N1 responses to auditory /ae/ and /ø/ presented without a preceding stimulus, the amplitudes of left-hemisphere N1 responses to the test stimuli were significantly suppressed when preceded by either auditory or visual stimuli, the suppression being significantly stronger for preceding auditory stimuli. This suggests that seeing a speaker's articulatory gestures influences auditory speech perception by modulating the responsiveness of auditory-cortex neurons.

Type: Journal article

Journal: Neuroreport

Publication Date: 22/12/2004

Volume: 15

Pages: 2741-2744

Keywords: Acoustic Stimulation, Adaptation, Physiological, Adult, Analysis of Variance, Auditory Cortex, Brain Mapping, Electroencephalography, Female, Functional Laterality, Humans, Lipreading, Magnetoencephalography, Male, Phonetics, Photic Stimulation, Speech Acoustics, Speech Perception