Audiovisual synchrony perception for music, speech, and object actions.
Vatakis A., Spence C.
We investigated the perception of synchrony for complex audiovisual events. In Experiment 1, a series of music (guitar and piano), speech (sentences), and object action video clips was presented at a range of stimulus onset asynchronies (SOAs) using the method of constant stimuli. Participants made unspeeded temporal order judgments (TOJs) regarding which stream (auditory or visual) appeared to have been presented first. Temporal discrimination accuracy was significantly better for the object action video clips than for the speech video clips, and both were significantly better than for the music video clips. To investigate whether these differences in TOJ performance were driven by differences in stimulus familiarity, we conducted a second experiment using brief speech (syllables), music (guitar), and object action video clips of fixed duration, together with temporally reversed (i.e., less familiar) versions of the same stimuli. The results showed no main effect of stimulus type on temporal discrimination accuracy. Interestingly, however, reversing the video clips resulted in a significant decrement in temporal discrimination accuracy, relative to the normally presented versions, for the music and object action clips, but not for the speech stimuli. Overall, our results suggest that cross-modal temporal discrimination performance is better for audiovisual stimuli of lower complexity than for stimuli with continuously varying properties (e.g., syllables versus words and/or sentences).
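The abstract does not spell out how TOJ responses collected with the method of constant stimuli are converted into a measure of temporal discrimination accuracy. A conventional analysis, sketched below under assumptions not stated in the abstract, fits the proportion of "vision first" responses at each SOA with a cumulative Gaussian psychometric function: its mean gives the point of subjective simultaneity (PSS) and its spread yields the just noticeable difference (JND), the usual index of discrimination accuracy. The SOA values and response proportions in the sketch are hypothetical, for illustration only.

```python
# Minimal sketch of a standard TOJ analysis; data values are hypothetical,
# not taken from Vatakis & Spence.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Negative SOA: auditory stream leads; positive SOA: visual stream leads (ms).
soas = np.array([-300, -200, -100, -50, 0, 50, 100, 200, 300], dtype=float)
# Hypothetical proportion of "vision presented first" responses at each SOA.
p_vision_first = np.array([0.03, 0.08, 0.22, 0.38, 0.55, 0.70, 0.84, 0.95, 0.98])

def psychometric(soa, pss, sigma):
    """Cumulative Gaussian: P("vision first") as a function of SOA (ms)."""
    return norm.cdf(soa, loc=pss, scale=sigma)

# Fit the PSS (mean) and sigma (spread) of the psychometric function.
(pss, sigma), _ = curve_fit(psychometric, soas, p_vision_first, p0=[0.0, 100.0])

# JND: half the SOA interval between the 25% and 75% points of the fitted curve,
# i.e. 0.6745 * sigma for a cumulative Gaussian.
jnd = 0.6745 * sigma

print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")
# A smaller JND corresponds to better cross-modal temporal discrimination accuracy.
```

Under this analysis, the comparisons reported in the abstract (e.g., better discrimination for object actions than for music) would correspond to differences in the fitted JNDs across stimulus types.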