Multisensory integration of speech signals: the relationship between space and time.

Exp Brain Res

Centre for Cognitive Neuroscience, Wilfrid Laurier University, Waterloo, ON, Canada.

Published: October 2006

Integrating audiovisual cues for simple events is affected when sources are separated in space and time. By contrast, audiovisual perception of speech appears resilient when either spatial or temporal disparities exist. We investigated whether speech perception is sensitive to the combination of spatial and temporal inconsistencies. Participants heard the bisyllable /aba/ while seeing a face produce the incongruent bisyllable /ava/. We tested the level of visual influence over auditory perception when the sound was asynchronous with respect to facial motion (from -360 to +360 ms) and emanated from five locations equidistant from the participant. Although an interaction was observed, it was not related to participants' perception of synchrony, nor did it indicate a linear relationship between the effects of spatial and temporal discrepancies. We conclude that either the complexity of the signal or the nature of the task reduces reliance on spatial and temporal contiguity for audiovisual speech perception.

DOI: http://dx.doi.org/10.1007/s00221-006-0634-0

