Speech perception is influenced by vision through a process of audiovisual integration. This is demonstrated by the McGurk illusion, where visual speech (for example /ga/) dubbed with incongruent auditory speech (such as /ba/) leads to a modified auditory percept (/da/). Recent studies have indicated that perception of the incongruent speech stimuli used in McGurk paradigms involves mechanisms of both general and audiovisual-speech-specific mismatch processing, and that general mismatch processing modulates induced theta-band (4-8 Hz) oscillations. Here, we investigated whether the theta modulation merely reflects mismatch processing or, alternatively, audiovisual integration of speech. We used electroencephalographic recordings from two previously published studies using audiovisual sine-wave speech (SWS), a spectrally degraded speech signal that sounds nonsensical to naïve perceivers but is perceived as speech by informed subjects. Earlier studies have shown that informed, but not naïve, subjects integrate SWS phonetically with visual speech. In an N1/P2 event-related potential paradigm, we found a significant difference in theta-band activity between informed and naïve perceivers of audiovisual speech, suggesting that audiovisual integration modulates induced theta-band oscillations. In a McGurk mismatch negativity (MMN) paradigm, where infrequent McGurk stimuli were embedded in a sequence of frequent audiovisually congruent stimuli, we found no difference between congruent and McGurk stimuli. The infrequent stimuli in this paradigm violate both the general prediction of stimulus content and that of audiovisual congruence. Hence, we found no support for the hypothesis that audiovisual mismatch modulates induced theta-band oscillations. We also did not find any effects of audiovisual integration in the MMN paradigm, possibly due to the experimental design.
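The abstract's key measure, induced theta-band (4-8 Hz) oscillations, refers to the non-phase-locked part of the EEG response: the trial-averaged evoked response (ERP) is removed before estimating band power. The sketch below is a minimal illustration of that standard approach using numpy/scipy; the function name, epoch layout, and filter settings are assumptions for illustration, not the authors' actual pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def induced_theta_power(epochs, fs, band=(4.0, 8.0)):
    """Estimate induced (non-phase-locked) theta power.

    epochs : array of shape (n_trials, n_samples), single-channel EEG.
    fs     : sampling rate in Hz.

    Subtracting the trial-averaged ERP removes the evoked
    (phase-locked) component; band-pass filtering and the Hilbert
    envelope then yield induced theta power over time.
    """
    # Remove the evoked response (the ERP) from every trial.
    residual = epochs - epochs.mean(axis=0, keepdims=True)
    # Zero-phase band-pass filter in the theta band.
    nyq = fs / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    filtered = filtfilt(b, a, residual, axis=1)
    # Instantaneous amplitude via the analytic signal.
    envelope = np.abs(hilbert(filtered, axis=1))
    # Mean induced power across trials, one value per time point.
    return (envelope ** 2).mean(axis=0)
```

In a design like the one described here, this per-condition power time course (informed vs. naïve perceivers, congruent vs. McGurk stimuli) would then be compared statistically.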

Source
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6634411 (PMC)
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0219744 (PLOS)

Similar Publications

Multimodal MRI analysis of microstructural and functional connectivity brain changes following systematic audio-visual training in a virtual environment.

Neuroimage

December 2024

Institute of Population Health, University of Liverpool, United Kingdom; Hanse Wissenschaftskolleg, Delmenhorst, Germany.

Recent work has shown rapid microstructural brain changes in response to learning new tasks. These cognitive tasks tend to draw on multiple brain regions connected by white matter (WM) tracts. Therefore, behavioural performance change is likely to be the result of microstructural, functional activation, and connectivity changes in extended neural networks.

The power of sound: Exploring the auditory influence on visual search efficiency.

Cognition

December 2024

School of Psychology, Liaoning Collaborative Innovation Center of Children and Adolescents Healthy Personality Assessment and Cultivation, Liaoning Normal University, Dalian 116029, China; School of Foreign Languages, Ningbo University of Technology, Ningbo 315211, China.

In a dynamic visual search environment, a synchronous and meaningless auditory signal (pip) that corresponds with a change in a visual target promotes the efficiency of visual search (pop out), which is known as the pip-and-pop effect. We conducted three experiments to investigate the mechanism of the pip-and-pop effect. Using the eye movement technique, we manipulated the interval rhythm (Exp.

Neural correlates of audiovisual integration in schizophrenia - an ERP study.

Front Psychiatry

December 2024

Department of Psychiatry, Social Psychiatry and Psychotherapy, Hannover Medical School, Hannover, Germany.

Introduction: Multisensory integration (MSI) enhances perception by combining information from different sensory modalities. In schizophrenia, individuals often exhibit impaired audiovisual processing, resulting in broader temporal binding windows (TBWs) which appear to be associated with symptom severity. Since the underlying mechanisms of these aberrations are not yet fully understood, the present study aims to investigate multisensory processing in schizophrenia in more detail.

Exogenous spatial attention attenuates audiovisual integration (AVI). Previous studies on the effects of exogenous spatial attention on AVI have focused on the inhibition of return (IOR) effect induced by visual cues and the facilitation effect induced by auditory cues, but the differences between the effects of exogenous spatial attention (induced by visual and auditory cues) on AVI remain unclear. The present study used the exogenous spatial cue-target paradigm and manipulated cue stimulus modality (visual cue, auditory cue) in two experiments (Experiment 1: facilitation effect; Experiment 2: IOR effect) to examine the effects of exogenous spatial attention (evoked by cues in different modalities) on AVI.

Objectives: Speech intelligibility is supported by the sound of a talker's voice and visual cues related to articulatory movements. The relative contribution of auditory and visual cues to an integrated audiovisual percept varies depending on a listener's environment and sensory acuity. Cochlear implant users rely more on visual cues than those with acoustic hearing to help compensate for the fact that the auditory signal produced by their implant is poorly resolved relative to that of the typically developed cochlea.
