Speech perception is influenced by vision through a process of audiovisual integration. This is demonstrated by the McGurk illusion, where visual speech (e.g., /ga/) dubbed with incongruent auditory speech (e.g., /ba/) leads to a modified auditory percept (/da/). Recent studies have indicated that perception of the incongruent speech stimuli used in McGurk paradigms involves mechanisms of both general and audiovisual-speech-specific mismatch processing, and that general mismatch processing modulates induced theta-band (4-8 Hz) oscillations. Here, we investigated whether this theta modulation merely reflects mismatch processing or, alternatively, audiovisual integration of speech. We used electroencephalographic recordings from two previously published studies using audiovisual sine-wave speech (SWS), a spectrally degraded speech signal that sounds nonsensical to naïve perceivers but is perceived as speech by informed subjects. Earlier studies have shown that informed, but not naïve, subjects integrate SWS phonetically with visual speech. In an N1/P2 event-related potential paradigm, we found a significant difference in theta-band activity between informed and naïve perceivers of audiovisual speech, suggesting that audiovisual integration modulates induced theta-band oscillations. In a McGurk mismatch negativity (MMN) paradigm, where infrequent McGurk stimuli were embedded in a sequence of frequent audiovisually congruent stimuli, we found no difference between congruent and McGurk stimuli. The infrequent stimuli in this paradigm violate both the general prediction of stimulus content and that of audiovisual congruence. Hence, we found no support for the hypothesis that audiovisual mismatch modulates induced theta-band oscillations. We also found no effects of audiovisual integration in the MMN paradigm, possibly due to the experimental design.
Download full-text PDF | Source
---|---
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6634411 | PMC
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0219744 | PLOS
Neuroimage
December 2024
Institute of Population Health, University of Liverpool, United Kingdom; Hanse Wissenschaftskolleg, Delmenhorst, Germany.
Recent work has shown rapid microstructural brain changes in response to learning new tasks. These cognitive tasks tend to draw on multiple brain regions connected by white matter (WM) tracts. Therefore, behavioural performance change is likely to be the result of microstructural, functional activation, and connectivity changes in extended neural networks.
View Article and Find Full Text PDF
Cognition
December 2024
School of Psychology, Liaoning Collaborative Innovation Center of Children and Adolescents Healthy Personality Assessment and Cultivation, Liaoning Normal University, Dalian 116029, China; School of Foreign Languages, Ningbo University of Technology, Ningbo 315211, China.
In a dynamic visual search environment, a synchronous and meaningless auditory signal (pip) that corresponds with a change in a visual target promotes the efficiency of visual search (pop out), which is known as the pip-and-pop effect. We conducted three experiments to investigate the mechanism of the pip-and-pop effect. Using the eye movement technique, we manipulated the interval rhythm (Exp.
View Article and Find Full Text PDF
Front Psychiatry
December 2024
Department of Psychiatry, Social Psychiatry and Psychotherapy, Hannover Medical School, Hannover, Germany.
Introduction: Multisensory integration (MSI) enhances perception by combining information from different sensory modalities. In schizophrenia, individuals often exhibit impaired audiovisual processing, resulting in broader temporal binding windows (TBWs), which appear to be associated with symptom severity. Since the underlying mechanisms of these aberrations are not yet fully understood, the present study aims to investigate multisensory processing in schizophrenia in more detail.
View Article and Find Full Text PDF
Exp Brain Res
December 2024
Department of Psychology, Northeast Normal University, Changchun, People's Republic of China.
Exogenous spatial attention attenuates audiovisual integration (AVI). Previous studies on the effects of exogenous spatial attention on AVI have focused on the inhibition of return (IOR) effect induced by visual cues and the facilitation effect induced by auditory cues, but the differences between the effects of exogenous spatial attention (induced by visual and auditory cues) on AVI remain unclear. The present study used the exogenous spatial cue-target paradigm and manipulated cue stimulus modality (visual cue, auditory cue) in two experiments (Experiment 1: facilitation effect; Experiment 2: IOR effect) to examine the effects of exogenous spatial attention (evoked by cues in different modalities) on AVI.
View Article and Find Full Text PDF
Ear Hear
December 2024
Department of Psychology, University of Western Ontario, London, Ontario, Canada.
Objectives: Speech intelligibility is supported by the sound of a talker's voice and visual cues related to articulatory movements. The relative contribution of auditory and visual cues to an integrated audiovisual percept varies depending on a listener's environment and sensory acuity. Cochlear implant users rely more on visual cues than those with acoustic hearing to help compensate for the fact that the auditory signal produced by their implant is poorly resolved relative to that of the typically developed cochlea.
View Article and Find Full Text PDF