Natural conversation is multisensory: when we can see the speaker's face, visual speech cues improve our comprehension. The neuronal mechanisms underlying this phenomenon remain unclear. The two main alternatives are visually mediated phase modulation of neuronal oscillations (excitability fluctuations) in auditory neurons and visual input-evoked responses in auditory neurons. Investigating this question using naturalistic audiovisual speech with intracranial recordings in humans of both sexes, we find evidence for both mechanisms. Remarkably, auditory cortical neurons track the temporal dynamics of purely visual speech using the phase of their slow oscillations and phase-related modulations in broadband high-frequency activity. Consistent with known perceptual enhancement effects, the visual phase reset amplifies the cortical representation of concomitant auditory speech. In contrast to this, and in line with earlier reports, visual input reduces the amplitude of evoked responses to concomitant auditory input. We interpret the combination of improved phase tracking and reduced response amplitude as evidence for more efficient and reliable stimulus processing in the presence of congruent auditory and visual speech inputs.

Significance Statement

Watching the speaker can facilitate our understanding of what is being said. The mechanisms responsible for this influence of visual cues on the processing of speech remain incompletely understood. We studied these mechanisms by recording the electrical activity of the human brain through electrodes implanted surgically inside the brain. We found that visual inputs can operate by directly activating auditory cortical areas, and also indirectly by modulating the strength of cortical responses to auditory input. Our results help to understand the mechanisms by which the brain merges auditory and visual speech into a unitary perception.
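To make the phase-tracking idea concrete, the sketch below shows one common way such tracking can be quantified: extract the phase of a slow cortical band with a band-pass filter and Hilbert transform, then measure its phase locking to a visual speech signal such as lip aperture. This is an illustrative example only, not the authors' analysis pipeline; the 1-8 Hz band, the sampling rate, and the synthetic data are assumptions.

```python
# Illustrative sketch (not the published pipeline): quantify how consistently the
# phase of a slow cortical oscillation follows a visual speech signal, using
# band-pass filtering, the Hilbert transform, and a phase-locking value (PLV).
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 1000                          # sampling rate in Hz (assumed)
n_trials, n_samples = 50, 3 * fs   # 50 trials of 3 s each (assumed)

rng = np.random.default_rng(0)
t = np.arange(n_samples) / fs

# Synthetic "visual speech" drive at ~4 Hz, roughly the mouth-opening/syllable rate
visual_drive = np.sin(2 * np.pi * 4 * t)

# Synthetic cortical signals: each trial is partially phase-locked to the visual drive
lfp = np.array([0.6 * visual_drive + rng.standard_normal(n_samples)
                for _ in range(n_trials)])

# Band-pass the cortical signal in a slow (1-8 Hz) band and extract its instantaneous phase
sos = butter(3, [1, 8], btype="bandpass", fs=fs, output="sos")
phase_lfp = np.angle(hilbert(sosfiltfilt(sos, lfp, axis=1), axis=1))
phase_vis = np.angle(hilbert(visual_drive))

# Phase-locking value between cortical phase and visual-speech phase, per time point:
# values near 1 indicate consistent tracking; values near 0 indicate no relation.
plv = np.abs(np.mean(np.exp(1j * (phase_lfp - phase_vis)), axis=0))
print(f"mean phase-locking to visual speech: {plv.mean():.2f}")
```

With the partially phase-locked synthetic trials above, the printed value is well above what pure noise would give; for unrelated activity it approaches zero, which is how a measure of this kind separates genuine phase tracking from chance alignment.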

Source
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7605423 (PMC)
http://dx.doi.org/10.1523/JNEUROSCI.0555-20.2020 (DOI)

Publication Analysis

Top Keywords

visual speech (20)
visual (10)
auditory (10)
phase reset (8)
evoked responses (8)
influence visual (8)
speech (8)
auditory neurons (8)
responses auditory (8)
auditory cortical (8)

Similar Publications

Objective: To evaluate the effectiveness of complex rehabilitation measures using the drug Cortexin in children with neuropsychiatric pathology during a one-year follow-up.

Material And Methods: A prospective dynamic examination and treatment of 323 children with neuropsychiatric pathology from the age of 7 days to 1 year, age 3.2±1.

Background: The regulatory role of the apolipoprotein E (APOE) ε4 allele in the clinical manifestations of spinocerebellar ataxia type 3 (SCA3) remains unclear. This study aimed to evaluate the impact of the APOE ε4 allele on cognitive and motor functions in SCA3 patients.

Methods: This study included 281 unrelated SCA3 patients and 182 controls.

A non-local dual-stream fusion network for laryngoscope recognition.

Am J Otolaryngol

December 2024

Department of Otorhinolaryngology Head and Neck Surgery, Tianjin First Central Hospital, Tianjin 300192, China; Institute of Otolaryngology of Tianjin, Tianjin, China; Key Laboratory of Auditory Speech and Balance Medicine, Tianjin, China; Key Clinical Discipline of Tianjin (Otolaryngology), Tianjin, China; Otolaryngology Clinical Quality Control Centre, Tianjin, China.

Purpose: To use deep learning technology to design and implement a model that can automatically classify laryngoscope images and assist doctors in diagnosing laryngeal diseases.

Materials And Methods: The experiment was based on 3057 images (normal, glottic cancer, granuloma, Reinke's Edema, vocal cord cyst, leukoplakia, nodules and polyps) from the dataset Laryngoscope8. A classification model based on deep neural networks was developed and tested.

Cochlear implantation is a well-established method for restoring hearing sensation in individuals with severe to profound hearing loss. It significantly improves verbal communication for many users, despite substantial variability in patients' reports and performance on speech perception tests and quality-of-life outcome measures. Such variability in outcome measures remains several years after implantation and could reflect difficulties in attentional regulation.

Familial hemiplegic migraine type 2 results from pathogenic variants in the ATP1A2 gene, which encodes a catalytic subunit of the sodium/potassium ATPase. This extremely rare autosomal dominant disorder manifests with a spectrum of symptoms, most commonly a pure hemiplegic phenotype, epilepsy, and/or intellectual disability. In this study, we detail the clinical features and genetic analysis of nine patients from a large family spanning four generations, all carrying a previously unreported likely pathogenic variant, p.
