There is substantial variability in speech recognition ability across patients with cochlear implants (CIs), auditory brainstem implants (ABIs), and auditory midbrain implants (AMIs). To better understand how this variability is related to central processing differences, the current electroencephalography (EEG) study compared hearing abilities and auditory-cortex activation in patients with electrical stimulation at different sites of the auditory pathway. Three different groups of patients with auditory implants (Hannover Medical School; ABI: n = 6; CI: n = 6; AMI: n = 2) performed a speeded response task and a speech recognition test with auditory, visual, and audio-visual stimuli. Behavioral performance and cortical processing of auditory and audio-visual stimuli were compared between groups. ABI and AMI patients showed prolonged response times to auditory and audio-visual stimuli compared with normal-hearing (NH) listeners and CI patients. This was confirmed by prolonged N1 latencies and reduced N1 amplitudes in ABI and AMI patients. However, patients with central auditory implants showed a remarkable gain in performance when visual and auditory input was combined, in both speech and non-speech conditions, which was reflected by a strong visual modulation of auditory-cortex activation in these individuals. In sum, the results suggest that the behavioral improvement in audio-visual conditions in central auditory implant patients is based on enhanced audio-visual interactions in the auditory cortex. These findings may have important implications for the optimization of electrical stimulation and rehabilitation strategies in patients with central auditory prostheses. Hum Brain Mapp 38:2206-2225, 2017. © 2017 Wiley Periodicals, Inc.


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6866801
DOI: http://dx.doi.org/10.1002/hbm.23515


Similar Publications

Development of the relationship between visual selective attention and auditory change detection.

Neuroimage

January 2025

State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, 100875, China.

Understanding the developmental trajectories of the auditory and visual systems, and the relationships between them, is crucial for elucidating cognitive maturation, which is essential for effectively navigating dynamic environments. A recent study of ours showed a positive correlation between the event-related potential (ERP) amplitudes associated with visual selective attention (posterior contralateral N2) and auditory change detection (mismatch negativity) in adults, suggesting an intimate relationship and a potentially shared mechanism between visual selective attention and auditory change detection. However, how these processes and their relationship evolve over development remains unclear.


Introduction: This study aims to investigate the impact of auditory input on postural control in young adult cochlear implant users with profound sensorineural hearing loss. The research explores the relationship between auditory cues and static postural stability in individuals with hearing impairment.

Methods: Thirty-four young adult cochlear implant users (15 males and 19 females, aged 18-35 years) underwent various balance tests, including the modified Clinical Test of Sensory Interaction on Balance (mCTSIB) and the Unilateral Stance Test (UST), under three auditory conditions: (1) white noise stimulus present with the sound processor activated, (2) ambient noise present with the sound processor activated, and (3) sound processor deactivated.


Background: Alzheimer's disease (AD) can be diagnosed by in vivo abnormalities of amyloid-β plaques (A) and tau accumulation (T) biomarkers. Previous studies have shown that analyses of serial position performance in episodic memory tests, and especially, delayed primacy, are associated with AD pathology even in individuals who are cognitively unimpaired. The earliest signs of cortical tau pathology are observed in medial temporal lobe (MTL) regions, yet it is unknown if serial position markers are also associated with early tau load in these regions.


Speech processing involves a complex interplay between sensory and motor systems in the brain, essential for early language development. Recent studies have extended this sensory-motor interaction to visual word processing, emphasizing the connection between reading and handwriting during literacy acquisition. Here we show how language-motor areas encode motoric and sensory features of language stimuli during auditory and visual perception, using functional magnetic resonance imaging (fMRI) combined with representational similarity analysis.


Neural processing of auditory stimuli in rats: translational aspects using auditory oddball paradigms.

Behav Brain Res

January 2025

Department of Neurosurgery, Hannover Medical School, Carl-Neuberg-Straße 1, 30625 Hannover, Germany; Cluster of Excellence Hearing4all, German Research Foundation, Hannover, Germany; Center for Systems Neuroscience (ZSN) Hannover, 30559 Hannover, Germany.

Background: The three-class oddball paradigm allows investigation of the processing of behaviorally relevant and irrelevant auditory stimuli. In humans, event-related potentials (ERPs) are used as neural correlates of behavior. We recorded local field potentials (LFPs) within the medial prefrontal cortex (mPFC) of rats during three-class and passive two-class oddball paradigms and analyzed the ERPs, focusing on similarities to human recordings.

