Purpose: This study aims to evaluate the effect of auditory neuropathy spectrum disorder (ANSD) on postoperative auditory perception and listening difficulties in pediatric cochlear implant (CI) recipients.
Method: Auditory perception skills were assessed with the Children's Auditory Perception Test (CAPT), and daily listening difficulties were evaluated with the Children's Home Inventory of Listening Difficulties (CHILD) Scale. The study involved pediatric CI recipients (N = 40) aged between 5 and 7 years, with and without a diagnosis of ANSD. The groups were homogeneous in chronological age, age at diagnosis, age at initial implantation, bilateral simultaneous surgery, etiology of hearing loss, and family education level.
Results: The findings demonstrated that children without ANSD performed better in integrating visual-auditory stimuli, overall listening performance, distant sound source scores, and noisy environment scores (p = .047, p = .001, p = .028, and p = .010, respectively). Additionally, children with better speech perception also had a better ability to integrate audiovisual stimuli (p = .005, r = .438).
Conclusions: There are significant differences in postoperative listening skills and auditory perception between pediatric CI recipients with and without ANSD; children without ANSD perform better.
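As a rough illustration (not the authors' analysis pipeline) of how such between-group comparisons and the score correlation could be computed, the sketch below uses SciPy on simulated data; the test choices (Mann-Whitney U, Spearman correlation), group sizes, and score distributions are all assumptions.

```python
# Hypothetical sketch of the kind of group comparison and correlation
# reported in the abstract; test choices, group sizes, and data are
# assumptions, not the authors' actual analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated CAPT visual-auditory integration scores, assuming two groups of 20.
scores_no_ansd = rng.normal(85, 8, 20)   # children without ANSD
scores_ansd = rng.normal(78, 8, 20)      # children with ANSD

# Nonparametric two-group comparison (small samples, ordinal-like scores).
u_stat, p_group = stats.mannwhitneyu(scores_no_ansd, scores_ansd,
                                     alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_group:.3f}")

# Correlation between speech-perception and audiovisual-integration scores.
speech = rng.normal(80, 10, 40)
audiovisual = speech * 0.4 + rng.normal(0, 8, 40)
r, p_corr = stats.spearmanr(speech, audiovisual)
print(f"Spearman r = {r:.3f}, p = {p_corr:.3f}")
```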
DOI: http://dx.doi.org/10.1044/2024_AJA-24-00168
Exp Brain Res
December 2024
Music and Audio Research Laboratory, New York University, New York, USA.
We examined the impact of auditory stimuli and their presentation method on performance of a dynamic balance task. Twenty-four young adults wore an HTC Vive headset and dodged a virtual ball to the right or left based on its color (blue to the left and red to the right, or vice versa). We manipulated the environment by introducing congruent (auditory stimuli from the correct direction) or incongruent (auditory stimuli played randomly from either side) cues, and by comparing multimodal (visual and congruent auditory stimuli) with unimodal (visual or auditory stimuli) presentation.
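As a rough illustration of the condition structure described above, and not the study's actual implementation, the sketch below generates a shuffled trial list that crosses ball color (and thus dodge direction) with congruent versus incongruent auditory cue placement; the trial counts, names, and color-direction mapping are assumptions.

```python
# Hypothetical trial-list sketch for the dodging task described above;
# names, counts, and the color-direction mapping are assumptions.
import random

COLOR_TO_DIRECTION = {"blue": "left", "red": "right"}  # one of the two mappings

def make_trials(n_per_condition: int = 10, seed: int = 42) -> list[dict]:
    """Build audiovisual trials with congruent or incongruent cue placement."""
    rng = random.Random(seed)
    trials = []
    for congruence in ("congruent", "incongruent"):
        for _ in range(n_per_condition):
            color = rng.choice(list(COLOR_TO_DIRECTION))
            dodge_to = COLOR_TO_DIRECTION[color]
            # Congruent: the sound comes from the correct (dodge) direction;
            # incongruent: the sound comes from a randomly chosen side.
            cue_side = dodge_to if congruence == "congruent" else rng.choice(["left", "right"])
            trials.append({"color": color, "dodge_to": dodge_to,
                           "congruence": congruence, "cue_side": cue_side})
    rng.shuffle(trials)
    return trials

print(make_trials(n_per_condition=2))
```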
Int J Pediatr Otorhinolaryngol
December 2024
Department of Audiology, University of Social Welfare and Rehabilitation Sciences, Tehran, Iran.
Background: Auditory attention is an important cognitive factor that significantly affects speech perception in noisy environments. Hearing loss can affect attention, which in turn can impair speech perception in noise. Auditory attention training improves speech perception in noise in children with hearing loss.
J Exp Child Psychol
December 2024
Child Psychopathology Unit, Scientific Institute, 23842 Bosisio Parini, Lecco, Italy.
The ability to process auditory information is one of the foundations of appropriate language acquisition. Moreover, early difficulties in basic auditory abilities have cascading effects on the appropriate wiring of the brain networks underlying higher-order linguistic processes. Language impairments represent core difficulties in two different but partially overlapping disorders: developmental language disorder (DLD) and autism spectrum disorder (ASD).
Ear Hear
November 2024
Department of Speech Language Pathology & Audiology, Towson University, Towson, Maryland, USA.
Objectives: Musicians face an increased risk of hearing loss due to prolonged and repetitive exposure to high noise levels. Detecting early signs of hearing loss, which are subtle and often elusive to traditional clinical tests such as pure-tone audiometry, is essential. The objective of this study was to investigate the impact of noise exposure on the electrophysiological and perceptual aspects of subclinical hearing damage in young musicians with normal audiometric thresholds.
Sci Adv
December 2024
Aix Marseille Université, INSERM, INS, Institut de Neurosciences des Systèmes, Marseille, France.
Dynamical theories of speech processing propose that the auditory cortex parses acoustic information in parallel at the syllabic and phonemic timescales. We developed a paradigm to independently manipulate both linguistic timescales and acquired intracranial recordings from 11 patients with epilepsy while they listened to French sentences. Our results indicate that (i) syllabic and phonemic timescales are both reflected in the acoustic spectral flux; (ii) during comprehension, the auditory cortex tracks the syllabic timescale in the theta range, while neural activity in the alpha-beta range phase locks to the phonemic timescale; (iii) these neural dynamics occur simultaneously and share a joint spatial location; and (iv) the spectral flux embeds two timescales, in the theta and low-beta ranges, across 17 natural languages.
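As a point of reference for the spectral flux measure mentioned above, here is a minimal sketch of one common way to compute it (the half-wave-rectified frame-to-frame change of the STFT magnitude spectrum); the frame length, hop size, and normalization are assumptions and may differ from the paper's exact definition.

```python
# Minimal sketch of a common spectral-flux computation (frame-to-frame change
# in the STFT magnitude spectrum); frame length, hop size, and normalization
# are assumptions and may differ from the paper's exact definition.
import numpy as np

def spectral_flux(signal: np.ndarray, frame_len: int = 1024, hop: int = 256) -> np.ndarray:
    """Return one flux value per frame transition."""
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    mags = np.stack([np.abs(np.fft.rfft(window * signal[i * hop:i * hop + frame_len]))
                     for i in range(n_frames)])
    # Half-wave rectified difference between consecutive magnitude spectra.
    diff = np.maximum(mags[1:] - mags[:-1], 0.0)
    return diff.sum(axis=1)

# Example: flux of a 1 s chirp sampled at 16 kHz.
sr = 16000
t = np.linspace(0, 1, sr, endpoint=False)
chirp = np.sin(2 * np.pi * (200 + 300 * t) * t)
print(spectral_flux(chirp).shape)
```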