In a multi-speaker scenario, the human auditory system is able to attend to one particular speaker of interest and ignore the others. It has been demonstrated that it is possible to use electroencephalography (EEG) signals to infer to which speaker someone is attending by relating the neural activity to the speech signals. However, classifying auditory attention within a short time interval remains the main challenge. We present a convolutional neural network-based approach to extract the locus of auditory attention (left/right) without knowledge of the speech envelopes. Our results show that it is possible to decode the locus of attention within 1-2 s, with a median accuracy of around 81%. These results are promising for neuro-steered noise suppression in hearing aids, in particular in scenarios where per-speaker envelopes are unavailable.
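As a rough illustration of the decoding setup described above, the sketch below shows a minimal convolutional classifier in PyTorch that maps a short multi-channel EEG window to a left/right attention label. This is not the authors' published architecture; the channel count (64), window length (1 s at 128 Hz), and layer sizes are assumptions chosen for illustration.

```python
# Minimal sketch of a CNN that classifies the locus of auditory attention
# (left vs. right) from a short EEG window. Hypothetical shapes: 64 EEG
# channels, a 1 s window sampled at 128 Hz. NOT the paper's architecture,
# only an illustration of the decoding setup it describes.
import torch
import torch.nn as nn

class AttentionLocusCNN(nn.Module):
    def __init__(self, n_channels: int = 64, n_samples: int = 128):
        super().__init__()
        # Spatio-temporal convolution: spans all EEG channels at once
        # and slides along the time axis.
        self.conv = nn.Conv2d(1, 8, kernel_size=(n_channels, 9))
        self.act = nn.ReLU()
        # Average over the remaining time axis, then a linear read-out
        # to two classes (attended speaker left / right).
        self.pool = nn.AdaptiveAvgPool2d((1, 1))
        self.fc = nn.Linear(8, 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, channels, time)
        h = self.pool(self.act(self.conv(x)))
        return self.fc(h.flatten(1))

# Toy usage: one batch of 4 random "EEG windows".
model = AttentionLocusCNN()
logits = model(torch.randn(4, 1, 64, 128))
print(logits.argmax(dim=1))  # predicted locus per window: 0=left, 1=right
```

Note that, consistent with the abstract, such a classifier needs only the EEG window as input; no per-speaker speech envelopes are required at decoding time.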

Source

PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8143791
DOI: http://dx.doi.org/10.7554/eLife.56481

Publication Analysis

Top Keywords

auditory attention: 12
locus auditory: 8
convolutional neural: 8
eeg-based detection: 4
detection locus: 4
auditory: 4
attention: 4
attention convolutional: 4
neural networks: 4
networks multi-speaker: 4

Similar Publications

Objectives: One important aspect of facilitating language access for children with hearing loss (HL) is the auditory environment. An optimal auditory environment is characterized by high signal-to-noise ratios (SNRs), low background noise levels, and low reverberation times. In this study, the authors describe the auditory environment of early intervention groups specifically equipped for young children with HL.
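For reference, SNR is conventionally reported in decibels. A minimal sketch of the standard computation, using made-up power values:

```python
# Minimal sketch of the standard SNR computation in decibels,
# with made-up example power values.
import math

def snr_db(signal_power: float, noise_power: float) -> float:
    # SNR (dB) = 10 * log10(P_signal / P_noise)
    return 10.0 * math.log10(signal_power / noise_power)

# Example: speech at 10x the noise power -> +10 dB SNR.
print(snr_db(10.0, 1.0))  # 10.0
```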

Background: With the aging of the population, the deterioration of visual and auditory function among the elderly has attracted much attention. Age-related hearing loss (ARHL) and age-related macular degeneration (AMD) are common ear and eye diseases that seriously affect the quality of life of the elderly population.

Methods: This study used whole-cohort sampling, recruiting a total of 713 community-dwelling participants aged 50 years and older between June 2022 and October 2023, of whom 620 were included in the analysis.

Task-irrelevant sounds that are semantically congruent with the target can facilitate performance in visual search tasks, resulting in faster search times. In three experiments, we tested the processes underlying this effect. Participants were presented with auditory primes that were semantically congruent, neutral, or incongruent with the visual search target, and, importantly, we varied the set size of the search displays.

Biomarkers.

Alzheimers Dement

December 2024

Department of Neurobiology, Care Sciences and Society, Center for Alzheimer Research, Division of Clinical Geriatrics, Karolinska Institutet, Stockholm, Sweden.

Background: [18F]FDG PET is essential because it allows differentiation between dementia disorders, revealing distinct neurodegenerative patterns in those predisposed to the condition. Individuals with Autosomal Dominant Alzheimer's Disease (ADAD) have a predictable age of onset, enabling the study of cognitive and pathological changes before clinical manifestation. Our objective was to investigate the temporal course of, and regional links between, cognition and glucose metabolism as a measure of early synaptic impairment in ADAD.

Biomarkers.

Alzheimers Dement

December 2024

Centre for Brain Research, Indian Institute of Science, Bangalore, Karnataka, India.

Background: Thyroid disorders are among the most common endocrine disorders. An estimated 42 million people in India suffer from thyroid disorders. Imbalances in thyroid hormone levels can significantly impact the cognitive health of the older population.
