To derive meaning from sound, the brain must integrate information across many timescales. What computations underlie multiscale integration in human auditory cortex? Evidence suggests that auditory cortex analyses sound using both generic acoustic representations (for example, spectrotemporal modulation tuning) and category-specific computations, but the timescales over which these putatively distinct computations integrate remain unclear. To answer this question, we developed a general method to estimate sensory integration windows (the time window within which stimuli alter the neural response) and applied our method to intracranial recordings from neurosurgical patients. We show that human auditory cortex integrates hierarchically across diverse timescales spanning from ~50 to 400 ms. Moreover, we find that neural populations with short and long integration windows exhibit distinct functional properties: short-integration electrodes (less than ~200 ms) show prominent spectrotemporal modulation selectivity, while long-integration electrodes (greater than ~200 ms) show prominent category selectivity. These findings reveal how multiscale integration organizes auditory computation in the human brain.
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8957490
DOI: http://dx.doi.org/10.1038/s41562-021-01261-y
Sci Total Environ
January 2025
Guangdong Province Hospital for Occupational Disease Prevention and Treatment, Guangzhou, Guangdong, China; Guangdong Medical University, Dongguan, Guangdong, China; Guangdong Pharmaceutical University, Guangzhou, Guangdong, China; Southern Medical University, Guangzhou, Guangdong, China; Jinan University, Guangdong, China.
Background: Noise is a threat to the human auditory system, and hearing protection devices (HPDs) are widely used to prevent noise-induced hearing loss (NIHL). However, the role of wearing HPDs in NIHL, and the complex relationship between them, remains unclear. This study aims to explore that relationship and identify the associated influencing pathways.
Sci Rep
January 2025
Department of Otorhinolaryngology, University of Pennsylvania, Philadelphia, PA, USA.
Auditory perception requires categorizing sound sequences, such as speech or music, into classes, such as syllables or notes. Auditory categorization depends not only on the acoustic waveform, but also on variability and uncertainty in how the listener perceives the sound, including sensory and stimulus uncertainty, the listener's estimated relevance of the particular sound to the task, and their ability to learn the past statistics of the acoustic environment. Whereas these factors have been studied in isolation, whether and how they interact to shape categorization remains unknown.
Sci Rep
January 2025
Acoustics Research Centre, University of Salford, The Crescent, Manchester, M5 4WT, UK.
It is well understood that a significant shift away from fossil-fuel-based transportation is necessary to limit the impacts of the climate crisis. Electric micromobility modes, such as electric scooters and electric bikes, have the potential to offer a lower-emission alternative to journeys made with internal combustion engine vehicles, and such modes of transport are becoming increasingly commonplace on our streets. Although they offer advantages such as reduced air pollution and greater personal mobility, the widespread approval and uptake of electric micromobility is not without its challenges.
Ann Fam Med
January 2025
Clinical Skills Education Centre, Queen's University Belfast, Northern Ireland, United Kingdom.
There is a hum and drum to the clinical day, sounds and rhythms that pervade the physician's and patient's soundscape. We hear, but we do not listen. The soundtrack of the daily grind is experienced as an audio blanket of white noise.
J Exp Psychol Gen
January 2025
Department of Experimental Psychology, Helmholtz Institute, Utrecht University.
Predicting the location of moving objects in noisy environments is essential to everyday behavior, like when participating in traffic. Although many objects provide multisensory information, it remains unknown how humans use multisensory information to localize moving objects, and how this depends on expected sensory interference (e.g.