The cochlea behaves like a bank of band-pass filters, segregating information into different frequency channels. Some aspects of perception reflect processing within individual channels, but others involve the integration of information across them. One instance of this is sound localization, which improves with increasing bandwidth. The processing of binaural cues for sound location has been studied extensively. However, although the advantage conferred by bandwidth is clear, we currently know little about how this additional information is combined to form our percept of space. We investigated the ability of cells in the auditory system of guinea pigs to compare interaural level differences (ILDs), a key localization cue, between tones of disparate frequencies in each ear. Cells in auditory cortex believed to be integral to ILD processing (excitatory from one ear, inhibitory from the other: EI cells) compare ILDs separately over restricted frequency ranges which are not consistent with their monaural tuning. In contrast, cells that are excitatory from both ears (EE cells) show no evidence of frequency-specific processing. Both cell types are explained by a model in which ILDs are computed within separate frequency channels and subsequently combined in a single cortical cell. Interestingly, ILD processing in all inferior colliculus cell types (EE and EI) is largely consistent with processing within single, matched-frequency channels from each ear. Our data suggest a clear constraint on the way that localization cues are integrated: cortical ILD tuning to broadband sounds is a composite of separate, frequency-specific, binaurally sensitive channels. This frequency-specific processing appears after the level of the midbrain.

For some sensory modalities (e.g., somatosensation, vision), the spatial arrangement of the outside world is inherited by the brain from the periphery. The auditory periphery is arranged spatially by frequency, not spatial location.
Therefore, our auditory perception of location must be synthesized from physical cues in separate frequency channels. There are multiple cues (e.g., timing, level, spectral cues), and even single cues (e.g., level differences) are frequency dependent. The synthesis of location must account for this frequency dependence, but it is not known how this occurs. Here, we investigated how interaural level differences are combined across frequency along the ascending auditory system. We found that the integration in auditory cortex preserves the independence of the level cues carried in different frequency regions.
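The ILD cue itself is simple to state: the level difference, in dB, between the ears within a given frequency band. As a toy illustration of the idea that ILDs can be computed within separate frequency channels and only then combined (this is not the authors' neural model; the signals, sampling rate, and band edges below are invented for the example):

```python
import numpy as np

def band_ild_db(left, right, fs, bands):
    """Interaural level difference (dB) computed separately in each
    frequency band: 10*log10 of the left/right power ratio per band.
    left, right: time-domain signals at the two ears; bands: (lo_hz, hi_hz) pairs."""
    def band_power(x, lo, hi):
        spec = np.abs(np.fft.rfft(x)) ** 2
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        mask = (freqs >= lo) & (freqs < hi)
        return spec[mask].sum()
    return [10 * np.log10(band_power(left, lo, hi) / band_power(right, lo, hi))
            for lo, hi in bands]

# Toy stimulus: a 1 kHz tone louder at the left ear, a 4 kHz tone louder at the right.
fs = 16000
t = np.arange(fs) / fs
left  = 1.0 * np.sin(2 * np.pi * 1000 * t) + 0.5 * np.sin(2 * np.pi * 4000 * t)
right = 0.5 * np.sin(2 * np.pi * 1000 * t) + 1.0 * np.sin(2 * np.pi * 4000 * t)

# Opposite-signed ILDs in the two channels (~ +6 dB and -6 dB): a broadband
# "composite" ILD would obscure exactly the channel-specific structure the
# abstract argues cortex preserves.
ilds = band_ild_db(left, right, fs, bands=[(500, 1500), (3500, 4500)])
```

A doubling of amplitude corresponds to a 4x power ratio, i.e., about 6 dB, so the two bands here carry equal-magnitude ILDs of opposite sign.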
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5511886
DOI: http://dx.doi.org/10.1523/JNEUROSCI.3034-16.2017
Cogn Neurodyn
December 2025
Department of Psychology, Graduate School of Humanities, Kobe University, 1-1 Rokkodai-cho, Nada, Kobe, 657-8501 Japan.
The integration of auditory and visual stimuli is essential for effective language processing and social perception. The present study aimed to elucidate the mechanisms underlying audio-visual (A-V) integration by investigating the temporal dynamics of multisensory regions in the human brain. Specifically, we evaluated inter-trial coherence (ITC), a neural index indicative of phase resetting, through scalp electroencephalography (EEG) while participants performed a temporal-order judgment task that involved auditory (beep, A) and visual (flash, V) stimuli.
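ITC has a standard definition: the magnitude of the mean unit-length phasor across trials at a given time-frequency point, ranging from 0 (random phase) to 1 (perfect phase alignment, i.e., phase resetting). A minimal sketch on synthetic phase angles (not the study's data):

```python
import numpy as np

def inter_trial_coherence(phases):
    """ITC = |mean of exp(i*phase)| across trials.
    phases: array of phase angles in radians, one per trial.
    Returns a value in [0, 1]: 1 = identical phase on every trial,
    near 0 = phases uniformly scattered around the circle."""
    return np.abs(np.mean(np.exp(1j * np.asarray(phases))))

# Perfect phase resetting: every trial lands on the same phase.
aligned = np.full(100, 0.3)

# No phase resetting: phases drawn uniformly at random.
rng = np.random.default_rng(0)
random_phases = rng.uniform(-np.pi, np.pi, 10000)

itc_aligned = inter_trial_coherence(aligned)        # exactly 1.0
itc_random = inter_trial_coherence(random_phases)   # near 0 for many trials
```

In practice the phases would come from a time-frequency decomposition (e.g., wavelets) of the EEG, computed per channel, frequency, and latency.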
Clin Psychopharmacol Neurosci
February 2025
Department of Psychiatry, National Institute of Mental Health and Neuro Sciences (NIMHANS), Bengaluru, India.
Auditory/visual hallucinations and perceptual anomalies are among the core symptoms experienced by patients with schizophrenia. Studies have implicated the lateral occipital cortex (LOC) as one of the areas functioning aberrantly in schizophrenia, possibly in association with its auditory/visual symptoms. Here we report a case of a 29-year-old female diagnosed with treatment-resistant schizophrenia on clozapine, with persistent auditory verbal hallucinations (AVH) and visual anomalies.
J Neurophysiol
January 2025
Department of Physiology, McGill University, Montreal, Quebec H3G 1Y6 Canada.
The loss of a sensory modality triggers a phenomenon known as cross-modal plasticity, where areas of the brain responsible for the lost sensory modality are reorganized and repurposed to the benefit of the remaining modalities. After perinatal or congenital deafness, superior visual motion detection abilities have been psychophysically identified in both humans and cats, and this advantage has been causally demonstrated to be mediated by reorganized auditory cortex. In our study, we investigated visually evoked potentials (VEPs) in response to motion-onset stimuli of varying speeds in both hearing and perinatally deafened cats under light anesthesia.
Pain Rep
February 2025
Department of Occupational Therapy, Graduate School of Rehabilitation Science, Osaka Metropolitan University, Osaka, Japan.
Introduction: Chronic low back pain (CLBP) is a global health issue, and its nonspecific causes make treatment challenging. Understanding the neural mechanisms of CLBP should contribute to developing effective therapies.
Objectives: To compare current source density (CSD) and functional connectivity (FC) extracted from resting electroencephalography (EEG) between patients with CLBP and healthy controls and to examine the correlations between EEG indices and symptoms.
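The abstract does not specify the functional-connectivity metric used; as a minimal, generic illustration of the idea (not this study's method), FC is often estimated as a channel-by-channel matrix of pairwise statistical dependence, e.g., Pearson correlation:

```python
import numpy as np

def correlation_fc(eeg):
    """Toy functional-connectivity estimate: Pearson correlation between
    every pair of channels. eeg: array of shape (n_channels, n_samples).
    Returns a symmetric (n_channels, n_channels) matrix with 1s on the diagonal."""
    return np.corrcoef(eeg)

# Synthetic example: two channels driven by a common source, one independent.
rng = np.random.default_rng(1)
n = 2000
shared = rng.standard_normal(n)
ch1 = shared + 0.1 * rng.standard_normal(n)   # coupled pair
ch2 = shared + 0.1 * rng.standard_normal(n)
ch3 = rng.standard_normal(n)                  # independent channel

fc = correlation_fc(np.vstack([ch1, ch2, ch3]))
# fc[0, 1] is near 1 (shared drive); fc[0, 2] is near 0.
```

Resting-EEG studies frequently prefer phase-based metrics that are less sensitive to volume conduction; the correlation version above is only the simplest instance of the FC-matrix idea.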
Mol Neurobiol
January 2025
Department of Physiology, Hamidiye Faculty of Medicine, University of Health Sciences, Istanbul, Turkey.
This study aimed to investigate the impact of early childhood chronic stress on the development of the brain extracellular matrix (ECM) and how alterations in the ECM following early-life adversity (ELA) affect auditory learning and cognitive flexibility. ELA was induced through a combination of maternal separation and neonatal isolation in male Sprague-Dawley rats, and the success of the ELA model was assessed behaviorally and biochemically. A cortex-dependent go/no-go task with two phases was used to determine the impact of ELA on auditory learning and cognitive flexibility.