The combined selectivity for amplitude modulation frequency (AMF) and interaural time difference (ITD) was investigated for single units in the auditory midbrain of the grassfrog. Stimuli were presented through a closed sound system. A large number of units was found to be selective for AMF (95%) or ITD (85%), and in most units these selectivities were intricately coupled. At zero ITD, most units showed a band-pass (54%) or bimodal (24%) AMF-rate histogram. At an AMF of 36 Hz, equal to the pulse repetition rate of the mating call, 70% of the units possessed an asymmetrical ITD-rate histogram, whereas about 15% showed a symmetrically peaked histogram. With binaural stimulation more units appeared to be selective for AMF (95%) than was the case with monaural stimulation (85%). A large fraction of the units was most selective for ITD at AMFs of 36 and 72 Hz, whereas units seldom exhibited ITD selectivity with unmodulated tones. Based upon previous papers (Melssen et al., 1990; Van Stokkum, 1990), a binaural model is proposed to explain these findings. An auditory midbrain neuron is modelled as a third-order neuron which receives excitatory input from second-order neurons. In addition, the model neuron receives inputs from the other ear, which may be either excitatory or inhibitory. Spatiotemporal integration of the inputs from both ears, followed by action potential generation, produces a combined selectivity for AMF and ITD. In particular, the responses of an experimentally observed EI neuron to a set of stimuli are reproduced well by the model.
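The mechanism the abstract describes (excitation driven by one ear, delayed inhibition from the other, spatiotemporal integration, then spike generation) can be sketched as a minimal leaky-integrator EI neuron. This is an illustrative toy only: the waveforms, weights, time constant, and threshold below are assumptions for demonstration, not the authors' actual model from Melssen et al. (1990) or Van Stokkum (1990).

```python
import math

def am_pulse_train(amf_hz, itd_ms, dur_ms=500.0, dt_ms=0.1):
    """Sinusoidally amplitude-modulated drive in [0, 1], shifted by an
    interaural delay (a stand-in for the second-order input trains)."""
    n = int(dur_ms / dt_ms)
    out = []
    for i in range(n):
        t = i * dt_ms - itd_ms
        if t < 0:
            out.append(0.0)  # input has not yet arrived at this ear
        else:
            out.append(0.5 * (1.0 + math.sin(2.0 * math.pi * amf_hz * t / 1000.0)))
    return out

def ei_neuron_spike_count(amf_hz, itd_ms, w_exc=1.0, w_inh=0.8,
                          tau_ms=5.0, thresh=0.5, dt_ms=0.1):
    """Leaky integrator with ipsilateral excitation and contralateral
    inhibition (an EI configuration); counts threshold crossings."""
    exc = am_pulse_train(amf_hz, 0.0, dt_ms=dt_ms)
    inh = am_pulse_train(amf_hz, itd_ms, dt_ms=dt_ms)
    v, spikes, refractory = 0.0, 0, 0
    for e, h in zip(exc, inh):
        # integrate excitation minus inhibition with leak
        v += dt_ms / tau_ms * (-v + w_exc * e - w_inh * h)
        if refractory > 0:
            refractory -= 1
            continue
        if v > thresh:
            spikes += 1
            v = 0.0
            refractory = int(2.0 / dt_ms)  # 2 ms refractory period
    return spikes
```

Shifting the ITD moves the inhibitory input into or out of phase with the excitation, so the spike count traces out an ITD-rate curve whose shape depends on the AMF, which is the qualitative coupling between AMF and ITD selectivity that the abstract reports.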


Source
http://dx.doi.org/10.1016/0378-5955(91)90192-c


Similar Publications

Neural correlates of perceptual plasticity in the auditory midbrain and thalamus.

J Neurosci

January 2025

Neuroscience and Cognitive Science Program, University of Maryland, College Park, Maryland, 20742.

Hearing is an active process in which listeners must detect and identify sounds, segregate and discriminate stimulus features, and extract their behavioral relevance. Adaptive changes in sound detection can emerge rapidly, during sudden shifts in acoustic or environmental context, or more slowly as a result of practice. Although we know that context- and learning-dependent changes in the sensitivity of auditory cortical (ACX) neurons support many aspects of perceptual plasticity, the contribution of subcortical auditory regions to this process is less understood.


Profile-analysis experiments measure the ability to discriminate complex sounds based on patterns, or profiles, in their amplitude spectra. Studies of profile analysis have focused on normal-hearing listeners and target frequencies near 1 kHz. To provide more insight into underlying mechanisms, we studied profile analysis over a large target frequency range (0.


The auditory midbrain mediates tactile vibration sensing.

Cell

January 2025

Department of Neurobiology, Harvard Medical School, 220 Longwood Avenue, Boston, MA 02115, USA; Howard Hughes Medical Institute, Harvard Medical School, 220 Longwood Avenue, Boston, MA 02115, USA. Electronic address:

Vibrations are ubiquitous in nature, shaping behavior across the animal kingdom. For mammals, mechanical vibrations acting on the body are detected by mechanoreceptors of the skin and deep tissues and processed by the somatosensory system, while sound waves traveling through air are captured by the cochlea and encoded in the auditory system. Here, we report that mechanical vibrations detected by the body's Pacinian corpuscle neurons, which are distinguished by their ability to entrain to high-frequency (40-1,000 Hz) environmental vibrations, are prominently encoded by neurons in the lateral cortex of the inferior colliculus (LCIC) of the midbrain.


Midbrain encodes sound detection behavior without auditory cortex.

Elife

December 2024

Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom.

Hearing involves analyzing the physical attributes of sounds and integrating the results of this analysis with other sensory, cognitive, and motor variables in order to guide adaptive behavior. The auditory cortex is considered crucial for the integration of acoustic and contextual information and is thought to share the resulting representations with subcortical auditory structures via its vast descending projections. By imaging cellular activity in the corticorecipient shell of the inferior colliculus of mice engaged in a sound detection task, we show that the majority of neurons encode information beyond the physical attributes of the stimulus and that the animals' behavior can be decoded from the activity of those neurons with a high degree of accuracy.


Vagus nerve stimulation (VNS) is a therapeutic intervention previously shown to enhance fear extinction in rats. VNS is approved for use in humans for the treatment of epilepsy, depression, and stroke, and it is currently under investigation as an adjuvant to exposure therapy in the treatment of PTSD. However, the mechanisms by which VNS enhances extinction of conditioned fear remain unresolved.

