Computer models of the auditory periphery provide a tool for formulating theories concerning the relationship between the physiology of the auditory system and the perception of sounds in both normal and impaired hearing. However, the time-consuming nature of their construction constitutes a major impediment to their use, and it is important that transparent models be available on an 'off-the-shelf' basis to researchers. The MATLAB Auditory Periphery (MAP) model aims to meet these requirements and is freely available. The model can be used to simulate simple psychophysical tasks such as absolute threshold, pitch matching, and forward masking, as well as those used to measure compression and frequency selectivity. It can be used as a front end to automatic speech recognisers for the study of speech in quiet and in noise. The model can also simulate theories of hearing impairment and be used to make predictions about the efficacy of hearing aids. The use of the software will be described along with illustrations of its application in the study of the psychology of hearing.
DOI: http://dx.doi.org/10.1007/978-1-4614-1590-9_2
J Neurosci
January 2025
Oregon Hearing Research Center, Oregon Health and Science University, Portland, OR 97239, USA
In everyday hearing, listeners face the challenge of understanding behaviorally relevant foreground stimuli (speech, vocalizations) in complex backgrounds (environmental, mechanical noise). Prior studies have shown that high-order areas of human auditory cortex (AC) pre-attentively form an enhanced representation of foreground stimuli in the presence of background noise. This enhancement requires identifying and grouping the features that comprise the background so they can be removed from the foreground representation.
Res Sq
December 2024
Department of Biology, Indiana University, Indianapolis, IN.
Neuroimage
January 2025
Department of Otolaryngology, Head and Neck, University of Tübingen, Tübingen 72076, Germany.
The slowing and reduction of auditory responses in the brain are recognized side effects of increased pure tone thresholds, impaired speech recognition, and aging. However, it remains controversial whether central slowing is primarily linked to brain processes such as atrophy, or is also associated with the slowing of temporal neural processing from the periphery. Here we analyzed electroencephalogram (EEG) responses that most likely reflect medial geniculate body (MGB) responses to passive listening of phonemes in 80 subjects ranging in age from 18 to 76 years, in whom the peripheral auditory responses had been analyzed in detail (Schirmer et al.).
Pain
November 2024
State Key Laboratory of Chemical Biology, Shanghai Institute of Materia Medica, Chinese Academy of Science, Shanghai, China.
Voltage-gated potassium channel subfamily Q member 4 (Kcnq4) is predominantly expressed by hair cells and auditory neurons and regulates neuronal excitability in the auditory pathway. Although it is also detected in myelinated large-diameter dorsal root ganglia (DRG) neurons in the periphery, the expression and function of the Kcnq4 channel in nociceptors remain unknown. Here we showed that Kcnq4 is substantially expressed by unmyelinated small-diameter DRG neurons in both human and mouse.
Conscious Cogn
October 2024
Biology and Biological Engineering, California Institute of Technology, Pasadena, CA, USA; Computation and Neural Systems, California Institute of Technology, Pasadena, CA, USA.
The current study asked whether impoverished peripheral vision leads to perception immune to word-based semantic influences. We leveraged a peripheral sound-induced flash illusion. In each trial, two or three Mandarin characters were flashed quickly in the periphery with number-congruent or -incongruent beeps.