Objective: This study aimed to evaluate whether auditory brainstem response (ABR) using a paired-click stimulation paradigm could serve as a tool for detecting cochlear synaptopathy (CS).
Methods: The ABRs to single clicks and to paired clicks with various inter-click intervals (ICIs), together with word-intelligibility scores in degraded listening conditions, were obtained from 57 adults with normal hearing. The wave I peak amplitude and the root-mean-square value of the post-wave-I response within a window delayed from the wave I peak (referred to as the RMS) were calculated for the single- and second-click responses.
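The two response measures described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the study's analysis pipeline: the latency window, delay, and span values are hypothetical placeholders.

```python
import numpy as np

def wave_i_metrics(abr, fs, peak_window=(0.0015, 0.0025),
                   rms_delay=0.001, rms_span=0.002):
    """Wave I peak amplitude and RMS of a delayed post-wave-I window.

    abr : 1-D averaged ABR waveform (stimulus onset at t = 0)
    fs  : sampling rate in Hz
    All window parameters (in seconds) are illustrative assumptions.
    """
    t = np.arange(len(abr)) / fs
    # locate the wave I peak within an assumed latency window
    in_peak = (t >= peak_window[0]) & (t <= peak_window[1])
    seg = abr[in_peak]
    i_peak = np.argmax(seg)
    peak_amp = seg[i_peak]
    peak_time = t[in_peak][i_peak]
    # RMS of the response in a window delayed from the wave I peak
    in_rms = (t >= peak_time + rms_delay) & (t < peak_time + rms_delay + rms_span)
    rms = np.sqrt(np.mean(abr[in_rms] ** 2))
    return peak_amp, rms
```

Applied separately to the single-click and second-click responses, these two numbers give the amplitude and RMS measures the study compares across ICIs.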
Natural sounds contain rich patterns of amplitude modulation (AM), which is one of the essential sound dimensions for auditory perception. The sensitivity of human hearing to AM measured by psychophysics takes diverse forms depending on the experimental conditions. Here, we address with a single framework the questions of why such patterns of AM sensitivity have emerged in the human auditory system and how they are realized by our neural mechanisms.
Individuals with autism spectrum disorders (ASD) are reported to exhibit degraded performance in sound localization. This study investigated whether sensitivity to interaural level differences (ILDs) and interaural time differences (ITDs), the major cues for horizontal sound localization, is affected in ASD. Thresholds for discriminating the ILD and ITD were measured for adults with ASD and age- and IQ-matched controls in a lateralization experiment.
Attention to the relevant object and space is the brain's strategy to effectively process the information of interest in complex environments with limited neural resources. Numerous studies have documented how attention is allocated in the visual domain, whereas the nature of attention in the auditory domain has been much less explored. Here, we show that the pupillary light response can serve as a physiological index of auditory attentional shift and can be used to probe the relationship between space-based and object-based attention as well.
A dynamic neural network change, accompanied by cognitive shifts such as internal perceptual alternation in bistable stimuli, is reconciled by the discharge of noradrenergic locus coeruleus neurons. Transient pupil dilation as a consequence of the reconciliation with the neural network in bistable perception has been reported to precede the reported perceptual alternation. Here, we found that baseline pupil size, an index of temporal fluctuation of arousal level over a longer range of timescales than that for the transient pupil changes, relates to the frequency of perceptual alternation in auditory bistability.
Expectations concerning the timing of a stimulus enhance attention at the time at which the event occurs, which confers significant sensory and behavioral benefits. Herein, we show that temporal expectations modulate even the sensory transduction in the auditory periphery via the descending pathway. We measured the medial olivocochlear reflex (MOCR), a sound-activated efferent feedback that controls outer hair cell motility and optimizes the dynamic range of the sensory system.
When an amplitude modulated signal with a constant-frequency carrier is fed into a generic nonlinear amplifier, the phase of the carrier of the output signal is also modulated. This phenomenon is referred to as amplitude-modulation-to-phase-modulation (AM-to-PM) conversion and regarded as an unwanted signal distortion in the field of electro-communication engineering. Herein, we offer evidence that AM-to-PM conversion also occurs in the human cochlea and that listeners can use the PM information effectively to process the AM of sounds.
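AM-to-PM conversion is easy to demonstrate numerically. The sketch below uses the Saleh amplifier model from communications engineering as a stand-in nonlinearity (an illustrative assumption, not the cochlear mechanism studied here): the amplitude envelope of an AM carrier drives an amplitude-dependent phase shift, so the output carrier phase becomes modulated.

```python
import numpy as np

# Saleh-model AM-to-PM characteristic: phase shift grows with input amplitude
def am_to_pm_phase(a, alpha=0.26, beta=0.25):
    return alpha * a**2 / (1 + beta * a**2)

fs = 48000
t = np.arange(0, 0.1, 1 / fs)
fc, fm, m = 1000.0, 40.0, 0.5             # carrier, modulation rate, AM depth
env = 1 + m * np.sin(2 * np.pi * fm * t)  # amplitude envelope
x = env * np.cos(2 * np.pi * fc * t)      # AM input: constant-frequency carrier
phi = am_to_pm_phase(env)                 # envelope-driven carrier phase shift
y = env * np.cos(2 * np.pi * fc * t + phi)  # output: carrier phase now modulated
```

The phase excursion `phi` tracks the AM envelope, i.e., part of the amplitude modulation has been converted into phase modulation of the carrier.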
It is often assumed that the reaction time of a saccade toward visual and/or auditory stimuli reflects the sensitivities of our oculomotor-orienting system to stimulus saliency. Endogenous factors, as well as stimulus-related factors, would also affect the saccadic reaction time (SRT). However, it was not clear how these factors interact and to what extent visual and auditory-targeting saccades are accounted for by common mechanisms.
Auditory frisson is the experience of a chill or shivering sensation related to sound in the absence of a physical cold stimulus. Multiple examples of frisson-inducing sounds have been reported, but the mechanism of auditory frisson remains elusive. Typical frisson-inducing sounds may contain a looming effect, in which a sound appears to approach the listener's peripersonal space.
Recent studies using video-based eye tracking have presented accumulating evidence that postsaccadic oscillation defined in reference to the pupil center (PSOp) is larger than that to the iris center (PSOi). This indicates that the relative motion of the pupil reflects the viscoelasticity of the tissue of the iris. It is known that the pupil size controlled by the sphincter/dilator pupillae muscles reflects many aspects of cognition.
J Acoust Soc Am
September 2019
Some normal-hearing listeners report difficulties in speech perception in noisy environments, and the cause is not well understood. The present study explores the correlation between speech-in-noise reception performance and cochlear mechanical characteristics, which were evaluated using a principal component analysis of the otoacoustic emission (OAE) spectra. A principal component, specifically a characteristic dip at around 2-2.
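The PCA step described above can be sketched with synthetic data. The listener count matches the study, but the spectral binning is a placeholder and the random matrix stands in for real OAE spectrum levels.

```python
import numpy as np

rng = np.random.default_rng(0)
n_listeners, n_bins = 57, 128                      # hypothetical binning
spectra = rng.normal(size=(n_listeners, n_bins))   # stand-in for OAE levels (dB)

# PCA via SVD of the mean-centered spectra
centered = spectra - spectra.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
scores = u * s                          # per-listener component scores
loadings = vt                           # spectral shape of each component
var_explained = s**2 / np.sum(s**2)     # proportion of variance per component
```

A component whose loading shows a dip in a particular frequency region would capture the kind of characteristic spectral dip the study relates to speech-in-noise performance.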
The ability to track the statistics of our surroundings is a key computational challenge. A prominent theory proposes that the brain monitors for unexpected uncertainty - events which deviate substantially from model predictions, indicating model failure. Norepinephrine is thought to play a key role in this process by serving as an interrupt signal, initiating model-resetting.
Despite the prevalent use of alerting sounds in alarms and human-machine interface systems and the long-hypothesized role of the auditory system as the brain's "early warning system," we have only a rudimentary understanding of what determines auditory salience-the automatic attraction of attention by sound-and which brain mechanisms underlie this process. A major roadblock has been the lack of a robust, objective means of quantifying sound-driven attentional capture. Here we demonstrate that: (1) a reliable salience scale can be obtained from crowd-sourcing (N = 911), (2) acoustic roughness appears to be a driving feature behind this scaling, consistent with previous reports implicating roughness in the perceptual distinctiveness of sounds, and (3) crowd-sourced auditory salience correlates with objective autonomic measures.
The auditory system converts the physical properties of a sound waveform to neural activities and processes them for recognition. During the process, the tuning to amplitude modulation (AM) is successively transformed by a cascade of brain regions. To test the functional significance of the AM tuning, we conducted single-unit recording in a deep neural network (DNN) trained for natural sound recognition.
There are indications that the pupillary dilation response (PDR) reflects surprising moments in an auditory sequence, such as the appearance of a deviant noise against repetitively presented pure tones (4), and salient and loud sounds that are subjectively evaluated by human participants (12). In the current study, we further examined whether the PDR's reflection of auditory surprise can accumulate and be revealed in complex yet structured auditory stimuli, i.e.
This article has been withdrawn: please see Elsevier Policy on Article Withdrawal (http://www.elsevier.com/locate/withdrawalpolicy).
Our hearing is usually robust against reverberation. This study asked how such robustness to everyday sounds is realized, and what kinds of acoustic cues contribute to the robustness. We focused on the perception of materials based on impact sounds, which is a common daily experience, and for which the responsible acoustic features have already been identified in the absence of reverberation.
Interaural time (ITD) and level differences (ILD) constitute the two main cues for sound localization in the horizontal plane. Despite extensive research in animal models and humans, the mechanism of how these two cues are integrated into a unified percept is still far from clear. In this study, our aim was to test with human electroencephalography (EEG) whether integration of dynamic ITD and ILD cues is reflected in the so-called motion-onset response (MOR), an evoked potential elicited by moving sound sources.
Interaural time differences (ITD) and interaural level differences (ILD) both signal horizontal sound source location. To achieve a unified percept of our acoustic environment, these two cues require integration. In the present study, we tested this integration of ITD and ILD with electroencephalography (EEG) by measuring the mismatch negativity (MMN).
The two-tone sequence (ABA_), which comprises two different sounds (A and B) and a silent gap, has been used to investigate how the auditory system organizes sequential sounds depending on various stimulus conditions or brain states. Auditory streaming can be evoked by differences not only in the tone frequency ("spectral cue": ΔF, TONE condition) but also in the amplitude modulation rate ("AM cue": ΔAM, AM condition). The aim of the present study was to explore the relationship between the perceptual properties of auditory streaming for the TONE and AM conditions.
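For reference, an ABA_ triplet sequence of the kind described can be generated in a few lines. The tone duration, base frequency, and frequency separation below are arbitrary example values, not the study's stimulus parameters.

```python
import numpy as np

def aba_sequence(fs=44100, f_a=500.0, delta_f_semitones=6.0,
                 tone_dur=0.1, n_triplets=10):
    """ABA_ sequence: A and B tones plus a silent gap, repeated n_triplets times."""
    f_b = f_a * 2 ** (delta_f_semitones / 12)   # B tone, delta-F above A
    t = np.arange(int(fs * tone_dur)) / fs
    tone = lambda f: np.sin(2 * np.pi * f * t) * np.hanning(t.size)  # ramped tone
    gap = np.zeros(t.size)                       # the "_" silent slot
    triplet = np.concatenate([tone(f_a), tone(f_b), tone(f_a), gap])
    return np.tile(triplet, n_triplets)
```

Varying `delta_f_semitones` (the spectral cue) changes whether listeners tend to hear one integrated "galloping" stream or two segregated streams.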
In this series of behavioural experiments we investigated the effect of distraction on the maintenance of acoustic scene information in short-term memory. Stimuli are artificial acoustic 'scenes' composed of several (up to twelve) concurrent tone-pip streams ('sources'). A gap (1000 ms) is inserted partway through the 'scene'; changes, in the form of the appearance of a new source or the disappearance of an existing one, occur after the gap in 50% of the trials.
J Assoc Res Otolaryngol
December 2016
This study examined whether the mechanical characteristics of the cochlea could influence individual variation in the ability to use temporal fine structure (TFS) information. Cochlear mechanical functioning was evaluated by swept-tone evoked otoacoustic emissions (OAEs), which are thought to comprise linear reflection by micromechanical impedance perturbations, such as spatial variations in the number or geometry of outer hair cells, on the basilar membrane (BM). Low-rate (2 Hz) frequency modulation detection limens (FMDLs) were measured for carrier frequency of 1000 Hz and interaural phase difference (IPD) thresholds as indices of TFS sensitivity and high-rate (16 Hz) FMDLs and amplitude modulation detection limens (AMDLs) as indices of sensitivity to non-TFS cues.
To make sense of complex auditory scenes, the auditory system sequentially organizes auditory components into perceptual objects or streams. In the conventional view of this process, the cortex plays a major role in perceptual organization, and subcortical mechanisms merely provide the cortex with acoustical features. Here, we show that the neural activities of the brainstem are linked to perceptual organization, which alternates spontaneously for human listeners without any stimulus change.
The performance of a lateralization task based on interaural time or level differences (ITDs or ILDs) often varies among listeners. This study examined the extent to which this inter-listener variation could be accounted for by the coding efficiency of the temporal-structure or level information below the stage of interaural interaction. Young listeners (20s to 30s) and early-elderly (60s) listeners with or without mild hearing loss were tested.