For many cochlear implant (CI) users, visual cues are vitally important for interpreting the impoverished auditory speech information that an implant conveys. Although the temporal relationship between auditory and visual stimuli is crucial for how this information is integrated, audiovisual temporal processing in CI users is poorly understood. In this study, we tested unisensory (auditory alone, visual alone) and multisensory (audiovisual) temporal processing in postlingually deafened CI users (n = 48) and normal-hearing controls (n = 54) using simultaneity judgment (SJ) and temporal order judgment (TOJ) tasks. We varied the onset asynchrony between the auditory and visual components of either a syllable/viseme or a simple flash/beep pairing, and participants indicated either which stimulus appeared first (TOJ) or whether the pair occurred simultaneously (SJ). Results indicate that temporal binding windows (the interval within which stimuli are likely to be perceptually 'bound') are not significantly different between groups for either speech or non-speech stimuli. However, the point of subjective simultaneity for speech was less visually leading in CI users, who, interestingly, also had improved visual-only TOJ thresholds. Further signal detection analysis suggests that this SJ shift may be due to greater visual bias within the CI group, perhaps reflecting heightened attentional allocation to visual cues.
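The point of subjective simultaneity (PSS) and temporal binding window (TBW) described above are typically estimated by fitting a bell-shaped function to the proportion of "simultaneous" responses across stimulus onset asynchronies (SOAs). The sketch below illustrates one common approach, a Gaussian fit via a coarse grid search; the data values and the 50%-of-peak width criterion are illustrative assumptions, not figures from the study.

```python
import math

# Hypothetical SJ data: SOA in ms (negative = visual leads) and the
# proportion of "simultaneous" responses at each SOA. Illustrative only.
soas = [-300, -200, -100, 0, 100, 200, 300]
p_sim = [0.10, 0.35, 0.80, 0.95, 0.70, 0.30, 0.08]

def gaussian(soa, pss, sigma):
    """Predicted probability of a 'simultaneous' response (peak fixed at 1)."""
    return math.exp(-((soa - pss) ** 2) / (2 * sigma ** 2))

def fit_sj(soas, p_sim):
    """Coarse least-squares grid search for the best-fitting PSS and width."""
    best = (float("inf"), 0, 1)
    for pss in range(-150, 151, 5):       # candidate PSS values, ms
        for sigma in range(20, 301, 5):   # candidate widths, ms
            err = sum((gaussian(s, pss, sigma) - p) ** 2
                      for s, p in zip(soas, p_sim))
            if err < best[0]:
                best = (err, pss, sigma)
    return best[1], best[2]

pss, sigma = fit_sj(soas, p_sim)
# One convention defines the TBW as the SOA range where the fitted curve
# exceeds 50% of its peak, i.e. pss +/- sigma * sqrt(2 * ln 2).
half_width = sigma * math.sqrt(2 * math.log(2))
print(f"PSS = {pss} ms, TBW = [{pss - half_width:.0f}, {pss + half_width:.0f}] ms")
```

In practice a maximum-likelihood fit (or a two-sided psychometric model with separate audio-leading and visual-leading slopes) is preferred over this grid search, but the logic of recovering PSS and TBW from SJ proportions is the same.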
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6063927
DOI: http://dx.doi.org/10.1038/s41598-018-29598-x
Front Psychiatry
December 2024
Department of Psychiatry, Social Psychiatry and Psychotherapy, Hannover Medical School, Hannover, Germany.
Introduction: Multisensory integration (MSI) enhances perception by combining information from different sensory modalities. In schizophrenia, individuals often exhibit impaired audiovisual processing, resulting in broader temporal binding windows (TBWs), which appear to be associated with symptom severity. Since the underlying mechanisms of these aberrations are not yet fully understood, the present study investigates multisensory processing in schizophrenia in more detail.
Hear Res
January 2025
Department of ENT - Head and Neck Surgery, Inselspital, Bern University Hospital, University of Bern 3010 Bern, Switzerland. Electronic address:
Objectives: Understanding brain processing of auditory and visual speech is essential for advancing speech perception research and improving clinical interventions for individuals with hearing impairment. Functional near-infrared spectroscopy (fNIRS) is considered highly suitable for measuring brain activity during language tasks. However, accurate data interpretation also requires validated stimuli and behavioral measures.
Elife
December 2024
Instituto de Fisiología y Biología Molecular y Celular, Consejo Nacional de Investigaciones Científicas y Tecnológicas, Buenos Aires, Argentina.
Multisensory integration (MSI) combines information from multiple sensory modalities to create a coherent perception of the world. In contexts where sensory information is limited or equivocal, it also allows animals to integrate individually ambiguous stimuli into a clearer or more accurate percept and, thus, react with a more adaptive behavioral response. Although responses to multisensory stimuli have been described at the neuronal and behavioral levels, a causal or direct link between these two is still missing.
Ann Neurosci
August 2024
Stress and Cognitive Electroimaging Laboratory, Department of Physiology, All India Institute of Medical Sciences, New Delhi, Delhi, India.
Background: The fascinating ability of the brain to integrate information from multiple sensory inputs has intrigued many researchers. Audio-visual (AV) interaction is a form of multisensory integration that we rely on to form meaningful representations of the environment around us. The literature on the underlying neural mechanisms is limited.
bioRxiv
October 2024
Department of Psychology, New York University.
Cross-modal temporal recalibration guarantees stable temporal perception across ever-changing environments. Yet the mechanisms of cross-modal temporal recalibration remain unknown. Here, we conducted an experiment to measure how participants' temporal perception was affected by exposure to audiovisual stimuli with consistent temporal delays.