Objective: To investigate the impact of pediatric traumatic brain injury (TBI) on multisensory integration in relation to general neurocognitive functioning.
Method: Children aged 6 to 13 years with a hospital admission for TBI (n = 94) were compared with children with trauma control (TC) injuries (n = 39), differentiating between mild TBI without risk factors for complicated TBI (mildRF−; n = 19), mild TBI with ≥1 risk factor (mildRF+; n = 45), and moderate/severe TBI (n = 30). We measured set-shifting performance based on visual information (visual shift condition) and set-shifting performance based on audiovisual information, requiring multisensory integration (audiovisual shift condition). Effects of TBI on set-shifting performance were traced back to task strategy (i.e., boundary separation), processing efficiency (i.e., drift rate), or extradecisional processes (i.e., nondecision time) using diffusion model analysis. General neurocognitive functioning was measured using estimated full-scale IQ (FSIQ).
Results: The TBI group showed selectively reduced performance in the audiovisual shift condition (p = .009, Cohen's d = -0.51). Follow-up analyses in the audiovisual shift condition revealed reduced performance in the mildRF+ TBI group and moderate/severe TBI group (ps ≤ .025, ds ≤ -0.61). These effects were traced back to lower drift rate (ps ≤ .048, ds ≤ -0.44), reflecting reduced multisensory integration efficiency. Notably, accuracy and drift rate in the audiovisual shift condition partially mediated the relation between TBI and FSIQ.
Conclusion: Children with mildRF+ or moderate/severe TBI are at risk for reduced multisensory integration efficiency, possibly contributing to decreased general neurocognitive functioning.
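The diffusion-model decomposition described in the Method (boundary separation, drift rate, nondecision time) can be illustrated with the closed-form EZ-diffusion equations of Wagenmakers and colleagues, which recover all three parameters from just accuracy, RT variance, and mean RT. This is a sketch for illustration only: the study may well have used a different fitting procedure, and the summary statistics passed in below are hypothetical, not values from the paper.

```python
import math

def ez_diffusion(pc, vrt, mrt, s=0.1):
    """Closed-form EZ-diffusion estimates (Wagenmakers et al., 2007).

    pc  : proportion of correct responses (this sketch assumes pc > 0.5)
    vrt : variance of correct response times (s^2)
    mrt : mean correct response time (s)
    s   : scaling parameter, conventionally fixed at 0.1

    Returns (drift rate v, boundary separation a, nondecision time Ter).
    """
    if pc in (0.0, 0.5, 1.0):
        raise ValueError("pc of 0, 0.5, or 1 requires an edge correction")
    logit = math.log(pc / (1 - pc))
    x = logit * (logit * pc**2 - logit * pc + pc - 0.5) / vrt
    v = s * x**0.25                                   # drift rate (processing efficiency)
    a = s**2 * logit / v                              # boundary separation (task strategy)
    y = -v * a / s**2
    mdt = (a / (2 * v)) * (1 - math.exp(y)) / (1 + math.exp(y))  # mean decision time
    ter = mrt - mdt                                   # nondecision time (extradecisional)
    return v, a, ter

# Hypothetical condition-level summary statistics:
v, a, ter = ez_diffusion(pc=0.8, vrt=0.112, mrt=0.723)
```

A lower drift rate, at comparable boundary separation and nondecision time, is what licenses the paper's interpretation of reduced processing (here: integration) efficiency rather than a more cautious response strategy.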
DOI: http://dx.doi.org/10.1037/neu0000302
J Cogn
January 2025
General Psychology, Trier University, Germany.
Observations from multisensory body illusions indicate that the body representation can be adapted to changing task demands, e.g., it can be expanded to integrate external objects based on current sensorimotor experience (embodiment).
Neural Netw
January 2025
Department of Electrical, Computer and Biomedical Engineering, Toronto Metropolitan University, Toronto, Canada.
Research on video-based understanding and learning has attracted widespread interest and has been adopted in various real applications, such as e-healthcare, action recognition, and affective computing. Among them, video-based action recognition is one of the most representative examples. With the advancement of multisensory technology, action recognition using multimodal data has recently drawn wide attention.
Cogn Neurodyn
December 2025
Department of Psychology, Graduate School of Humanities, Kobe University, 1-1 Rokkodai-cho, Nada, Kobe, 657-8501, Japan.
The integration of auditory and visual stimuli is essential for effective language processing and social perception. The present study aimed to elucidate the mechanisms underlying audio-visual (A-V) integration by investigating the temporal dynamics of multisensory regions in the human brain. Specifically, we evaluated inter-trial coherence (ITC), a neural index indicative of phase resetting, through scalp electroencephalography (EEG) while participants performed a temporal-order judgment task that involved auditory (beep, A) and visual (flash, V) stimuli.
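The ITC index mentioned above has a simple definition: the magnitude of the average unit-length phase vector across trials, equal to 1 when phases are perfectly aligned across trials and approaching 0 for random phases. A minimal sketch of that computation, assuming band-passed EEG epochs and using the analytic signal for phase (function and variable names are mine, not the study's):

```python
import numpy as np
from scipy.signal import hilbert

def inter_trial_coherence(epochs):
    """ITC per time point from an (n_trials, n_samples) array of
    band-passed EEG epochs. Returns values in [0, 1]."""
    phases = np.angle(hilbert(epochs, axis=1))           # instantaneous phase per trial
    return np.abs(np.mean(np.exp(1j * phases), axis=0))  # length of mean phase vector

# Toy check: identical trials are perfectly phase-locked.
t = np.linspace(0, 1, 500, endpoint=False)
epochs = np.tile(np.sin(2 * np.pi * 10 * t), (20, 1))    # 20 identical 10 Hz trials
itc = inter_trial_coherence(epochs)
```

In practice ITC is computed per frequency band (e.g., after wavelet decomposition) rather than on a single broadband analytic signal, but the averaging step is the same.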
Curr Biol
January 2025
Department of Translational Neuroscience, Wake Forest University School of Medicine, Winston-Salem, NC 27157, USA.
Flavor is the quintessential multisensory experience, combining gustatory, retronasal olfactory, and texture qualities to inform food perception and consumption behavior. However, the computations that govern multisensory integration of flavor components and their underlying neural mechanisms remain elusive. Here, we use rats as a model system to test the hypothesis that taste and smell components of flavor are integrated in a reliability-dependent manner to inform hedonic judgments and that this computation is performed by neurons in the primary taste cortex.
Cortex
December 2024
Institute of Research in Psychology (IPSY) & Institute of Neuroscience (IoNS), Louvain Bionics Center, University of Louvain (UCLouvain), Louvain-la-Neuve, Belgium; School of Health Sciences, HES-SO Valais-Wallis, The Sense Innovation and Research Center, Lausanne & Sion, Switzerland.
Effective social communication depends on the integration of emotional expressions coming from the face and the voice. Although it has been consistently reported that seen and heard emotion expressions can be integrated automatically, direct signatures of multisensory integration in the human brain remain elusive. Here we implemented a multi-input electroencephalographic (EEG) frequency tagging paradigm to investigate neural populations integrating facial and vocal fearful expressions.