It has been hypothesized that crossmodal recalibration plays a crucial role in the development of multisensory integration capabilities [1]. To test the developmental trajectory of multisensory integration and crossmodal recalibration, we used a combined ventriloquist/ventriloquist-aftereffect paradigm [2] in children aged 5-9 years. The ventriloquist effect (indicating multisensory integration), that is, the shift of auditory localization toward simultaneously presented but spatially discrepant visual stimuli, was larger in children than in adults, which was attributed to lower auditory localization precision in children. In fact, the size of the ventriloquist effect depended on the visual stimulus reliability in both children and adults. In all groups, the ventriloquist effect was best explained by a causal inference model. In contrast to their multisensory integration capabilities, 5-year-old children did not recalibrate. The immediate ventriloquist aftereffect (indicating recalibration after a single exposure to a spatially discrepant audio-visual stimulus) emerged in 6- to 7-year-old children, whereas the cumulative ventriloquist aftereffect (reflecting recalibration to the audio-visual spatial discrepancies over the complete experiment) was not observed before the age of 8 years. First, in contrast to common beliefs, the present results provide evidence that multisensory integration precedes rather than follows crossmodal recalibration during development. Second, we report developmental evidence for a dissociation of the processes involved in multisensory integration and in immediate as well as cumulative recalibration. We speculate that multisensory integration is a prerequisite for crossmodal recalibration, because the multisensory percept, rather than unimodal cues, might constitute a crucial signal for the calibration of the sensory systems.
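For readers unfamiliar with the modeling approach mentioned above, the sketch below (Python) illustrates how a Bayesian causal inference model can account for the ventriloquist effect and its dependence on cue reliability. It is a minimal, hypothetical example in the spirit of the standard formulation (Körding et al., 2007), not the code used in the study; the function names, noise parameters, and prior settings are illustrative assumptions.

```python
# Illustrative sketch only -- not the authors' implementation. It follows the
# standard Bayesian causal-inference formulation for audio-visual localization
# (in the spirit of Koerding et al., 2007). All parameter values are hypothetical.
import math


def gauss(x, mu, var):
    """Gaussian density with mean mu and variance var."""
    return math.exp(-0.5 * (x - mu) ** 2 / var) / math.sqrt(2 * math.pi * var)


def auditory_estimate(x_a, x_v, sigma_a, sigma_v, sigma_p=10.0, mu_p=0.0, p_common=0.5):
    """Perceived auditory location under causal-inference model averaging.

    x_a, x_v       -- noisy auditory and visual position cues (deg)
    sigma_a/_v     -- auditory / visual sensory noise (smaller = more reliable)
    sigma_p, mu_p  -- width and centre of the spatial prior
    p_common       -- prior probability that both cues share one cause
    """
    var_a, var_v, var_p = sigma_a ** 2, sigma_v ** 2, sigma_p ** 2

    # Reliability-weighted fusion, assuming a single common cause (C = 1)
    w_a, w_v, w_p = 1 / var_a, 1 / var_v, 1 / var_p
    s_fused = (x_a * w_a + x_v * w_v + mu_p * w_p) / (w_a + w_v + w_p)
    # Auditory-only estimate, assuming independent causes (C = 2)
    s_alone = (x_a * w_a + mu_p * w_p) / (w_a + w_p)

    # Marginal likelihoods of the cue pair under each causal structure
    denom = var_a * var_v + var_a * var_p + var_v * var_p
    quad = ((x_a - x_v) ** 2 * var_p
            + (x_a - mu_p) ** 2 * var_v
            + (x_v - mu_p) ** 2 * var_a)
    like_c1 = math.exp(-0.5 * quad / denom) / (2 * math.pi * math.sqrt(denom))
    like_c2 = gauss(x_a, mu_p, var_a + var_p) * gauss(x_v, mu_p, var_v + var_p)

    # Posterior probability of a common cause, then model averaging
    post_c1 = like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))
    return post_c1 * s_fused + (1 - post_c1) * s_alone


# A visual cue 10 deg to the right pulls the auditory estimate rightward, and the
# pull is larger when auditory noise (sigma_a) is higher, as in young children.
print(auditory_estimate(x_a=0.0, x_v=10.0, sigma_a=8.0, sigma_v=2.0))  # larger shift
print(auditory_estimate(x_a=0.0, x_v=10.0, sigma_a=4.0, sigma_v=2.0))  # smaller shift
```

In this scheme the visual pull on auditory localization grows as auditory precision falls, consistent with the larger ventriloquist effect and reliability dependence reported above.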
DOI: http://dx.doi.org/10.1016/j.cub.2020.02.048
iScience
January 2025
Laboratory for Neuroengineering, Department of Health Science and Technology, Institute for Robotics and Intelligent Systems, ETH Zürich, 8092 Zürich, Switzerland.
Our brain combines sensory inputs to create a single, coherent percept, which is enhanced when stimuli originate from the same location. Following amputation, distorted body representations may disrupt visuo-tactile integration at the amputated leg. We aim to unveil the principles guiding optimal and cognitively efficient visuo-tactile integration at both the intact and the amputated leg.
eNeuro
January 2025
Department of Neuroscience, University of Connecticut School of Medicine, Farmington, Connecticut 06030
The study of the neural circuitry underlying complex mammalian decision-making, particularly cognitive flexibility, is critical for understanding psychiatric disorders. To test cognitive flexibility, as well as potentially other decision-making paradigms involving multimodal sensory perception, we developed FlexRig, an open-source, modular behavioral platform for use in head-fixed mice. FlexRig enables the administration of tasks relying upon olfactory, somatosensory, and/or auditory cues and employing left and right licking as the behavioral readout and reward-delivery mechanism.
J Neurosci
January 2025
Department of Physical Therapy, Movement and Rehabilitation Sciences, Northeastern University, Boston, MA 02115, USA.
Humans adjust their movement to changing environments effortlessly via multisensory integration of the effector's state, motor commands, and sensory feedback. It is postulated that frontoparietal (FP) networks are involved in the control of prehension, with dorsomedial (DM) and dorsolateral (DL) regions processing the reach and the grasp, respectively. This study tested, in 10 participants (5 female, 5 male), the differential involvement of FP nodes (ventral premotor cortex, PMv; dorsal premotor cortex, PMd; anterior intraparietal sulcus, aIPS; and anterior superior parietal-occipital cortex, aSPOC) in online adjustments of reach-to-grasp coordination to mechanical perturbations that disrupted arm transport.
View Article and Find Full Text PDFJ Exp Psychol Gen
January 2025
Department of Experimental Psychology, Helmholtz Institute, Utrecht University.
Predicting the location of moving objects in noisy environments is essential to everyday behavior, such as when participating in traffic. Although many objects provide multisensory information, it remains unknown how humans use multisensory information to localize moving objects, and how this depends on expected sensory interference (e.g., …).
iScience
January 2025
Friedrich Miescher Institute for Biomedical Research, Basel, Switzerland.
The recognition of conspecifics (animals of the same species) and the ability to keep track of changes in the social environment are essential to all animals. While the molecules, circuits, and brain regions that control social behaviors across species are studied in depth, the neural mechanisms that enable the recognition of social cues are largely obscure. Recent evidence suggests that social cues across sensory modalities converge in a thalamic area conserved across vertebrates.