A widely used example of the intricate (yet poorly understood) intertwining of multisensory signals in the brain is the audiovisual bounce-inducing effect (ABE). In this paradigm, two identical objects move along the azimuth with uniform motion in opposite directions. The perceptual interpretation of the motion is ambiguous and is modulated when a transient sound is presented at the moment the two objects' motion trajectories overlap. The phenomenon has long been written off as the product of simple attentional or decision-making mechanisms, yet its neural underpinnings remain poorly understood. Using behavioural metrics concurrently with event-related fMRI, we show that sound-induced modulations of motion perception can be further modulated by changing the motion dynamics of the visual targets. The phenomenon engages the posterior parietal cortex and the parieto-insular vestibular cortical complex, with a close correspondence between activity in these regions and behaviour. These findings suggest that the insular cortex is engaged in deriving a probabilistic perceptual solution through the integration of multisensory data.
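To make the display geometry concrete, here is a minimal sketch of the stream/bounce timing described above. The frame rate, travel extent, and duration are hypothetical values chosen for illustration and are not taken from the study; the sketch simply computes the two targets' horizontal positions and the frame at which a transient sound would coincide with their overlap.

```python
import numpy as np

# Hypothetical display parameters (not taken from the cited study).
frame_rate_hz = 60          # monitor refresh rate
duration_s = 1.0            # length of the motion sequence
extent_deg = 10.0           # horizontal travel of each target (deg of visual angle)

n_frames = int(frame_rate_hz * duration_s)
t = np.arange(n_frames) / frame_rate_hz

# Two identical targets move along the azimuth with uniform (constant) velocity
# in opposite directions, starting at opposite ends of the trajectory.
speed_deg_s = extent_deg / duration_s
x_left_to_right = -extent_deg / 2 + speed_deg_s * t
x_right_to_left = +extent_deg / 2 - speed_deg_s * t

# The trajectories overlap where the two positions coincide; a brief sound
# presented on this frame biases observers toward the "bounce" interpretation.
overlap_frame = int(np.argmin(np.abs(x_left_to_right - x_right_to_left)))
print(f"Sound onset at frame {overlap_frame} (t = {t[overlap_frame]:.3f} s)")
```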
DOI: http://dx.doi.org/10.1016/j.neuroimage.2022.119285
J Vis
January 2025
Institut de Neurosciences de la Timone, CNRS & Aix-Marseille Université, Marseille, France.
Sensory-motor systems can extract statistical regularities in dynamic uncertain environments, enabling quicker responses and anticipatory behavior for expected events. Anticipatory smooth pursuit eye movements (aSP) have been observed in primates when the temporal and kinematic properties of a forthcoming visual moving target are fully or partially predictable. To investigate the nature of the internal model of target kinematics underlying aSP, we tested the effect of varying the target kinematics and its predictability.
Behav Res Methods
January 2025
Neuroscience of Perception and Action Lab, Italian Institute of Technology (IIT), Viale Regina Elena 291, 00161, Rome, Italy.
Estimating how the human body moves in space and time (body kinematics) has important applications for industry, healthcare, and several research fields. Gold-standard methodologies for capturing body kinematics are expensive and impractical for naturalistic recordings, as they rely on infrared-reflective wearables and bulky instrumentation. To overcome these limitations, several algorithms have been developed to extract body kinematics from plain video recordings.
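As an illustration of what such video-based extraction can look like in practice, the sketch below uses MediaPipe Pose, one freely available markerless pose estimator. The library choice, input file name, and landmark handling are assumptions made for illustration and are not necessarily the algorithms evaluated in the cited work.

```python
import cv2                      # OpenCV for video decoding
import mediapipe as mp          # MediaPipe's markerless pose estimator

mp_pose = mp.solutions.pose
cap = cv2.VideoCapture("walking.mp4")   # hypothetical input video

trajectories = []               # per-frame list of (x, y) landmark coordinates
with mp_pose.Pose(static_image_mode=False) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB images; OpenCV decodes frames as BGR.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            # Landmarks are normalised image coordinates (0-1), one per joint.
            trajectories.append(
                [(lm.x, lm.y) for lm in results.pose_landmarks.landmark]
            )
cap.release()

# Kinematic quantities (e.g., joint velocities) can then be estimated by
# differentiating the landmark trajectories across frames.
```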
Sci Rep
January 2025
School of Cognitive Sciences, Institute for Research in Fundamental Sciences (IPM), Tehran, Iran.
The process of perceptual decision-making in the real world involves the aggregation of pieces of evidence into a final choice. Visual evidence is usually presented in different pieces, distributed across time and space. We wondered whether adding variation in the location of the received information would lead to differences in how subjects integrated visual information.
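One common way to formalise this aggregation of evidence into a choice is a bounded evidence-accumulation (drift-diffusion-style) model. The sketch below is a generic simulation with arbitrary parameter values, offered only as an illustration of the idea rather than the model used in the cited study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative parameters (not from the cited study).
drift = 0.15        # mean evidence per sample favouring the correct choice
noise_sd = 1.0      # variability of each evidence sample
bound = 10.0        # decision threshold on accumulated evidence
max_samples = 500

def simulate_trial():
    """Accumulate noisy evidence samples until a decision bound is reached."""
    evidence = 0.0
    for n in range(1, max_samples + 1):
        evidence += drift + noise_sd * rng.standard_normal()
        if abs(evidence) >= bound:
            return evidence > 0, n   # (correct choice?, samples to decide)
    return evidence > 0, max_samples

trials = [simulate_trial() for _ in range(2000)]
accuracy = np.mean([correct for correct, _ in trials])
mean_rt = np.mean([n for _, n in trials])
print(f"accuracy = {accuracy:.2f}, mean samples to decision = {mean_rt:.1f}")
```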
J Vis
January 2025
Department of Cognitive and Psychological Sciences, Graduate School of Informatics, Nagoya University, Aichi, Japan.
Humans can estimate the time and position of a moving object's arrival. However, numerous studies have demonstrated superior position estimation accuracy for descending objects compared with ascending objects. We tested whether the accuracy of position estimation for ascending and descending objects differs between the upper and lower visual fields.
Front Neurol
December 2024
Department of Head and Neck Surgery and Brain Research Institute, David Geffen School of Medicine at UCLA, Los Angeles, CA, United States.
The relative accessibility and simplicity of vestibular sensing and vestibular-driven control of head and eye movements has made the vestibular system an attractive subject to experimenters and theoreticians interested in developing realistic quantitative models of how brains gather and interpret sense data and use it to guide behavior. Head stabilization and eye counter-rotation driven by vestibular sensory input in response to rotational perturbations represent natural, ecologically important behaviors that can be reproduced in the laboratory and analyzed using relatively simple mathematical models. Models drawn from dynamical systems and control theory have previously been used to analyze the behavior of vestibular sensory neurons.
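To give a flavour of the kind of simple dynamical model the passage refers to, the sketch below simulates a first-order high-pass-filter approximation of a semicircular-canal afferent's response to a head-velocity step. The time constant, gain, and baseline firing rate are illustrative assumptions, not values from the article.

```python
import numpy as np

# Illustrative parameters (assumptions, not taken from the cited article).
dt = 0.001            # simulation step (s)
tau = 6.0             # canal time constant (s)
gain = 0.4            # spikes/s per deg/s of head velocity
baseline = 90.0       # resting firing rate (spikes/s)

t = np.arange(0.0, 30.0, dt)
head_velocity = np.where(t >= 1.0, 60.0, 0.0)   # 60 deg/s step at t = 1 s

# First-order high-pass filter: the canal signal tracks changes in head
# velocity, then decays back toward zero with time constant tau during
# sustained constant-velocity rotation.
canal_signal = np.zeros_like(t)
for i in range(1, len(t)):
    d_input = head_velocity[i] - head_velocity[i - 1]
    canal_signal[i] = canal_signal[i - 1] + d_input - dt / tau * canal_signal[i - 1]

firing_rate = baseline + gain * canal_signal
print(f"peak rate = {firing_rate.max():.1f} spikes/s, "
      f"rate at t = 30 s = {firing_rate[-1]:.1f} spikes/s")
```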