Auditory cues facilitate object movement processing in human extrastriate visual cortex during simulated self-motion: A pilot study.

Brain Res

Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Massachusetts General Hospital, Charlestown, MA, USA; Department of Radiology, Harvard Medical School, Boston, MA, USA.

Published: August 2021

Visual segregation of moving objects is a considerable computational challenge when the observer moves through space. Recent psychophysical studies suggest that directionally congruent, moving auditory cues can substantially improve the parsing of object motion in such settings, but the brain mechanisms and visual processing stages that mediate these effects remain incompletely understood. Here, we utilized multivariate pattern analyses (MVPA) of MRI-informed magnetoencephalography (MEG) source estimates to examine how crossmodal auditory cues facilitate motion detection during the observer's self-motion. During MEG recordings, participants identified a target object that moved either forward or backward within a visual scene of nine identically textured objects simulating forward observer translation. Auditory motion cues 1) improved the behavioral accuracy of target localization, 2) significantly modulated MEG source activity in area V2 and the human middle temporal complex (hMT+), and 3) increased the accuracy with which the target's movement direction could be decoded from hMT+ activity using MVPA. The auditory-cue increase in decoding accuracy in hMT+ remained significant when superior temporal activations in or near the auditory cortices were regressed out of the hMT+ source activity, controlling for source-estimation biases caused by point spread. Taken together, these results suggest that parsing object motion from self-motion-induced optic flow in the human extrastriate visual cortex can be facilitated by crossmodal influences from the auditory system.
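As a rough illustration of the decoding step, the sketch below uses synthetic data with hypothetical dimensions (not the authors' actual pipeline or parameters): trial-wise hMT+ source patterns are first residualized against superior temporal source activity via ordinary least squares, approximating the point-spread control, and a cross-validated linear classifier then decodes forward versus backward target motion.

# Minimal Python sketch (NumPy + scikit-learn); all variable names and
# dimensions are illustrative assumptions, not the study's actual data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_trials, n_hmt, n_st, n_times = 120, 40, 10, 50           # hypothetical sizes
X_hmt = rng.standard_normal((n_trials, n_hmt, n_times))    # hMT+ source time courses
X_st = rng.standard_normal((n_trials, n_st, n_times))      # superior temporal sources
y = rng.integers(0, 2, n_trials)                           # 0 = backward, 1 = forward

# Regress superior temporal activity out of each hMT+ vertex, trial by trial,
# so that information leaking into hMT+ estimates via point spread is removed.
X_clean = np.empty_like(X_hmt)
for t in range(n_trials):
    A = X_st[t].T                                          # times x ST-vertices design
    beta, *_ = np.linalg.lstsq(A, X_hmt[t].T, rcond=None)
    X_clean[t] = (X_hmt[t].T - A @ beta).T                 # residual hMT+ patterns

# Cross-validated linear decoding of target motion direction from the residuals.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X_clean.reshape(n_trials, -1), y, cv=5)
print(f"Mean decoding accuracy: {scores.mean():.2f}")

With purely random synthetic data the accuracy should hover near chance (0.5); the comparison of interest in the study is whether decoding accuracy is higher on trials with auditory cues than on trials without them.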

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8206020
DOI: http://dx.doi.org/10.1016/j.brainres.2021.147489

Similar Publications

Noncanonical sentence structures pose comprehension challenges because they place increased demands on cognitive resources. Prosody may partially alleviate this cognitive load. These findings largely stem from behavioral studies, yet physiological measures may reveal additional insights into how cognitive resources are deployed to parse sentences.

Objectives: To investigate the influence of frequency-specific audibility on audiovisual benefit in children, this study examined the impact of high- and low-pass acoustic filtering on auditory-only and audiovisual word and sentence recognition in children with typical hearing. Previous studies show that visual speech provides greater access to consonant place of articulation than other consonant features and that low-pass filtering has a strong impact on the perception of acoustic consonant place of articulation. This suggests visual speech may be particularly useful when acoustic speech is low-pass filtered because it provides complementary information about consonant place of articulation.

Neurons in the hippocampus are correlated with different variables, including space, time, sensory cues, rewards and actions, with the extent of tuning depending on ongoing task demands. However, it remains uncertain whether such diverse tuning corresponds to distinct functions within the hippocampal network or whether a more generic computation can account for these observations. Here, to disentangle the contribution of externally driven cues versus internal computation, we developed a task in mice in which space, auditory tones, rewards and context were juxtaposed with changing relevance.

This study investigates the acquisition of sentence focus in Russian by adult English-Russian bilinguals, while paying special attention to the relative contribution of constituent order and prosodic expression. It aims to understand how these factors influence perceived word-level prominence and focus assignment during listening. We present results of two listening tasks designed to examine the influence of pitch cues and constituent order on perceived word prominence (Experiment 1) and focus assignment (Experiment 2) during the auditory comprehension of SV[O] and OV[S] sentences in Russian.

Individual differences in temporal order judgment.

Sci Rep

January 2025

Department of Psychology, Bar-Ilan University, 5290002, Ramat-Gan, Israel.

Large individual differences can be observed in studies reporting spectral temporal order judgment (TOJ). In the present study, we aimed to explore these individual differences and explain them by employing Warren and Ackroff's (1976) framework of direct identification of components and their order (direct ICO) and holistic pattern recognition (HPR). In Experiment 1, results from 177 participants replicated the large variance in participants' performance and suggested three response patterns, validated using the k-means clustering algorithm.
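As a minimal sketch of the clustering step, the example below groups a hypothetical participants-by-condition matrix of response scores into three clusters with k-means (scikit-learn); the actual features entered into the clustering in the study may differ.

# Illustrative only: 177 participants x 8 hypothetical stimulus conditions.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
responses = rng.random((177, 8))        # e.g., proportion correct per condition

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(responses)
print(np.bincount(kmeans.labels_))      # number of participants in each cluster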
