Neural correlates of visuo-tactile crossmodal paired-associate learning and memory in humans.

Neuroscience

NYU-ECNU Institute of Brain and Cognitive Science at NYU Shanghai and Collaborative Innovation Center for Brain Science, Shanghai 200062, China; Department of Neurosurgery, School of Medicine, Johns Hopkins University, Baltimore, MD 21287, USA; Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, MD 21218, USA.

Published: October 2017

Studies have indicated that cortical sensory systems are capable of processing information from different sensory modalities. However, it remains unclear when and how a cortical system integrates and retains information across sensory modalities during learning. Here we investigated the neural dynamics underlying crossmodal associations and memory by recording event-related potentials (ERPs) while human participants performed visuo-tactile (crossmodal) and visuo-visual (unimodal) paired-associate (PA) learning tasks. In each trial, participants were required to explore and learn the relationship (paired or non-paired) between two successive stimuli. EEG recordings revealed dynamic ERP changes as participants learned the paired associations. Specifically, (1) the frontal N400 component showed learning-related changes in both the unimodal and crossmodal tasks but did not differ significantly between them, whereas the central P400 showed both learning-related changes and task differences; (2) a late posterior negative slow wave (LPN) showed a learning effect only in the crossmodal task; and (3) alpha-band oscillations appeared to be involved in crossmodal working memory. Additional behavioral experiments suggested that these ERP components did not simply reflect participants' familiarity with the stimuli per se. Furthermore, shortening the delay between the first and second stimulus in the crossmodal task (from 1300 ms to 400 ms or 200 ms) produced corresponding declines in task performance. Taken together, these results provide insights into the cortical plasticity (induced by PA learning) of the neural networks involved in crossmodal associations in working memory.
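
For readers unfamiliar with the measures named in the abstract, the sketch below shows, in schematic form, how a trial-averaged ERP waveform and delay-period alpha-band (8-12 Hz) power can be computed from stimulus-locked EEG epochs. It is illustrative only and does not reproduce the authors' pipeline; the sampling rate, epoch and delay windows, and synthetic single-channel data are all assumptions.

```python
# Minimal, hypothetical sketch: ERP averaging and alpha-band power from EEG epochs.
# All parameters (sampling rate, windows) and the synthetic data are assumptions,
# not the authors' actual recording or analysis settings.
import numpy as np
from scipy.signal import welch

fs = 500                                      # assumed sampling rate (Hz)
n_trials, n_samples = 120, int(1.8 * fs)      # assumed epoch: -0.5 s to +1.3 s around stimulus onset
rng = np.random.default_rng(0)
epochs = rng.standard_normal((n_trials, n_samples))   # placeholder single-channel epochs (uV)

# ERP: average across time-locked trials; components such as the N400 appear as
# deflections in this trial-averaged waveform at their characteristic latencies.
erp = epochs.mean(axis=0)

# Alpha-band power in an assumed delay window (from 0.3 s after stimulus onset).
delay = epochs[:, int(0.8 * fs):]             # samples 0.3 s post-stimulus onward
freqs, psd = welch(delay, fs=fs, nperseg=256, axis=-1)
alpha_mask = (freqs >= 8) & (freqs <= 12)
alpha_power = psd[:, alpha_mask].mean(axis=-1)  # one alpha-power estimate per trial

print(erp.shape, alpha_power.shape)           # (900,) and (120,)
```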

Source: http://dx.doi.org/10.1016/j.neuroscience.2017.08.035

Similar Publications

The near-miss to cross-modal commutativity.

Atten Percept Psychophys

January 2025

Department of Psychology, University of Tübingen, Schleichstr. 4, 72076, Tübingen, Germany.

This paper is a follow-up to Ellermeier, Kattner, and Raum (2021, Attention, Perception, & Psychophysics, 83, 2955-2967). It provides a reanalysis of their data on cross-modal commutativity from a Bayesian perspective, together with a theory-based analysis grounded in a recently proposed extension of a global psychophysical approach to cross-modal judgments (Heller, 2021, Psychological Review, 128, 509-524). This theory assumes that stimuli are judged against respondent-generated internal references that are modality-specific and potentially role-dependent (i.e.

Previous research has shown that, when multiple similar items are maintained in working memory, recall precision declines. Less is known about how heterogeneous sets of items across different features within and between modalities impact recall precision. In two experiments, we investigated modality (Experiment 1, n = 79) and feature-specific (Experiment 2, n = 154) load effects on working memory performance.

Duration adaptation depends on the perceived rather than physical duration and can be observed across sensory modalities.

Perception

January 2025

State Key Laboratory of Brain and Cognitive Science, Institute of Biophysics, Chinese Academy of Sciences, China; University of Chinese Academy of Sciences, China; Hefei Comprehensive National Science Center, Institute of Artificial Intelligence, China.

Previous research has indicated that exposure to sensory stimuli of short or long durations influences the perceived duration of subsequent stimuli within the same modality. However, it remains unclear whether this adaptation is driven by the stimulus physical duration or by the perceived duration. We hypothesized that the absence of cross-modal duration adaptation observed in earlier studies was due to the mismatched perceived durations of adapting stimuli.

Multiplexed Immunofluorescence (MxIF) enables detailed immune cell phenotyping, providing critical insights into cell behavior within the tumor immune microenvironment (TIME). However, signal integrity can be compromised due to the complex cyclic staining processes inherent to MxIF. Hematoxylin and Eosin (H&E) staining, on the other hand, offers complementary information through its depiction of cell morphology and texture patterns and is often visually cross-referenced with MxIF in clinical settings.

To address the limitations of the flipped classroom in personalized teaching and interactive effectiveness, this paper designs a new Virtual Reality (VR)-based flipped-classroom model for colleges and universities that incorporates the Contrastive Language-Image Pre-Training (CLIP) algorithm. Through cross-modal data fusion, the model tightly couples students' operational behavior with the teaching content and improves teaching effectiveness through an intelligent feedback mechanism. The test data show that the similarity between the video and image modalities reaches 0.
