When an action is familiar, we are able to anticipate how it will change the state of the world. These expectations can result from retrieval of action-outcome associations in the hippocampus and the reinstatement of anticipated outcomes in visual cortex. How does this role for the hippocampus in action-based prediction change over time? We use high-resolution fMRI and a dual-training behavioral paradigm to examine how the hippocampus interacts with visual cortex during predictive and nonpredictive actions learned either three days earlier or immediately before the scan. Just-learned associations led to comparable background connectivity between the hippocampus and V1/V2, regardless of whether actions predicted outcomes. However, three-day-old associations led to stronger background connectivity and greater differentiation between neural patterns for predictive vs. nonpredictive actions. Hippocampal prediction may initially reflect indiscriminate binding of co-occurring events, with action information pruning weaker associations and leading to more selective and accurate predictions over time.
Source:
- PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6728336
- DOI: http://dx.doi.org/10.1038/s41467-019-12016-9
Exp Neurobiol, December 2024
Department of Brain and Cognitive Engineering, Korea University, Seoul 02841, Korea.
Research on brain aging using resting-state functional magnetic resonance imaging (rs-fMRI) has typically focused on comparing older adults to younger adults. These studies have often neglected the middle-aged group, which is also significantly affected by brain aging, including early changes in motor, memory, and cognitive functions. This study aims to address this limitation by examining resting-state networks in middle-aged adults via an exploratory whole-brain ROI-to-ROI analysis.
Proc Natl Acad Sci U S A, January 2025
Department of Psychology, University of Pennsylvania, Philadelphia, PA 19104.
Human brain evolution is marked by a disproportionate expansion of cortical regions associated with advanced perceptual and cognitive functions. While this expansion is often attributed to the emergence of novel specialized brain areas, modifications to evolutionarily conserved cortical regions also have been linked to species-specific behaviors. Distinguishing between these two evolutionary outcomes has been limited by the ability to make direct comparisons between species.
Gigascience, January 2025
School of Computer Science, Hunan University of Technology, Zhuzhou 412007, Hunan, China.
Background: Accurately deciphering spatial domains, identifying differentially expressed genes, and inferring cellular trajectories from spatial transcriptomic (ST) data hold significant potential for enhancing our understanding of tissue organization and biological function. However, most spatial clustering methods can neither decipher complex structures in ST data nor fully exploit features embedded in different layers.
Results: This article introduces STMSGAL, a novel framework for analyzing ST data that incorporates a graph attention autoencoder and multiscale deep subspace clustering.
J Vis, January 2025
Neural Information Processing Group, University of Tübingen, Tübingen, Germany.
Human performance in psychophysical detection and discrimination tasks is limited by inner noise. It is unclear to what extent this inner noise arises from early noise (e.g.
Neurophysiology studies propose that predictive coding is implemented via alpha/beta (8-30 Hz) rhythms that prepare specific pathways to process predicted inputs. This leads to a state of relative inhibition, reducing feedforward gamma (40-90 Hz) rhythms and spiking in response to predictable inputs. We refer to this model as predictive routing.