We investigate how language comprehension is affected by a word's predictability and by the semantic similarity between a target word and the other words that could have taken its place in a sentence, using data from a reading time study, a sentence completion study, and linear mixed-effects regression modeling. We find that processing is facilitated when the different words that could occur in a given context are semantically similar to one another: processing is affected not only by the words that do occur, but also by the relationship between those words and the ones that could have occurred. We discuss possible causes of this semantic similarity effect and point to limitations of using probability as a model of cognitive effort.
DOI: http://dx.doi.org/10.1016/j.cognition.2011.11.011
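As a rough illustration of the quantities the abstract refers to (a sketch, not the authors' actual pipeline), the snippet below computes a target word's surprisal from its cloze probability and the mean pairwise semantic similarity among the completions offered for one sentence frame. The completions, cloze probabilities, and toy 3-dimensional "semantic" vectors are all hypothetical stand-ins for, e.g., LSA or word-embedding representations.

```python
import math
import itertools

def surprisal(p: float) -> float:
    """Surprisal in bits: -log2 of the target word's cloze probability."""
    return -math.log2(p)

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def mean_pairwise_similarity(vectors):
    """Average cosine similarity over all pairs of completion vectors."""
    pairs = list(itertools.combinations(vectors, 2))
    return sum(cosine(u, v) for u, v in pairs) / len(pairs)

# Hypothetical cloze data for one sentence frame: each possible
# completion's probability and a toy semantic vector.
completions = {
    "coffee": (0.55, [0.9, 0.1, 0.2]),
    "tea":    (0.30, [0.8, 0.2, 0.3]),
    "juice":  (0.15, [0.7, 0.3, 0.1]),
}

target = "tea"
print(f"surprisal({target}) = {surprisal(completions[target][0]):.2f} bits")
print("mean pairwise similarity = "
      f"{mean_pairwise_similarity([vec for _, vec in completions.values()]):.2f}")
```

In an analysis like the one described, per-item predictors of this kind would then enter a linear mixed-effects regression on reading times, for instance `rt ~ surprisal + similarity + (1 | subject) + (1 | item)` in lme4-style formula notation; that specification is an illustrative assumption, not the paper's exact model.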
Similar publications:

J Exp Psychol Learn Mem Cogn, December 2024. Basque Center on Cognition, Brain and Language.
The present study uses event-related potentials (ERPs) to investigate lexicosemantic prediction in native speakers (L1) of English and advanced second language (L2) learners of English with Swedish as their L1. The main goal of the study was to examine whether learners recruit predictive mechanisms to the same extent as L1 speakers when a change in the linguistic environment renders prediction a useful strategy to pursue. The study, which uses a relatedness proportion paradigm adapted from Lau et al.
Psychol Aging, January 2025. Department of Psychology, National Taiwan University.
Socioemotional selectivity theory (SST) posits that older and younger adults have different life goals due to differences in perceived remaining lifetime. Younger adults focus more on future-oriented knowledge exploration and forming new friendships, while older adults prioritize present-focused emotional regulation and maintaining close relationships. While previous research has found that these age differences manifest in autobiographical textual expressions, their presence in verbal communication remains unexplored.
Cogn Neurodyn, December 2025. Image Processing Laboratory, University of Valencia, Valencia, Spain.
In recent years, substantial strides have been made in visual image reconstruction, particularly in generating high-quality visual representations from human brain activity while taking semantic information into account. This advancement not only enables the recreation of visual content but also provides valuable insight into the processes occurring in high-order functional brain regions, contributing to a deeper understanding of brain function. However, approaches that fuse semantic information rely on semantic-to-image guided reconstruction and may ignore the underlying neural computational mechanisms, so they do not represent true reconstruction from brain activity.
Int J Psychophysiol, January 2025. University of Warsaw, Faculty of Psychology, Poland.
Lukács et al. (2017) enhanced the Reaction Time Concealed Information Test (RT CIT) by incorporating "filler" items. Fillers are intended to increase attention and cognitive load, which could potentially enhance the P300-based CIT (P300-CIT) as well.
Inflamm Res, January 2025. Department of Nephrology, First Affiliated Hospital of Naval Medical University, Shanghai Changhai Hospital, Shanghai, China.
Background: Chronic inflammation is well recognized as a key factor in renal function deterioration in patients with diabetic kidney disease (DKD). Neutrophil extracellular traps (NETs) play an important role in amplifying inflammation. Focusing on NET-related genes, this study aimed to explore the mechanism of DKD progression and thereby identify potential intervention targets.