Controversy exists as to whether, compared to young adults, older adults are more, equally, or less likely to make linguistic predictions while reading. While previous studies have examined age effects on the prediction of upcoming words, the prediction of upcoming syntactic structures has been largely unexplored. We compared the benefit that young and older readers gain when a syntactic structure is made predictable, as well as potential age differences in the costs involved in making predictions. In a self-paced reading study, 60 young and 60 older adults read sentences in which noun-phrase coordination (e.g. large pizza or tasty calzone) was made predictable through the inclusion of the word either earlier in the sentence. Results showed a benefit of the presence of either in the second half of the coordination phrase, and a cost of the presence of either in the first half. We observed no age differences in the benefits or costs of making these predictions; Bayes factor analyses offered strong evidence that these effects are age invariant. Together, these findings suggest that older and younger adults make syntactic predictions of similar strength and do so with a similar level of difficulty. We relate this age invariance in syntactic prediction to specific aspects of the ageing process.


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10087647
DOI: http://dx.doi.org/10.1111/bjop.12594

Publication Analysis

Top Keywords

syntactic prediction (8)
self-paced reading (8)
age invariant (8)
older adults (8)
prediction upcoming (8)
young older (8)
age differences (8)
making predictions (8)
age (6)
syntactic (5)

Similar Publications

We introduce EmoAtlas, a computational library/framework that extracts emotions and syntactic/semantic word associations from texts. EmoAtlas combines interpretable artificial intelligence (AI) for syntactic parsing in 18 languages with psychologically validated lexicons for detecting the eight emotions in Plutchik's theory. We show that EmoAtlas can match or surpass transformer-based natural language processing techniques such as BERT, or large language models like ChatGPT 3.


Aspect category sentiment analysis based on pre-trained BiLSTM and syntax-aware graph attention network.

Sci Rep

January 2025

Key Laboratory of Ethnic Language Intelligent Analysis and Security Governance of MOE, Minzu University of China, Beijing, 100081, China.

Aspect Category Sentiment Analysis (ACSA) is a fine-grained sentiment analysis task aimed at predicting the sentiment polarity associated with aspect categories within a sentence. Most existing ACSA methods rely on a given aspect category to locate the sentiment words related to it. When irrelevant sentiment words carry semantic meaning for the given aspect category, sentiment words may fail to be matched with the correct aspect categories.


The link between the cognitive effort of word processing and the eye movement patterns elicited by that word is well established in psycholinguistic research using eye tracking. Yet less evidence or consensus exists regarding whether the same link holds between linguistic complexity measures of a sentence or passage and eye movements registered at the sentence or passage level. This paper focuses on "global" measures of syntactic and lexical complexity, i.


Grammar-constrained decoding for structured information extraction with fine-tuned generative models applied to clinical trial abstracts.

Front Artif Intell

January 2025

Center for Cognitive Interaction Technology (CITEC), Technical Faculty, Bielefeld University, Bielefeld, Germany.

Background: In the field of structured information extraction, there are typically semantic and syntactic constraints on the output of information extraction (IE) systems. These constraints, however, typically cannot be guaranteed using standard (fine-tuned) encoder-decoder architectures. This has led to the development of constrained decoding approaches which allow, e.


Studies of perception have long shown that the brain adds information to its sensory analysis of the physical environment. A touchstone example for humans is language use: to comprehend a physical signal like speech, the brain must add linguistic knowledge, including syntax. Yet, syntactic rules and representations are widely assumed to be atemporal (i.

