We sought to establish whether novel words can become integrated into existing semantic networks by teaching participants new meaningful words and then using these words as primes in two semantic priming experiments, in which participants made lexical decisions to familiar words. Importantly, at no point in training did the novel words co-occur with the familiar words that served as targets in the primed lexical decision task, allowing us to evaluate semantic priming in the absence of direct association. We found that familiar words were primed by the newly related novel words, both when the novel word prime was unmasked (Experiment 1) and when it was masked (Experiment 2), suggesting that the new words had been integrated into semantic memory. Furthermore, this integration was strongest after a 1-week delay and was independent of explicit recall of the novel word meanings: forgetting of meanings did not attenuate priming. We argue that even after brief training, newly learned words become an integrated part of the adult mental lexicon rather than being represented episodically, separate from the lexicon.
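For concreteness, the dependent measure in a primed lexical decision task is the response-time advantage for targets preceded by related rather than unrelated primes. A minimal Python sketch of that computation, using hypothetical response times rather than data from these experiments:

```python
# Semantic priming effect: mean lexical-decision RT after an unrelated prime
# minus mean RT after a related prime. Positive values indicate facilitation.
# The RT values below are hypothetical, for illustration only.

related_rts = [512, 498, 530, 505, 521]    # ms, target after a related prime
unrelated_rts = [548, 533, 560, 541, 552]  # ms, target after an unrelated prime

def mean(xs):
    return sum(xs) / len(xs)

priming_effect = mean(unrelated_rts) - mean(related_rts)
print(f"Priming effect: {priming_effect:.1f} ms")  # 33.6 ms for this toy data
```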
DOI: http://dx.doi.org/10.1080/17470218.2012.724694

Similar Publications
Polymers (Basel)
January 2025
Faculty of Environmental Engineering, University of Science and Technology, 50-013 Wrocław, Poland.
The applications of polymeric materials are constantly being reviewed and improved. The word "hybrid", together with the general idea of combining two or more inherently different approaches, designs, and materials, is gaining significant attention. The area of sustainable materials with a low environmental impact is also rapidly evolving, with many new discoveries, including the use of materials of natural origin and countless combinations thereof.
Sensors (Basel)
January 2025
School of Computer Science, Shaanxi Normal University, Xi'an 710062, China.
Music generation by AI models such as the Transformer is currently a research hotspot. Existing methods often suffer from poor coherence and high computational costs. To address these problems, we propose a novel Transformer-based model that incorporates a gated recurrent unit with root mean square norm restriction (TARREAN).
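This snippet does not spell out the TARREAN architecture, so the following is only a sketch of one named ingredient: root mean square normalization (RMSNorm, Zhang & Sennrich, 2019), shown in NumPy under the assumption that the "root mean square norm restriction" refers to this standard operation:

```python
import numpy as np

def rms_norm(x, gamma, eps=1e-8):
    # Rescale x by the inverse root mean square over its last dimension, then
    # apply a learned per-feature gain gamma. Unlike LayerNorm, RMSNorm
    # subtracts no mean, which makes it cheaper to compute.
    rms = np.sqrt(np.mean(x ** 2, axis=-1, keepdims=True) + eps)
    return (x / rms) * gamma

# Toy usage: normalize a single 4-dimensional hidden state.
h = np.array([1.0, -2.0, 3.0, 0.5])
gamma = np.ones(4)  # learned gain, initialized to 1
print(rms_norm(h, gamma))
```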
Behav Res Methods
January 2025
Department of Computer Science, Colby College, 4000 Mayflower Hill, Waterville, 04901, Maine, USA.
In reading tasks, drift can move recorded fixations from one word to another, or even to another line, invalidating the eye-tracking recording. Manual correction is time-consuming and subjective, while automated correction is fast yet limited in accuracy. In this paper, we present Fix8 (Fixate), an open-source GUI tool that offers a novel semi-automated correction approach for eye-tracking data in reading tasks.
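Fix8's own correction algorithms are not described in this snippet; as a point of comparison, a common automated baseline for vertical drift is to snap each fixation to the nearest line of text. A minimal sketch with hypothetical coordinates:

```python
# Baseline vertical drift correction (not Fix8's algorithm): snap each
# fixation's y-coordinate to the nearest known text-line y-position.

line_ys = [100, 140, 180, 220]  # hypothetical y-centers of the text lines, px

def snap_to_lines(fixations, line_ys):
    # fixations: list of (x, y) gaze points; returns drift-corrected copies.
    return [(x, min(line_ys, key=lambda ly: abs(ly - y))) for x, y in fixations]

print(snap_to_lines([(50, 108), (120, 131), (200, 177)], line_ys))
# -> [(50, 100), (120, 140), (200, 180)]
```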
Sci Rep
January 2025
Nanfang College Guangzhou, Guangzhou, 510970, China.
Named Entity Recognition (NER) is an essential component of numerous Natural Language Processing (NLP) systems, with the aim of identifying and classifying entities that have specific meanings in raw text, such as person (PER), location (LOC), and organization (ORG). Recently, Deep Neural Networks (DNNs) have been extensively applied to NER tasks owing to the rapid development of deep learning technology. However, despite their advancements, these models fail to take full advantage of the multi-level features (e.
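As a minimal illustration of the task itself (toy tokens and labels, not this paper's model), NER output is commonly encoded with B-/I-/O tags that can be grouped into entity spans:

```python
# Toy NER example in the common BIO scheme: B- opens an entity, I- continues
# it, and O marks tokens outside any entity. Sentence and labels are made up.

tokens = ["Marie", "Curie", "worked", "in", "Paris", "for", "the", "Sorbonne"]
labels = ["B-PER", "I-PER", "O", "O", "B-LOC", "O", "O", "B-ORG"]

entities, current = [], None
for tok, lab in zip(tokens, labels):
    if lab.startswith("B-"):            # start a new entity span
        current = [tok, lab[2:]]
        entities.append(current)
    elif lab.startswith("I-") and current and current[1] == lab[2:]:
        current[0] += " " + tok         # extend the open span
    else:
        current = None                  # O tag (or mismatched I-) closes it

print([tuple(e) for e in entities])
# -> [('Marie Curie', 'PER'), ('Paris', 'LOC'), ('Sorbonne', 'ORG')]
```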
Br J Dev Psychol
January 2025
Department of Psychology, Trinity University, San Antonio, Texas, USA.
This study investigates whether the context in which a word is learnt affects noun and verb learning. Evidence from studies of noun learning is mixed, and no studies have examined background perceptual context in verb learning. Two-, three-, and four-year-olds (n = 162) saw a novel object moved in a novel way while hearing four novel words, either nouns or verbs.