Attention capture by episodic long-term memory.

Cognition

Department of Psychology, University of Wisconsin - Milwaukee, Milwaukee, WI, USA.

Published: August 2020

Everyday behavior depends upon the operation of concurrent cognitive processes. In visual search, studies that examine memory-attention interactions have indicated that long-term memory facilitates search for a target (e.g., contextual cueing), but the potential for memories to capture attention and decrease search efficiency has not been investigated. To address this gap in the literature, five experiments were conducted to examine whether task-irrelevant encoded objects might capture attention. In each experiment, participants encoded scene-object pairs. Then, in a visual search task, 6-object search displays were presented and participants were told to make a single saccade to targets defined by shape (e.g., diamond among differently colored circles; Experiments 1, 4, and 5) or by color (e.g., blue shape among differently shaped gray objects; Experiments 2 and 3). Sometimes, one of the distractors was from the encoded set, and occasionally the scene that had been paired with that object was presented prior to the search display. Results indicated that eye movements were made, in error, more often to encoded distractors than to baseline distractors, and that this effect was greatest when the corresponding scene was presented prior to search. When capture did occur, participants looked longer at encoded distractors if scenes had been presented, an effect that we attribute to the representational match between a retrieved associate and the identity of the encoded distractor in the search display. In addition, the presence of a scene resulted in slower saccade deployment when participants made first saccades to targets, as instructed. Experiments 4 and 5 suggest that this slowdown may be due to the relatively rare and, therefore, surprising appearance of visual stimulus information prior to search. Collectively, results suggest that information encoded into episodic memory can capture attention, which is consistent with the recent proposal that selection history can guide attentional selection.


Source: http://dx.doi.org/10.1016/j.cognition.2020.104312

Publication Analysis

Top Keywords

capture attention: 12
prior search: 12
search: 9
long-term memory: 8
visual search: 8
presented prior: 8
search display: 8
encoded distractors: 8
encoded: 7
attention: 4

Similar Publications

Objective: The objective of this research is to enhance pneumonia detection in chest X-rays by leveraging a novel hybrid deep learning model that combines Convolutional Neural Networks (CNNs) with modified Swin Transformer blocks. This study aims to significantly improve diagnostic accuracy, reduce misclassifications, and provide a robust, deployable solution for underdeveloped regions where access to conventional diagnostics and treatment is limited.

Methods: The study developed a hybrid model architecture integrating CNNs with modified Swin Transformer blocks to work seamlessly within the same model.


The issue of whether a salient stimulus in the visual field captures attention in a stimulus-driven manner has been debated for several decades. The attentional window account proposed to resolve this issue by claiming that a salient stimulus captures attention and interferes with target processing only when an attentional window is set wide enough to encompass both the target and the salient distractor. By contrast, when a small attentional window is serially shifted among individual stimuli to find a target, no capture is found.


An enhanced Transformer framework with incremental learning for online stock price prediction.

PLoS One

January 2025

Harvard extension school, Harvard University, Boston, Massachusetts, United States of America.

To address the limitations of existing stock price prediction models in handling real-time data streams (poor scalability, declining predictive performance as data distributions shift, and difficulty in accurately forecasting non-stationary stock prices), this paper proposes an incremental learning-based enhanced Transformer framework (IL-ETransformer) for online stock price prediction. This method leverages a multi-head self-attention mechanism to explore the complex temporal dependencies between stock prices and feature factors. Additionally, a continual normalization mechanism is employed to stabilize the data stream, enhancing the model's adaptability to dynamic changes.
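The multi-head self-attention the abstract refers to can be sketched in plain NumPy. This is only an illustrative toy of the general mechanism, not the IL-ETransformer implementation: the random projection matrices stand in for learned parameters, and the window size, feature dimension, and function name are assumptions.

```python
import numpy as np

def multi_head_self_attention(x, num_heads, rng):
    """Toy multi-head self-attention over a time-series window.

    x: (seq_len, d_model) array of per-step features.
    Returns an array of the same shape.
    """
    seq_len, d_model = x.shape
    assert d_model % num_heads == 0
    d_head = d_model // num_heads

    # Random projections stand in for learned Q/K/V weight matrices.
    w_q = rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
    w_k = rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
    w_v = rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)

    # Split each projection into heads: (num_heads, seq_len, d_head).
    def split(t):
        return t.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    q, k, v = split(x @ w_q), split(x @ w_k), split(x @ w_v)

    # Scaled dot-product attention, computed per head.
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over time steps
    out = weights @ v                               # (num_heads, seq_len, d_head)

    # Concatenate heads back to (seq_len, d_model).
    return out.transpose(1, 0, 2).reshape(seq_len, d_model)

rng = np.random.default_rng(0)
window = rng.standard_normal((16, 32))  # 16 time steps, 32 features per step
attended = multi_head_self_attention(window, num_heads=4, rng=rng)
print(attended.shape)  # (16, 32)
```

Each output step is a weighted mixture of every step in the window, which is what lets a Transformer model long-range temporal dependencies that a fixed-size convolution or recurrence handles less directly.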


Temporal constructs are central to reproduction and kinship, as epitomised by the pervasive concept of the biological clock within public imaginaries. While queer scholarship has problematised linear models of kinship and reproductive temporality, the specific temporalities associated with donor-conceived families have received less scholarly attention, despite the increasing prevalence of these family structures. In this article, we explore the question: how does donor conception reconfigure temporal logics?


Synergistic integration of brain networks and time-frequency multi-view feature for sleep stage classification.

Health Inf Sci Syst

December 2025

Faculty of Information Engineering and Automation, Kunming University of Science and Technology, No.727 Jingming South Road, Kunming, 650504 Yunnan China.

For diagnosing mental health conditions and assessing sleep quality, the classification of sleep stages is essential. Although deep learning-based methods are effective in this field, they often fail to capture sufficient features or adequately synthesize information from various sources. For the purpose of improving the accuracy of sleep stage classification, our methodology includes extracting a diverse array of features from polysomnography signals, along with their transformed graph and time-frequency representations.

