Visual search is thought to be guided by an active visual working memory (VWM) representation of the task-relevant features, referred to as the search template. In three experiments using a probe technique, we investigated which eye movement metrics reveal which search template is active prior to search, and distinguish it from future-relevant or no-longer-relevant VWM content. Participants memorized a target color for a subsequent search task while instructed to maintain central fixation. Before the search display appeared, we briefly presented two task-irrelevant colored probe stimuli to the left and right of fixation, one of which could match the current target template. In all three experiments, participants made both more and larger eye movements toward the probe matching the target color. This bias was expressed predominantly in microsaccades, 100-250 ms after probe onset. Experiment 2 used a retro-cue technique to show that these metrics distinguish between relevant and dropped representations. Finally, Experiment 3 used a sequential task paradigm and showed that the same metrics also distinguish between current and prospective search templates. Taken together, we show how subtle eye movements track the task-relevant representations that guide selective attention prior to visual search.
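The abstract does not specify how microsaccades were detected or how the directional bias was quantified. A common approach in this literature is the Engbert and Kliegl (2003) velocity-threshold detector combined with a toward/away classification of saccade direction relative to the matching probe. The sketch below illustrates that kind of analysis; the sampling rate, threshold multiplier, minimum duration, and all function names are illustrative assumptions, not the authors' pipeline.

```python
# Minimal sketch of a microsaccade-bias analysis, assuming a velocity-threshold
# detector (Engbert & Kliegl, 2003) and a 100-250 ms post-probe window as in
# the abstract. Parameters and function names are hypothetical.
import numpy as np

FS = 1000          # sampling rate in Hz (assumed)
LAMBDA = 6         # velocity threshold multiplier (Engbert & Kliegl convention)
MIN_DUR = 6        # minimum saccade duration in samples (assumed)

def velocity(pos):
    """Smoothed velocity (deg/s) via a 5-point moving-window derivative."""
    v = np.zeros_like(pos, dtype=float)
    v[2:-2] = (pos[4:] + pos[3:-1] - pos[1:-3] - pos[:-4]) * (FS / 6.0)
    return v

def detect_microsaccades(x, y):
    """Return (onset, offset) sample indices of candidate microsaccades."""
    vx, vy = velocity(x), velocity(y)
    # Robust, median-based estimate of the velocity SD per axis.
    sx = np.sqrt(np.median(vx**2) - np.median(vx)**2)
    sy = np.sqrt(np.median(vy**2) - np.median(vy)**2)
    # Sample exceeds the elliptic velocity threshold?
    above = (vx / (LAMBDA * sx))**2 + (vy / (LAMBDA * sy))**2 > 1
    events, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= MIN_DUR:
                events.append((start, i))
            start = None
    return events

def direction_bias(x, y, probe_side, t_probe, window=(0.100, 0.250)):
    """Fraction of microsaccades in the post-probe window directed toward the
    matching probe. probe_side is -1 (left) or +1 (right); times in seconds."""
    lo, hi = (int((t_probe + w) * FS) for w in window)
    toward = total = 0
    for on, off in detect_microsaccades(x, y):
        if lo <= on < hi:
            total += 1
            if np.sign(x[off] - x[on]) == probe_side:
                toward += 1
    return toward / total if total else np.nan
```

A bias value above 0.5 in `direction_bias` would indicate more microsaccades toward the color-matching probe than away from it, consistent with the pattern the abstract reports for the active search template.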
DOI: http://dx.doi.org/10.1167/17.6.13