This article addresses 2 questions that arise from the finding that visual scenes are first parsed into visual features: (a) the accumulation of location information about objects during their recognition and (b) the mechanism for the binding of the visual features. The first 2 experiments demonstrated that when 2 colored letters were presented outside the initial focus of attention, illusory conjunctions between the color of one letter and the shape of the other were formed only if the letters were less than 1 degree apart. Separation greater than 2 degrees resulted in fewer conjunction errors than expected by chance. Experiments 3 and 4 showed that inside the spread of attention, illusory conjunctions between the 2 letters can occur regardless of the distance between them. In addition, these experiments demonstrated that the span of attention can expand or shrink like a spotlight. The results suggest that features inside the focus of attention are integrated by an expandable focal attention mechanism that conjoins all features that appear inside its focus. Visual features outside the focus of attention may be registered with coarse location information prior to their integration. Alternatively, a quick and imprecise shift of attention to the periphery may lead to illusory conjunctions among adjacent stimuli.

Source: http://dx.doi.org/10.1037//0096-1523.15.4.650


