Near-scale environments, like work desks, restaurant place settings or lab benches, are the interface of our hand-based interactions with the world. How are our conceptual representations of these environments organized? What properties distinguish among reachspaces, and why? We obtained 1.25 million similarity judgments on 990 reachspace images, and generated a 30-dimensional embedding which accurately predicts these judgments.
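The abstract above describes deriving a low-dimensional embedding whose geometry predicts pairwise similarity judgments. As a rough illustration of that general idea only (a classical-MDS-style factorization of a toy similarity matrix, not the authors' actual method or data), per-item coordinates can be recovered so that their dot products reproduce the judged similarities:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for an item-by-item similarity matrix averaged over many pairwise
# judgments. (The study collected ~1.25 million judgments over 990 images; this
# eigendecomposition sketch is illustrative, not the authors' procedure.)
n_items, n_dims = 12, 3
latent = rng.random((n_items, n_dims))
sim = latent @ latent.T          # symmetric, positive semi-definite by construction
sim /= sim.max()                 # normalize similarities into [0, 1]

# Embed: the top-k eigenvectors of the similarity matrix, scaled by the square
# root of their eigenvalues, give coordinates whose dot products approximate
# the similarities (exactly, here, because sim has rank n_dims).
vals, vecs = np.linalg.eigh(sim)             # eigenvalues in ascending order
top = np.argsort(vals)[::-1][:n_dims]        # indices of the k largest eigenvalues
embedding = vecs[:, top] * np.sqrt(np.clip(vals[top], 0.0, None))

recon_error = float(np.max(np.abs(embedding @ embedding.T - sim)))
```

In practice the published embedding was fit to behavioral judgments rather than read off an exact low-rank matrix, so a fitted model would minimize prediction error instead of reconstructing similarities perfectly; the factorization above just shows how coordinates and similarities relate.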
Near-scale spaces are a key component of our visual experience: Whether for work or for leisure, we spend much of our days immersed in, and acting upon, the world within reach. Here, we present the Reachspace Database, a novel stimulus set containing over 10,000 images depicting first-person, motor-relevant views at an approximately reachable scale (hereafter "reachspaces"), which reflect the visual input that an agent would experience while performing a task with her hands. These images are divided into over 350 categories, based on a taxonomy we developed, which captures information relating to the identity of each reachspace, including the broader setting and room it is found in, the locus of interaction (e.
Proc Natl Acad Sci U S A
November 2020
Space-related processing recruits a network of brain regions separate from those recruited in object processing. This dissociation has largely been explored by contrasting views of navigable-scale spaces with views of close-up, isolated objects. However, in naturalistic visual experience, we encounter spaces intermediate to these extremes, like the tops of desks and kitchen counters, which are not navigable but typically contain multiple objects.
Searching for a "Q" among "O"s is easier than the opposite search (Treisman & Gormican in Psychological Review, 95, 15-48, 1988). In many cases, such "search asymmetries" occur because it is easier to search when a target is defined by the presence of a feature (i.e.
J Exp Psychol Hum Percept Perform
June 2019
In everyday experience, we interact with objects and we navigate through space. Extensive research has revealed that these visual behaviors are mediated by separable object-based and scene-based processing mechanisms in the mind and brain. However, we also frequently view near-scale spaces, for example, when sitting at the breakfast table or preparing a meal.
Acta Psychol (Amst)
September 2016
Previous work has shown that recall of objects that are incidentally encountered as targets in visual search is better than recall of objects that have been intentionally memorized (Draschkow, Wolfe, & Võ, 2014). However, this counter-intuitive result is not seen when these tasks are performed with non-scene stimuli. The goal of the current paper is to determine what features of search in a scene contribute to higher recall rates when compared to a memorization task.
J Exp Psychol Hum Percept Perform
December 2015
In "hybrid" search tasks, observers hold multiple possible targets in memory while searching for those targets among distractor items in visual displays. Wolfe (2012) found that, if the target set is held constant over a block of trials, reaction times (RTs) in such tasks were a linear function of the number of items in the visual display and a linear function of the log of the number of items held in memory. However, in such tasks, the targets can become far more familiar than the distractors.