Visual perspective taking (VPT) is an integral part of social interactions. While the mechanisms of VPT have been extensively explored in human-human interactions, only a handful of studies have investigated the mechanisms that enable humans to take the perspective of robots. Previous work has proposed that human-like visual features trigger VPT (the mere-appearance hypothesis). In this study, we investigate the boundary conditions of the mere-appearance hypothesis in four experiments using a dot-matching task. We show that VPT is triggered not only by human-like visual features but also by a camera, whereas non-human animal-like features do not trigger VPT. We thus suggest that, beyond human-like visual features, an object associated with an implied social presence (a camera) can trigger VPT, while non-human animal-like features cannot. These findings extend earlier work on the mere-appearance hypothesis and are informative for the design of social robots.
DOI: http://dx.doi.org/10.1016/j.concog.2023.103588
Conscious Cogn
October 2023
Department of Philosophy, Ruhr-Universität Bochum, Bochum, Germany.
Cognition
July 2022
Department of Cognitive, Linguistic, and Psychological Sciences, Brown University, United States of America.
As robots rapidly enter society, how does human social cognition respond to their novel presence? Focusing on one foundational social-cognitive capacity, visual perspective taking, seven studies reveal that people spontaneously adopt a robot's unique perspective and do so with patterns of variation that mirror perspective taking toward humans. As they do with humans, people take a robot's visual perspective when it displays goal-directed actions. Moreover, perspective taking is absent when the agent lacks human appearance, increases when the agent looks highly humanlike, and persists even when the humanlike agent is perceived as eerie or as obviously lacking a mind.
Nucleic Acids Res
February 2022
Department of Chemical and Structural Biology, Weizmann Institute of Science 7610001 Rehovot, Israel.
Although the mode of action of the ribosomes, the multi-component universal protein-synthesis organelles, has been thoroughly explored, their mere appearance has remained elusive. Our earlier comparative structural studies suggested that a universal internal small-RNA pocket-like segment, which we termed the protoribosome and which is still embedded in the contemporary ribosome, is a vestige of the primordial ribosome. Herein, after constructing such pockets, we show, using the "fragment reaction" and its analysis by MALDI-TOF and LC-MS mass spectrometry, that several protoribosome constructs are indeed capable of mediating peptide-bond formation.
J Vis
August 2021
Department of Neuroscience, Psychology, Pharmacology and Child Health, University of Florence, Florence, Italy.
The perception of numerical quantities is susceptible to adaptation: after inspecting a numerous dot array for a few seconds, a subsequent dot array is grossly underestimated. In recent work we showed that the mere appearance of an additional, numerically neutral stimulus significantly reduces the magnitude of adaptation. Here we demonstrate that this reduction is likely due to underestimation of the adaptor's numerosity, caused by a change in the numerosity-related attentional resources deployed on the adapting stimulus.