Do dogs preferentially encode the identity of the target object or the location of others' actions?

Anim Cogn

Comparative Cognition, Messerli Research Institute, University of Veterinary Medicine of Vienna, Medical University of Vienna and University of Vienna, Veterinärplatz 1, Vienna, 1210, Austria.

Published: March 2024

The ability to make sense of and predict others' actions is foundational for many socio-cognitive abilities. Dogs (Canis familiaris) constitute interesting comparative models for the study of action perception due to their marked sensitivity to human actions. We tested companion dogs (N = 21) in two screen-based eye-tracking experiments, adopting a task previously used with human infants and apes, to assess which aspects of an agent's action dogs consider relevant to the agent's underlying intentions. An agent was shown repeatedly acting upon the same one of two objects, positioned in the same location. We then presented the objects in swapped locations and the agent approached the objects centrally (Experiment 1) or approached the old object in the new location or the new object in the old location (Experiment 2). Dogs' anticipatory fixations and looking times did not reflect an expectation that agents should have continued approaching either the same object or the same location as witnessed during the brief familiarization phase; this contrasts with some findings with infants and apes, but aligns with findings in younger infants before they have sufficient motor experience with the observed action. However, dogs' pupil dilation and latency to make an anticipatory fixation suggested that, if anything, dogs expected the agents to keep approaching the same location rather than the same object, and their looking times showed sensitivity to the animacy of the agents. We conclude that dogs, lacking motor experience with the observed actions of grasping or kicking performed by a human or inanimate agent, might interpret such actions as directed toward a specific location rather than a specific object. Future research will need to further probe the suitability of anticipatory looking as a measure of dogs' socio-cognitive abilities, given differences between the visual systems of dogs and primates.


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10980658
DOI: http://dx.doi.org/10.1007/s10071-024-01870-w

Similar Publications

The gut-brain axis is a bidirectional communication pathway that modulates cognitive function. A dysfunctional gut-brain axis has been associated with cognitive impairments during aging. Therefore, we propose evaluating whether modulation of the gut microbiota through fecal microbiota transplantation (FMT) from young-trained donors (YT) to middle-aged or aged mice could enhance brain function and cognition in old age.


The lack of thematic continuity in dreams with scene and plot discontinuities.

Sleep Adv

December 2024

EPISTEME Research and Strategy, Brooklyn, NY, USA.

A central tenet of Freudian dream theory holds that there is thematic coherence within all dreams, even those containing scene and plot discontinuities. While other models support varying degrees of dream coherence, none addresses the question of how, or even whether, coherence can be identified in dreams with such discontinuities. Here, we objectively test the ability of judges to evaluate the coherence of individual dream narratives.


Retinotopic biases in contextual feedback signals to V1 for object and scene processing.

Curr Res Neurobiol

June 2025

Centre for Cognitive Neuroimaging, School of Psychology and Neuroscience, College of Medical, Veterinary and Life Sciences, University of Glasgow, 62 Hillhead Street, Glasgow, G12 8QB, United Kingdom.

Identifying the objects embedded in natural scenes relies on recurrent processing between lower and higher visual areas. How is cortical feedback information related to objects and scenes organised in lower visual areas? The spatial organisation of cortical feedback converging in early visual cortex during object and scene processing could be retinotopically specific, as information is coded in V1, or object-centred, as it is coded in higher areas, or both. Here, we characterise object- and scene-related feedback information to V1.


Despite cultural references to the dangers of hitchhiking, particularly for sexual homicide, no published research investigates these incidents from both an offender and a crime scene perspective. Using the Sexual Homicide International Database (SHIelD), we explore lifestyle risk by comparing sexual homicide cases involving hitchhiking victims to those involving victims engaged in sex trade work. The results, based on bivariate and multivariate statistics, indicate that offenders view hitchhiking victims as opportunities for confinement without physical restraint, often engaging in sexual acts and theft.


The snub-nosed, reclining, and serene image of the fetus is commonplace in cultural representations and analyses of obstetric ultrasound. Yet following the provocation of various feminist scholars, taking the fetal sonogram as the automatic object of concern vis-à-vis ultrasound cedes ground to anti-abortionists, who deploy fetal images to argue that life begins at conception and that the unborn are rights bearing subjects who must be protected. How might feminists escape this analytical trap, where discussions of ultrasonics must always be engaged in the act of debunking? This article orients away from the problem of fetal representation by employing a method which may appear to be wildly unsuitable: media archaeology.

