Although animals rarely use only one sense to communicate, few studies have investigated the use of combinations of different signals between animals and humans. This study assessed for the first time the spontaneous reactions of piglets to human pointing gestures and voice in an object-choice task with a reward. Piglets (Sus scrofa domestica) mainly use auditory signals, individually or in combination with other signals, to communicate with their conspecifics. Their wide hearing range (42 Hz to 40.5 kHz) fits the range of human vocalisations (40 Hz to 1.5 kHz), which may make them sensitive to the human voice. However, only their ability to use visual signals from humans, especially pointing gestures, has been assessed to date. The current study investigated the effects of signal type (visual, auditory, and combined visual and auditory) and piglet experience on the piglets' ability to locate a hidden food reward over successive tests. Piglets did not find the hidden reward at first presentation, regardless of the signal type given. However, they subsequently learned to use a combination of auditory and visual signals (human voice and static or dynamic pointing gestures) to successfully locate the reward in later tests. This learning may have resulted either from repeated presentations of the combination of static gestures and auditory signals over successive tests, or from the transition from static to dynamic pointing gestures over those tests. Furthermore, piglets increased their chance of locating the reward if they did not go straight to a bowl after entering the test area, or if they stared at the experimenter before visiting one. Piglets were not able to use voice direction alone, indicating that a combination of signals (pointing and voice direction) is necessary. Improving our communication with animals requires adapting to their individual sensitivity to human-given signals.
Download full-text PDF | Source
---|---
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5085045 | PMC
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0164988 | PLOS
Nat Commun
January 2025
Key Lab of Fabrication Technologies for Integrated Circuits, Institute of Microelectronics, Chinese Academy of Sciences, 100029 Beijing, China.
Visual sensors, including 3D light detection and ranging (LiDAR), neuromorphic dynamic vision sensors, and conventional frame cameras, are increasingly integrated into edge-side intelligent machines. However, their data formats are heterogeneous, which complicates system development. Moreover, conventional digital hardware is constrained by the von Neumann bottleneck and the physical limits of transistor scaling.
Front Psychol
January 2025
Department of Motor Behavior in Sports, Institute of Health Promotion and Clinical Movement Science, German Sport University Cologne, Cologne, Germany.
Introduction: Both appraisal approaches to emotion and self-regulation theory emphasize that appraising an event as conducive or detrimental to one's current goals may trigger an affective response that can be observed nonverbally. Because there may be a female advantage in the inhibition and self-regulation of emotions, we hypothesized that female, but not male, athletes regulate emotions during sports through explicit nonverbal behaviors.
Methods: All nonverbal hand movement behavior of right-handed female and male tennis athletes was recorded during competitive matches.
Front Psychol
December 2024
Institute for Logic, Language and Computation, University of Amsterdam, Amsterdam, Netherlands.
The key function of storytelling is a meeting of hearts: a resonance, in the recipient(s) of the story, of the narrator's emotion toward the story events. This paper focuses on the role of gestures in engendering emotional resonance in conversational storytelling. It asks three questions: Does story narrators' gesture expressivity increase from story onset to climax offset (RQ #1)? Does gesture expressivity predict specific electrodermal activity (EDA) responses in story participants (RQ #2)? How important is the contribution of gesture expressivity to emotional resonance compared with the contribution of other predictors of resonance (RQ #3)? Fifty-three conversational stories were annotated for a large number of variables, including Protagonist, Recency, Group composition, Group size, Sentiment, and co-occurrence with quotation.
Sensors (Basel)
January 2025
Cognitive Systems Lab, University of Bremen, 28359 Bremen, Germany.
Over recent years, automated Human Activity Recognition (HAR) has been an active area of research due to its widespread application in surveillance systems, healthcare environments, and many other domains. This has led researchers to develop coherent and robust systems that perform HAR efficiently. Although many efficient systems have been developed to date, several issues remain to be addressed.
Animals (Basel)
December 2024
Department of Animal Science, Biotechnical Faculty, University of Ljubljana, Groblje 3, 1230 Domžale, Slovenia.
Our understanding of social cognition in brachycephalic dog breeds is limited. This study focused specifically on French Bulldogs and hypothesized that a closer relationship between dog and owner would improve the dogs' understanding of nonverbal cues, particularly pointing gestures. To investigate this, we tested twenty-six dogs and their owners in a two-way object-choice test in which the familiar person pointed to the baited bowl.