This study aimed to demonstrate that deep learning can be used effectively to identify various equine facial expressions as welfare indicators. A total of 749 horses (586 healthy and 163 experiencing pain) were investigated. A model was developed to recognize facial expressions from images and classify them into four categories: resting horses (RH), horses in pain (HP), horses immediately after exercise (HE), and horses undergoing horseshoeing (HH). Normalization of equine facial posture showed that profile views (99.45%) yielded higher accuracy than frontal views (97.59%). The eyes-nose-ears detection model achieved an accuracy of 98.75% in training, 81.44% in validation, and 88.1% in testing, for an average accuracy of 89.43%. Overall, the average classification accuracy was high, but the accuracy of pain classification was low. These results imply that horses may show a variety of facial expressions beyond pain, depending on the situation and on the degree and type of pain experienced. Furthermore, automatic pain and stress recognition would greatly enhance the identification of pain and other emotional states, thereby improving the quality of equine welfare.
| Download full-text PDF | Source |
|---|---|
| http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10141195 | PMC |
| http://dx.doi.org/10.3390/vetsci10040283 | DOI Listing |
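The abstract describes a two-stage pipeline: an eyes-nose-ears region detector followed by a four-class image classifier (RH / HP / HE / HH). As a rough illustration of what the second stage could look like — this is not the authors' implementation; the ResNet-18 backbone, the preprocessing, and every other choice below are assumptions made for the sketch — here is a minimal PyTorch example:

```python
# Minimal sketch (assumptions, not the paper's code): a 4-class classifier for
# cropped horse-face images, standing in for stage 2 of the pipeline described
# in the abstract. Stage 1 (eyes-nose-ears detection) is not shown.
import torch
import torch.nn as nn
from torchvision import models, transforms

CLASSES = ["RH", "HP", "HE", "HH"]  # resting, pain, after exercise, horseshoeing

def build_classifier(num_classes: int = len(CLASSES)) -> nn.Module:
    # Assumed backbone: ImageNet-pretrained ResNet-18 with its final layer
    # replaced by a 4-way classification head.
    backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    backbone.fc = nn.Linear(backbone.fc.in_features, num_classes)
    return backbone

# Standard preprocessing for an ImageNet-pretrained backbone; in practice this
# would be applied to the face crop produced by the detection stage.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

if __name__ == "__main__":
    model = build_classifier().eval()
    dummy = torch.randn(1, 3, 224, 224)  # stand-in for a preprocessed face crop
    with torch.no_grad():
        probs = torch.softmax(model(dummy), dim=1)
    print(dict(zip(CLASSES, probs.squeeze().tolist())))
```

In a real setup the classifier would be fine-tuned on labeled face crops coming out of the detection stage, and the reported per-view accuracies suggest evaluating profile and frontal poses separately.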
Rev Neurosci
January 2025
Network of Neurosurgery and Artificial Intelligence (NONAI), Universal Scientific Education and Research Network (USERN), Tehran, Iran.
The recognition and classification of facial expressions using artificial intelligence (AI) presents a promising avenue for early detection and monitoring of neurodegenerative disorders. This narrative review critically examines the current state of AI-driven facial expression analysis in the context of neurodegenerative diseases, such as Alzheimer's and Parkinson's. We discuss the potential of AI techniques, including deep learning and computer vision, to accurately interpret and categorize subtle changes in facial expressions associated with these pathological conditions.
Sci Rep
January 2025
Guardian Robot Project, RIKEN, Kyoto, Japan.
Trust and rapport are essential for human-robot interaction, and producing emotional expressions on a robot's face is an effective way to foster them. Androids can show human-like facial expressions of basic emotions.
Commun Biol
January 2025
Department of Biology, Loyola University Chicago, Chicago, IL, USA.
Amniote skulls are diverse in shape and skeletal composition, which is the basis of much adaptive diversification within this clade. Major differences in skull shape are established early in development, at a critical developmental interval spanning the initial outgrowth and fusion of the facial processes. In birds, this is orchestrated by domains of Shh and Fgf8 expression, known as the frontonasal ectodermal zone (FEZ).
Neuroimage
January 2025
Dept. of Psychosomatic Medicine and Psychotherapy, Medical Faculty, Heinrich-Heine University Duesseldorf, Germany.
Despite their high prevalence, somatoform pain disorders are often not recognized early enough, not diagnosed reliably enough and not treated appropriately. Patients often experience a high level of suffering and the feeling of not being understood. For the medical care system, the symptoms represent a diagnostic and therapeutic challenge.
Dev Cogn Neurosci
January 2025
Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht 3584 CS, the Netherlands.
In response to Covid-19, Western governments introduced policies that likely resulted in a reduced variety of facial input. This study investigated how this affected neural representations of face processing: processing speed; face categorization (differentiating faces from houses); and emotional face processing (differentiating happy, fearful, and neutral expressions), in infants (five or ten months old) and children (three years old). We compared participants tested before (total N = 462) versus during (total N = 473) the pandemic-related policies, and used electroencephalography to record brain activity.