Analysis of Various Facial Expressions of Horses as a Welfare Indicator Using Deep Learning.

Vet Sci

College of Veterinary Medicine, Kyungpook National University, Daegu 41566, Republic of Korea.

Published: April 2023

This study aimed to demonstrate that deep learning can be effectively used to identify various equine facial expressions as welfare indicators. A total of 749 horses (586 healthy and 163 experiencing pain) were investigated. A model was developed to recognize facial expressions from images and classify them into four categories: resting horses (RH), horses with pain (HP), horses immediately after exercise (HE), and horses during horseshoeing (HH). Normalization of equine facial posture showed that the profile view (99.45%) yielded higher accuracy than the frontal view (97.59%). The eyes-nose-ears detection model achieved an accuracy of 98.75% in training, 81.44% in validation, and 88.1% in testing, with an average accuracy of 89.43%. Overall, the average classification accuracy was high; however, the accuracy of pain classification was low. These results imply that horses may show a variety of facial expressions beyond pain alone, depending on the situation and on the degree and type of pain experienced. Furthermore, automatic pain and stress recognition would greatly enhance the identification of pain and other emotional states, thereby improving the quality of equine welfare.
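The paper does not publish code, but the final step it describes, mapping features extracted from a detected eyes-nose-ears region to one of the four categories (RH, HP, HE, HH), can be sketched as a softmax classification head. The sketch below is a minimal illustration only: the linear layer, feature dimension, and random initialization are stand-ins for the authors' trained deep network, not their actual architecture.

```python
import numpy as np

# The four categories from the study: resting, pain,
# post-exercise, and horseshoeing horses.
CLASSES = ["RH", "HP", "HE", "HH"]

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class FacialExpressionHead:
    """Toy stand-in for the paper's classifier: maps a feature
    vector (e.g. an embedding of the eyes-nose-ears region) to
    probabilities over the four expression categories."""

    def __init__(self, n_features, n_classes=len(CLASSES), seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.01, size=(n_features, n_classes))
        self.b = np.zeros(n_classes)

    def predict_proba(self, X):
        # X: (batch, n_features) -> (batch, n_classes)
        return softmax(X @ self.W + self.b)

    def predict(self, X):
        return [CLASSES[i] for i in self.predict_proba(X).argmax(axis=1)]

# Usage with three placeholder feature vectors of length 8.
head = FacialExpressionHead(n_features=8)
X = np.zeros((3, 8))
probs = head.predict_proba(X)
print(probs.shape)  # (3, 4), one probability per category
```

In the actual study, the feature vector would come from a convolutional backbone applied to the detected facial regions; only the four-way softmax decision is illustrated here.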


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10141195
DOI: http://dx.doi.org/10.3390/vetsci10040283

Publication Analysis

Top Keywords: facial expressions (16), horses (8), deep learning (8), equine facial (8), pain (8), accuracy (5), analysis facial (4), expressions (4), expressions horses (4), horses welfare (4)

Similar Publications

Recognition and classification of facial expression using artificial intelligence as a key of early detection in neurological disorders.

Rev Neurosci

January 2025

Network of Neurosurgery and Artificial Intelligence (NONAI), Universal Scientific Education and Research Network (USERN), Tehran, Iran.

The recognition and classification of facial expressions using artificial intelligence (AI) presents a promising avenue for early detection and monitoring of neurodegenerative disorders. This narrative review critically examines the current state of AI-driven facial expression analysis in the context of neurodegenerative diseases, such as Alzheimer's and Parkinson's. We discuss the potential of AI techniques, including deep learning and computer vision, to accurately interpret and categorize subtle changes in facial expressions associated with these pathological conditions.


Trust and rapport are essential to human-robot interaction, and producing emotional expressions on robots' faces is an effective way to foster them. Androids can show human-like facial expressions of basic emotions.


Amniote skulls are diverse in shape and skeletal composition, which is the basis of much adaptive diversification within this clade. Major differences in skull shape are established early in development, at a critical developmental interval spanning the initial outgrowth and fusion of the facial processes. In birds, this is orchestrated by domains of Shh and Fgf8 expression, known as the frontonasal ectodermal zone (FEZ).


Despite their high prevalence, somatoform pain disorders are often not recognized early enough, not diagnosed reliably enough and not treated appropriately. Patients often experience a high level of suffering and the feeling of not being understood. For the medical care system, the symptoms represent a diagnostic and therapeutic challenge.


In response to Covid-19, western governments introduced policies that likely resulted in a reduced variety of facial input. This study investigated how this affected neural representations of face processing: speed of face processing; face categorization (differentiating faces from houses); and emotional face processing (differentiating happy, fearful, and neutral expressions), in infants (five or ten months old) and children (three years old). We compared participants tested before (total N = 462) versus during (total N = 473) the pandemic-related policies, and used electroencephalography to record brain activity.

