Reliable detection of ordinary facial expressions (e.g. smiles), despite variability among individuals and in face appearance, is an important step toward realizing perceptual user interfaces with autonomous perception of persons. We describe a rule-based algorithm for robust facial expression recognition, combined with robust face detection using a convolutional neural network. In this study, we address subject independence as well as translation, rotation, and scale invariance in facial expression recognition. The results show reliable detection of smiles, with a recognition rate of 97.6% on 5,600 still images of more than 10 subjects. The proposed algorithm can discriminate smiling from talking based on a saliency score obtained by voting over visual cues. To the best of our knowledge, it is the first facial expression recognition model that combines subject independence with robustness to variability in facial appearance.
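The abstract's cue-voting idea can be sketched as follows. This is a minimal illustration only: the cue names, weights, and threshold below are hypothetical and are not the paper's actual parameters.

```python
# Minimal sketch of a cue-voting saliency score for smile detection.
# Cue names, weights, and the decision threshold are hypothetical
# illustrations, not values from the paper.

def saliency_score(cues):
    """Sum weighted votes from the visual cues detected in a frame."""
    weights = {
        "lip_corner_raise": 2.0,   # strong smile evidence
        "cheek_raise": 1.5,
        "mouth_open": -1.0,        # more typical of talking than smiling
        "teeth_visible": 0.5,
    }
    return sum(weights.get(cue, 0.0) for cue in cues)

def classify(cues, threshold=2.0):
    """Label a frame 'smile' when the vote total clears the threshold."""
    return "smile" if saliency_score(cues) >= threshold else "not_smile"
```

Because talking tends to fire cues with negative or weak weights while smiling fires strongly positive ones, the aggregate score separates the two classes even when individual cues are noisy.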
DOI: http://dx.doi.org/10.1016/S0893-6080(03)00115-1
The current state of mental health treatment for major depressive disorder leaves vast numbers of individuals with first-line therapies that are ineffective or burdened with undesirable side effects. One major obstacle is that distinct pathologies may currently be diagnosed as the same disease and prescribed the same treatments. The key to developing broadly effective antidepressants is first to identify a strategy for differentiating between these heterogeneous conditions.
World J Biol Psychiatry
January 2025
P1vital, Wallingford, UK.
Objectives: While neuropsychological effects of conventional antidepressants are well-documented, more research is needed for rapid-acting antidepressants. This study examines the effects of esketamine on emotion processing and cognitive functioning, both acutely and sub-chronically.
Methods: Eighteen treatment-resistant depression (TRD) patients received repeated intravenous esketamine infusions.
Behav Res Methods
January 2025
Department of Psychology, University of Bath, Claverton Down, Bath, BA2 7AY, UK.
Measuring attention and engagement is essential for understanding a wide range of psychological phenomena. Advances in technology have made it possible to measure real-time attention to naturalistic stimuli, providing ecologically valid insight into temporal dynamics. We developed a research protocol called Trace, which records anonymous facial landmarks, expressions, and patterns of movement associated with engagement in screen-based media.
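A protocol like the one described records timestamped, anonymized landmark frames and derives engagement measures from them. The sketch below assumes a hypothetical data layout; Trace's actual format and engagement labels are not specified in the abstract.

```python
# Hypothetical sketch of logging anonymized facial-landmark frames and
# computing a simple engagement measure. The frame structure and the
# 'attentive'/'distracted' labels are illustrative assumptions.
import time
from dataclasses import dataclass, field

@dataclass
class LandmarkFrame:
    timestamp: float
    landmarks: list      # (x, y) points only; no identifying imagery stored
    expression: str      # e.g. "attentive", "distracted"

@dataclass
class EngagementTrace:
    frames: list = field(default_factory=list)

    def record(self, landmarks, expression):
        """Append one anonymized frame with the current wall-clock time."""
        self.frames.append(LandmarkFrame(time.time(), landmarks, expression))

    def attention_ratio(self, engaged_labels=frozenset({"attentive"})):
        """Fraction of recorded frames carrying an engaged label."""
        if not self.frames:
            return 0.0
        hits = sum(f.expression in engaged_labels for f in self.frames)
        return hits / len(self.frames)
```

Storing only landmark coordinates and derived labels, rather than raw video, is what keeps such a record anonymous while still supporting temporal analysis of engagement.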
J Dent
January 2025
Clinic of General-, Special Care- and Geriatric Dentistry, Center for Dental Medicine, University of Zurich, Zurich, Switzerland. Electronic address:
Objectives: The study aimed to assess the prevalence and nature of emotional expressions in care-dependent older adults using an automated face coding (AFC) software. By examining the seven fundamental emotions, the study sought to understand how these emotions manifest and their potential implications for dental care in this population.
Methods: Fifty care-dependent older adults (mean age: 78.
J Affect Disord
January 2025
Center for Functional Neurosurgery, Department of Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China. Electronic address:
Background: Parkinson's disease (PD) is primarily characterized by motor symptoms, but patients also experience a relatively high prevalence of non-motor symptoms, including emotional and cognitive impairments. While the subthalamic nucleus (STN) is a common target for deep brain stimulation to treat motor symptoms in PD, its role in emotion processing is still under investigation. This study examines subthalamic neural oscillatory activities during facial emotion processing and their association with affective characteristics.