Introduction: Weakened facial movements are an early-stage symptom of amyotrophic lateral sclerosis (ALS). ALS is generally detected from changes in facial expressions, but large differences between individuals can make the diagnosis subjective. We propose a computerized analysis of facial expression videos to detect ALS.
Methods: This study investigated whether action units extracted from facial expression videos can differentiate ALS patients from healthy individuals, and identified the specific action units and facial expressions that yield the best results. We used the Toronto NeuroFace Dataset, which includes nine facial expression tasks performed by healthy individuals and ALS patients.
Results: The best classification accuracy of 0.91 was obtained for the pretend-to-smile-with-tight-lips expression.
Conclusion: This pilot study shows the potential of using computerized facial expression analysis based on action units to identify facial weakness symptoms in ALS.
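As a rough illustration of the approach described in the Methods, the sketch below classifies ALS patients versus healthy controls from per-video action-unit (AU) features using a standard scikit-learn pipeline. The input file `au_features.csv`, its column names, and the aggregation of AU intensities per video are assumptions for illustration only; they are not the authors' actual pipeline, and the AU extraction step (e.g., with an AU-estimation tool) is presumed to have been done beforehand.

```python
# Hypothetical sketch: ALS vs. healthy classification from action-unit (AU) features.
# Assumes AU intensities were extracted per frame and aggregated into one row per video
# in "au_features.csv" (file name and columns are illustrative, not from the paper).
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("au_features.csv")              # one row per video of one expression task
X = df.filter(like="AU").values                  # e.g., mean intensity of each action unit
y = (df["group"] == "ALS").astype(int).values    # 1 = ALS patient, 0 = healthy control

# Simple baseline classifier with standardized features and 5-fold cross-validation.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```

Repeating such a run separately for each of the nine expression tasks would identify which expression (e.g., the tight-lip smile reported in the Results) discriminates best.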
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11521450
DOI: http://dx.doi.org/10.1159/000540547