Introduction: Weakened facial movements are an early symptom of amyotrophic lateral sclerosis (ALS). ALS is generally detected from changes in facial expressions, but large differences between individuals can make this assessment subjective. We propose a computerized analysis of facial expression videos to detect ALS.

Methods: This study investigated facial action units extracted from facial expression videos to differentiate ALS patients from healthy individuals, identifying the specific action units and facial expressions that yield the best classification performance. We used the Toronto NeuroFace Dataset, which includes nine facial expression tasks performed by healthy individuals and ALS patients.

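As an illustration of this kind of pipeline (a minimal sketch, not the authors' exact method), the code below assumes per-frame action-unit intensities have already been extracted with a tool such as OpenFace, which writes per-video CSVs containing AU intensity columns (e.g., AU12_r). The chosen AU subset, file layout, summary features, and classifier are all illustrative assumptions.

```python
# Illustrative sketch (not the authors' exact pipeline): classify ALS vs. healthy
# subjects from per-video action-unit (AU) features. Assumes each video was already
# processed by an AU extractor such as OpenFace, yielding one CSV per video with
# per-frame AU intensity columns (e.g., "AU12_r" for the lip-corner puller).
import glob
import numpy as np
import pandas as pd
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score

AU_COLS = ["AU12_r", "AU14_r", "AU20_r", "AU25_r"]  # assumed subset of extracted AUs

def video_features(csv_path):
    """Summarize per-frame AU intensities into one feature vector per video."""
    df = pd.read_csv(csv_path)
    feats = []
    for col in AU_COLS:
        x = df[col].to_numpy()
        feats += [x.mean(), x.std(), x.max() - x.min()]  # mean, variability, range
    return feats

# Hypothetical layout: one CSV per video, under als/ and healthy/ directories.
X, y = [], []
for label, pattern in [(1, "als/*.csv"), (0, "healthy/*.csv")]:
    for path in sorted(glob.glob(pattern)):
        X.append(video_features(path))
        y.append(label)
X, y = np.array(X), np.array(y)

# Small-sample evaluation with leave-one-out cross-validation.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"Leave-one-out accuracy: {scores.mean():.2f}")
```
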
Results: The best classification accuracy, 0.91, was obtained for the "pretend to smile with tight lips" expression.

Conclusion: This pilot study shows the potential of using computerized facial expression analysis based on action units to identify facial weakness symptoms in ALS.
