With the rapid development of virtual reality (VR) technology and the growth of the social network service (SNS) market, VR-based SNS platforms, in which 3D avatars interact with each other on behalf of their users, have been actively developed. To provide users with more immersive experiences in a metaverse, facial tracking technologies that can reproduce a user's facial gestures on their personal avatar are required. However, traditional camera-based facial tracking is generally difficult to apply to VR users because a large portion of the face is occluded by the VR head-mounted display (HMD). To address this issue, attempts have been made to recognize users' facial expressions from facial electromyogram (fEMG) signals recorded around the eyes. fEMG-based facial expression recognition (FER) requires only tiny electrodes that can be readily embedded in the HMD pad that contacts the user's facial skin. Moreover, the same electrodes can simultaneously acquire electrooculogram (EOG) signals, which can be used to track the user's eyeball movements and detect eye blinks.

In this study, we implemented an fEMG- and EOG-based FER system using ten electrodes arranged around the eyes, assuming a commercial VR HMD device. Our FER system could continuously capture various facial motions, including five different lip motions and two different eyebrow motions, from fEMG signals. Unlike previous fEMG-based FER systems that simply classified discrete expressions, the proposed system continuously projected natural facial expressions onto the 3D avatar face using machine-learning-based regression with a new concept named the virtual blend shape weight, making it unnecessary to simultaneously record fEMG and camera images for each user. An EOG-based eye tracking system for detecting eye blinks and gaze direction was also implemented using the same electrodes.

These two technologies were employed together to implement a real-time facial motion capture system, which could successfully replicate the user's facial expressions on a realistic avatar face in real time. To the best of our knowledge, the concurrent use of fEMG and EOG for facial motion capture has not been reported before.
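The abstract describes mapping continuous fEMG activity to avatar blend shape weights via machine-learning-based regression. The sketch below is an illustrative toy version of that idea, not the authors' implementation: the choice of RMS amplitude features, closed-form ridge regression, and the 10-channel/7-blend-shape dimensions (five lip motions plus two eyebrow motions) are all assumptions made here for demonstration on synthetic data.

```python
import numpy as np

def rms_features(emg_window):
    """Root-mean-square amplitude per channel for one analysis window."""
    return np.sqrt(np.mean(emg_window ** 2, axis=0))

def fit_ridge(X, Y, lam=1e-2):
    """Closed-form ridge regression: W = (X^T X + lam*I)^-1 X^T Y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)

def predict_weights(W, x):
    """Predict blend shape weights and clamp to the valid [0, 1] range."""
    return np.clip(x @ W, 0.0, 1.0)

rng = np.random.default_rng(0)
# Synthetic training data: 200 feature windows from 10 electrodes,
# regressed onto 7 blend shape weights (hypothetical dimensions).
X = rng.random((200, 10))
true_W = rng.random((10, 7)) * 0.08
Y = np.clip(X @ true_W + 0.01 * rng.standard_normal((200, 7)), 0.0, 1.0)

W = fit_ridge(X, Y)
# At runtime: featurize one raw fEMG window, then predict avatar weights.
weights = predict_weights(W, rms_features(rng.standard_normal((256, 10))))
print(weights.shape)  # prints (7,)
```

In a real pipeline, the regression targets would come from the virtual blend shape weight concept the paper introduces, which is precisely what removes the need to record paired fEMG and camera data per user.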


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10099104
DOI: http://dx.doi.org/10.3390/s23073580

Publication Analysis

Top Keywords

facial expressions (16); facial (15); facial motion (12); motion capture (12); user's facial (12); FER system (12); capture system (8); based facial (8); facial electromyogram (8); virtual reality (8)

Similar Publications

Introduction: Systemic lupus erythematosus (SLE) is a chronic inflammatory autoimmune disease that affects various body systems, including the skin and facial features. Estrogen promotes lupus in human and mouse models of SLE. Here, we conducted an in vivo study to investigate the effects of two estrogen receptors (ERα and ERβ) and platelet-activating factor acetylhydrolase (PAF-AH) on the symptoms of SLE.


Emotion perception is a fundamental aspect of our lives because others' emotions may provide important information about their reactions, attitudes, intentions, and behavior. Following the seminal work of Ekman, much of the research on emotion perception has focused on facial expressions. Recent evidence suggests, however, that facial expressions may be more ambiguous than previously assumed and that context also plays an important role in deciphering the emotional states of others.


There has been an increased interest in standardized approaches to coding facial movement in mammals. Such approaches include Facial Action Coding Systems (FACS), where individuals are trained to identify discrete facial muscle movements that combine to create a facial configuration. Some studies have utilized FACS to analyze facial signaling, recording the quantity of morphologically distinct facial signals a species can generate.


This study examined the effects of treadmill running (TR) regimens on craniofacial pain- and anxiety-like behaviors, as well as their effects on neural changes in specific brain regions of male mice subjected to repeated social defeat stress (SDS) for 10 days. Behavioral and immunohistochemical experiments were conducted to evaluate the impact of TR regimens on these SDS-related behaviors, as well as on epigenetic and neural activity markers in the anterior cingulate cortex (ACC), insular cortex (IC), rostral ventromedial medulla (RVM), and cervical spinal dorsal horn (C2). Behavioral responses were quantified using multiple tests, while immunohistochemistry measured histone H3 acetylation, histone deacetylases (HDAC1, HDAC2), and neural activity markers (FosB and phosphorylated cAMP response element-binding protein (pCREB)).


Craniofacial development gives rise to the complex structures of the face and involves the interplay of diverse cell types. Despite its importance, our understanding of human-specific craniofacial developmental mechanisms and their genetic underpinnings remains limited. Here, we present a comprehensive single-nucleus RNA sequencing (snRNA-seq) atlas of human craniofacial development from craniofacial tissues of 24 embryos that span six key time points during the embryonic period (4-8 post-conception weeks).

