The integration of gaze/eye tracking into virtual and augmented reality devices has unlocked new possibilities, offering a novel human-computer interaction (HCI) modality for on-device extended reality (XR). Emerging XR applications such as low-effort user authentication, mental-health diagnosis, and foveated rendering demand real-time eye tracking at high frequencies, a capability that current solutions struggle to deliver. To address this challenge, we present EX-Gaze, an event-based real-time eye-tracking system designed for on-device extended reality. EX-Gaze achieves a high tracking frequency of 2 kHz with solid accuracy and low tracking latency. This exceptional tracking frequency is achieved through the use of event cameras, cutting-edge bio-inspired vision sensors that deliver event-stream output at high temporal resolution. We have developed a lightweight tracking framework that enables real-time pupil-region localization and tracking on mobile devices. To effectively exploit the sparse nature of event streams, we introduce the sparse event-patch representation and the corresponding sparse event-patch transformer as key components for reducing computation time. Implemented on the Jetson Orin Nano, a low-cost, small-sized mobile device whose hybrid GPU and CPU can run multiple deep neural networks in parallel, EX-Gaze maximizes the available compute through careful scheduling and offloading of work between the GPU and CPU, enabling real-time tracking at 2 kHz without accumulating latency. Evaluation on public datasets demonstrates that EX-Gaze outperforms other event-based eye-tracking methods by striking the best balance between accuracy and efficiency on mobile devices. These results highlight EX-Gaze's potential as a groundbreaking technology for XR applications that require high-frequency, real-time eye tracking. The code is available at https://github.com/Ningreka/EX-Gaze.
DOI: http://dx.doi.org/10.1109/TVCG.2025.3549565
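The EX-Gaze abstract above refers to a sparse event-patch representation feeding a transformer. The sketch below only illustrates that general idea: bin an event stream into a coarse patch grid, keep the non-empty ("sparse") patches, and run a small transformer over those patch tokens. The sensor resolution, patch size, per-patch features, and network shapes are assumptions for illustration, not the EX-Gaze implementation.

```python
# Minimal illustrative sketch (assumed parameters, not the EX-Gaze method).
import numpy as np
import torch
import torch.nn as nn

SENSOR_W, SENSOR_H = 320, 240   # assumed event-camera resolution
PATCH = 16                      # assumed patch size in pixels
GRID_W = SENSOR_W // PATCH

def sparse_event_patches(events: np.ndarray):
    """events: (N, 3) int array of (x, y, polarity).
    Returns grid coordinates of non-empty patches and a 2-channel
    (on/off polarity) event-count feature per patch."""
    keys = (events[:, 1] // PATCH) * GRID_W + (events[:, 0] // PATCH)
    uniq, inv = np.unique(keys, return_inverse=True)
    feats = np.zeros((len(uniq), 2), dtype=np.float32)
    np.add.at(feats, (inv, (events[:, 2] > 0).astype(int)), 1.0)
    # grid (x, y) of each kept patch; a real system would use these
    # for positional encodings of the tokens
    coords = np.stack([uniq % GRID_W, uniq // GRID_W], axis=1)
    return coords, feats

# Toy transformer over the sparse patch tokens (illustration only, untrained).
embed = nn.Linear(2, 64)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True),
    num_layers=2,
)
head = nn.Linear(64, 2)  # regress an (x, y) pupil-center estimate

rng = np.random.default_rng(0)
events = np.column_stack([
    rng.integers(0, SENSOR_W, 500),  # x
    rng.integers(0, SENSOR_H, 500),  # y
    rng.integers(0, 2, 500),         # polarity
])
coords, feats = sparse_event_patches(events)
tokens = embed(torch.from_numpy(feats)).unsqueeze(0)  # (1, num_patches, 64)
pupil_xy = head(encoder(tokens).mean(dim=1))          # (1, 2)
print(coords.shape, pupil_xy.shape)
```

Keeping only the non-empty patches is what shortens the token sequence when eye motion is sparse, which is presumably where the computational savings the abstract claims would come from.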
Wearable Technol
February 2025
Department of Human Centered Design, Cornell University, Ithaca, NY, USA.
Real-time measurement of head rotation, a primary human body movement, offers potential advantages in rehabilitating head or neck motor disorders, promoting seamless human-robot interaction, and tracking the lateral glance of children with autism spectrum disorder for effective intervention. However, existing options such as cameras capturing the entire face or skin-attached sensors have limitations concerning privacy, safety, and/or usability. This research introduces a novel method that employs a battery-free, RFID tag-based wearable sensor for monitoring head orientation as a substitute for existing options such as cameras.
Wearable Technol
March 2025
Department of Mechanical Engineering, Northwestern University, Evanston, IL, USA.
Eye gaze tracking is increasingly popular due to improved technology and availability. In the domain of assistive device control, however, eye gaze tracking is often used in discrete ways (e.g.
Cogn Neurodyn
December 2025
Department of Electronic and Information Engineering, Tokyo University of Agriculture and Technology, Koganei-shi, Tokyo, 184-8588 Japan.
Face masks became a part of everyday life during the SARS-CoV-2 pandemic. Previous studies showed that face cognition relies on holistic face processing and that the absence of face features can lower recognition ability. This contrasts with experience during the pandemic, when people could correctly recognize faces even though masks covered part of the face.
J Psychiatr Res
February 2025
Department of Psychiatry, Dalhousie University, Halifax, NS, Canada.
While attentional biases towards negative stimuli have previously been linked to the development and maintenance of anxiety disorders, a current limitation of this research is its reliance on static images as stimuli, which cannot adequately capture the dynamic nature of real-life interactions. Since attentional biases in those with elevated anxiety remain understudied with more naturalistic stimuli, such as dynamic social videos, the purpose of this exploratory study was to use novel dynamic stimuli and modern eye-tracking equipment to further investigate negative attentional biases in anxious female emerging adults. Non-clinical participants (N = 62; mean age = 20.
eNeuro
March 2025
Cognitive, Linguistic, and Psychological Sciences, Brown University, Providence, RI, USA.
Anterior-posterior interactions in the alpha band (8-12 Hz) have been implicated in a variety of functions including perception, attention, and working memory. The underlying neural communication can be flexibly controlled by adjusting phase relations when activities across anterior-posterior regions oscillate at a matched frequency. We thus investigated how alpha oscillation frequencies spontaneously converged along anterior-posterior regions by tracking oscillatory EEG activity while participants rested.