Most affordable eye-tracking systems rely either on intrusive setups, such as head-mounted cameras, or on fixed cameras with infrared corneal reflections produced by illuminators. For assistive technologies, intrusive eye-tracking systems are a burden to wear for extended periods, and infrared-based solutions generally fail in many environments, particularly outdoors, or indoors when sunlight reaches the space. We therefore propose an eye-tracking solution based on state-of-the-art convolutional neural network face-alignment algorithms that is both accurate and lightweight for assistive tasks such as selecting an object to be manipulated by an assistive robotic arm. The solution uses a simple webcam to estimate gaze together with face position and pose. We achieve a much faster computation time than the current state of the art while maintaining comparable accuracy. This paves the way for accurate appearance-based gaze estimation even on mobile devices, with an average error of around 4.5° on the MPIIGaze dataset [1] and state-of-the-art average errors of 3.9° and 3.3° on the UTMultiview [2] and GazeCapture [3], [4] datasets respectively, while reducing computation time by up to 91%.
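
The abstract does not spell out the pipeline's implementation, but an appearance-based setup of this kind typically chains webcam capture, CNN face alignment, eye-region cropping, and a small regression network. The sketch below illustrates that flow under stated assumptions: GazeNet, crop_eye, and detect_landmarks are hypothetical names, and the 68-point landmark scheme and network layout are assumptions for illustration, not the paper's actual model.

    # Hedged sketch of an appearance-based gaze pipeline: webcam frame ->
    # face-alignment landmarks -> eye crop -> small CNN -> gaze angles.
    import cv2
    import numpy as np
    import torch
    import torch.nn as nn

    class GazeNet(nn.Module):
        # Hypothetical lightweight CNN regressing gaze (pitch, yaw) from a
        # grayscale eye crop; not the architecture from the paper.
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.head = nn.Linear(64, 2)  # (pitch, yaw) in radians

        def forward(self, x):
            return self.head(self.features(x).flatten(1))

    def crop_eye(gray, landmarks, indices, size=36):
        # Crop a square patch centered on the mean of the chosen landmarks.
        cx, cy = landmarks[indices].mean(axis=0).astype(int)
        half = size // 2
        patch = gray[max(cy - half, 0):cy + half, max(cx - half, 0):cx + half]
        return cv2.resize(patch, (size, size))

    # Usage sketch; detect_landmarks stands in for any CNN face-alignment
    # model returning a (68, 2) landmark array and is not implemented here.
    # ok, frame = cv2.VideoCapture(0).read()
    # gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # lm = detect_landmarks(gray)
    # eye = crop_eye(gray, lm, list(range(36, 42)))  # left eye, 68-pt scheme
    # x = torch.from_numpy(eye / 255.0).float()[None, None]
    # pitch, yaw = GazeNet()(x)[0]

The same landmarks that locate the eye crops also support head-pose estimation, which is presumably how a single webcam can serve both the gaze and the face position/pose estimates mentioned above.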


Source: http://dx.doi.org/10.1109/TNSRE.2023.3236886

Publication Analysis

Top Keywords

eye tracking (12); assistive technologies (8); tracking systems (8); computation time (8); non-intrusive real (4); time (4); real time (4); time eye (4); tracking facial (4); facial alignment (4)

Similar Publications

Background: Clinical decision support systems leveraging artificial intelligence (AI) are increasingly integrated into health care practices, including pharmacy medication verification. Communicating uncertainty in an AI prediction is viewed as an important mechanism for fostering collaboration and trust. Yet little is known about how interacting with such AI advice affects human cognition.


Navigating unfamiliar environments poses significant challenges, especially for individuals with cognitive impairments, who often struggle to maintain orientation, recall routines, and travel through new environments because of their limited cognitive capacity. Research on the visual environmental attributes that support wayfinding reveals a gap, particularly regarding individuals with mild cognitive impairment (MCI) as compared with healthy older adults.


Background: While alcohol has been shown to impair eye movements in young adults, little is known about alcohol-induced oculomotor impairment in older adults with longer histories of alcohol use. Here, we examined whether older adults with chronic alcohol use disorder (AUD) exhibit more acute tolerance than age-matched light drinkers (LD), as evidenced by less alcohol-induced oculomotor impairment and perceived impairment.

Method: Two random-order, double-blinded laboratory sessions with administration of alcohol (0.


Characterization of Optokinetic Nystagmus in Healthy Participants With a Novel Oculography Device.

Otolaryngol Head Neck Surg

January 2025

Department of Otolaryngology-Head and Neck Surgery, Massachusetts Eye and Ear, Harvard Medical School, Boston, Massachusetts, USA.

Objective: To develop a proof-of-concept smartphone-based eye-tracking algorithm to assess non-pathologic optokinetic (OKN) nystagmus in healthy participants. Current videonystagmography (VNG) is typically restricted to in-office use, and advances in portable vestibular diagnostics would yield immense public health benefits.

Study Design: Prospective cohort study.


Eye movement detection algorithms (e.g., I-VT) require the selection of thresholds to identify eye fixations and saccadic movements from gaze data.
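
For context, the velocity-threshold identification (I-VT) algorithm mentioned above reduces to a single comparison per sample: a gaze point is labeled a saccade when its point-to-point angular velocity exceeds the threshold, and a fixation otherwise. The sketch below is a minimal illustration; the 60 Hz sampling rate and 100 deg/s threshold are placeholder assumptions, not values from the article.

    # Minimal I-VT sketch: label each gaze sample as fixation or saccade by
    # its point-to-point angular velocity against a fixed threshold.
    import numpy as np

    def ivt(x, y, sample_rate_hz=60.0, velocity_threshold=100.0):
        # x, y: gaze coordinates in degrees of visual angle, one per frame.
        # Returns one label per sample: "fixation" or "saccade".
        dt = 1.0 / sample_rate_hz
        # Point-to-point angular velocity; pad so output matches input length.
        vel = np.hypot(np.diff(x), np.diff(y)) / dt
        vel = np.concatenate(([0.0], vel))
        return np.where(vel < velocity_threshold, "fixation", "saccade")

    # Example: a still gaze followed by a rapid jump splits into a fixation
    # run, a short saccade run, and another fixation run.
    x = np.array([0.0, 0.01, 0.02, 5.0, 10.0, 10.01, 10.02])
    y = np.zeros_like(x)
    print(ivt(x, y))

The choice of threshold is exactly the selection problem the snippet raises: too low and smooth pursuit or noise is mislabeled as saccades, too high and small saccades are absorbed into fixations.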

