A lightweight, wearable, wireless gaze tracker with an integrated selection command source for human-computer interaction is introduced. The prototype system combines head-mounted, video-based gaze tracking with capacitive facial movement detection, enabling multimodal interaction in which the user points with gaze and makes selections with facial gestures. The system is targeted mainly at disabled people with limited use of their hands. The hardware was made wireless to remove the need to take the device off when moving away from the computer and to allow future use in more mobile contexts. The algorithms that determine eye and head orientation and map gaze direction to on-screen coordinates are presented, together with the algorithm that detects facial movements from the measured capacitance signal. Point-and-click experiments were conducted to assess the performance of the multimodal system. The results show decent performance in laboratory and office conditions. The overall point-and-click accuracy in the multimodal experiments is comparable to the accuracy reported in previous research on head-mounted, single-modality gaze tracking that does not compensate for changes in head orientation.
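
The abstract only names these algorithms without detailing them. As a minimal sketch of how such a pipeline is commonly built, the Python snippet below shows two typical building blocks: a second-order polynomial calibration that maps measured pupil positions to screen coordinates, and a baseline-relative threshold with hysteresis that turns a capacitance deflection (e.g., from a facial gesture near the electrode) into a discrete click event. All function names, parameters, and thresholds are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# --- Gaze mapping (illustrative; the paper's exact mapping is not given) ---
def fit_gaze_mapping(pupil_xy, screen_xy):
    """Least-squares fit of a 2nd-order polynomial calibration.

    pupil_xy:  (N, 2) pupil positions recorded while the user fixates
               known calibration targets.
    screen_xy: (N, 2) corresponding on-screen target coordinates.
    """
    px, py = pupil_xy[:, 0], pupil_xy[:, 1]
    # Design matrix with constant, linear, cross, and quadratic terms.
    A = np.column_stack([np.ones_like(px), px, py, px * py, px**2, py**2])
    coeffs, *_ = np.linalg.lstsq(A, screen_xy, rcond=None)
    return coeffs  # shape (6, 2): one coefficient column per screen axis

def map_gaze(coeffs, pupil_xy):
    """Apply the fitted calibration to new pupil positions."""
    px, py = pupil_xy[:, 0], pupil_xy[:, 1]
    A = np.column_stack([np.ones_like(px), px, py, px * py, px**2, py**2])
    return A @ coeffs  # (N, 2) estimated screen coordinates

# --- Selection from a capacitance signal (also an assumption) ---
def detect_clicks(signal, baseline_win=50, on_thresh=4.0, off_thresh=2.0):
    """Detect selection commands in a 1-D capacitance signal (numpy array).

    A facial movement changes the measured capacitance; a deflection from
    the running baseline above on_thresh triggers a click, and hysteresis
    (off_thresh < on_thresh) prevents one gesture from firing repeatedly.
    """
    clicks, active = [], False
    for i in range(baseline_win, len(signal)):
        baseline = signal[i - baseline_win:i].mean()
        deflection = abs(signal[i] - baseline)
        if not active and deflection > on_thresh:
            clicks.append(i)   # rising edge -> selection command
            active = True
        elif active and deflection < off_thresh:
            active = False     # gesture released, re-arm the detector
    return clicks
```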

Source: http://dx.doi.org/10.1109/TITB.2011.2158321
