There is growing interest in using webcams to conduct eye-tracking experiments over the internet. We assessed the performance of two webcam-based eye-tracking techniques for behavioral research: manual annotation of webcam videos and the automated WebGazer eye-tracking algorithm. We compared these methods to a traditional infrared eye-tracker and assessed their performance in both lab and web-based settings. In both the lab and the web experiment, participants completed the same battery of five tasks, selected to trigger effects of various sizes: two visual fixation tasks and three visual world tasks testing real-time (psycholinguistic) processing effects. In the lab experiment, we simultaneously collected infrared eye-tracking, manual eye-tracking, and WebGazer data; in the web experiment, we simultaneously collected manual eye-tracking and WebGazer data. We found that the two webcam-based methods are suited to capturing different types of eye-movement patterns. Manual eye-tracking, like infrared eye-tracking, detected both large and small effects; WebGazer, however, was less accurate at detecting short, subtle effects. There was no notable effect of setting for either method. We discuss the trade-offs researchers face when choosing eye-tracking methods and offer advice for conducting eye-tracking experiments over the internet.
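As a rough illustration of the kind of browser-side data collection the web experiments rely on, the TypeScript sketch below records WebGazer gaze estimates during a single trial. Only the core WebGazer calls (`setGazeListener`, `begin`, `clearGazeListener`, `end`) come from the library's public API; the trial structure, sample format, and region-of-interest check are hypothetical and not taken from the study's materials.

```typescript
// Minimal sketch of logging WebGazer gaze estimates during one trial of a
// browser-based task. Assumes webgazer.js has already been loaded via a
// <script> tag; trial identifiers and the interest-area test are hypothetical.

declare const webgazer: any; // global provided by the webgazer.js script

interface GazeSample {
  trialId: string; // hypothetical trial identifier
  t: number;       // ms since trial onset
  x: number;       // estimated gaze x in page pixels
  y: number;       // estimated gaze y in page pixels
}

const samples: GazeSample[] = [];

// Start the tracker and attach a listener that logs each gaze estimate.
function startTrial(trialId: string): void {
  const trialOnset = performance.now();
  webgazer
    .setGazeListener((data: { x: number; y: number } | null) => {
      if (data === null) return; // face/eyes not detected for this frame
      samples.push({
        trialId,
        t: performance.now() - trialOnset,
        x: data.x,
        y: data.y,
      });
    })
    .begin();
}

// Stop tracking and return the samples collected so far.
function endTrial(): GazeSample[] {
  webgazer.clearGazeListener();
  webgazer.end();
  return samples;
}

// Hypothetical post-hoc check: did a sample fall inside a rectangular
// interest area (e.g., one picture in a visual world display)?
function inRegion(
  s: GazeSample,
  box: { left: number; top: number; width: number; height: number }
): boolean {
  return (
    s.x >= box.left && s.x <= box.left + box.width &&
    s.y >= box.top && s.y <= box.top + box.height
  );
}
```

In practice, samples like these arrive at the webcam's frame rate rather than at a fixed sampling interval, which is one reason webcam-based estimates can miss short-lived effects that an infrared tracker resolves.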
| Download full-text PDF | Source |
|---|---|
| http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11627531 | PMC |
| http://dx.doi.org/10.1162/opmi_a_00171 | DOI Listing |