Exploration of factors affecting webcam-based automated gaze coding.

Behav Res Methods

International Research Center for Neurointelligence (WPI-IRCN), The University of Tokyo Institutes for Advanced Study, 7-3-1 Hongo, Bunkyo-ku, Tokyo, 113-0033, Japan.

Published: October 2024

Online experiments have been transforming the field of behavioral research, enabling researchers to increase sample sizes, access diverse populations, lower the costs of data collection, and promote reproducibility. The field of developmental psychology increasingly exploits such online testing approaches. Since infants cannot give explicit behavioral responses, one key outcome measure is infants' gaze behavior. In the absence of automated eye-trackers in participants' homes, automatic gaze classification from webcam data would make it possible to avoid painstaking manual coding. However, the lack of a controlled experimental environment may introduce various noise factors that impede automatic face detection or gaze classification. We created an adult webcam dataset that systematically reproduced noise factors from infant webcam studies which might affect automated gaze coding accuracy. We varied participants' left-right offset, distance to the camera, facial rotation, and the direction of the lighting source. Running two state-of-the-art classification algorithms (iCatcher+ and OWLET) revealed that face detection performance was particularly affected by the lighting source, while gaze coding accuracy was consistently affected by the distance to the camera and the lighting source. Morphing participants' faces to be unidentifiable did not generally affect the results, suggesting that facial anonymization could be used when making online video data publicly available for further study and transparency. Our findings will guide improvements to study design for infant and adult participants in online experiments. Moreover, training algorithms on our dataset will improve their robustness, allowing developmental psychologists to leverage online testing more efficiently.
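
As a rough illustration of the kind of evaluation the abstract describes, the sketch below tallies per-condition face-detection rates over a folder of webcam clips and blurs detected faces as a crude stand-in for the anonymization step. It is a minimal sketch, not the authors' pipeline: the file layout and condition labels are hypothetical, OpenCV's Haar cascade stands in for the detectors inside iCatcher+ and OWLET (whose internals differ), and Gaussian blurring is a simpler alternative to the face morphing used in the paper.

    # Minimal sketch (assumptions noted above): count how often a face is
    # detected per noise condition, and blur faces as a crude anonymizer.
    import glob
    from collections import defaultdict

    import cv2

    # Standard OpenCV frontal-face Haar cascade (ships with opencv-python).
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )

    hits, totals = defaultdict(int), defaultdict(int)

    # Hypothetical layout: clips named like "p01_lighting-side.mp4", where
    # the suffix encodes the manipulated noise factor (left-right offset,
    # distance to camera, facial rotation, lighting direction).
    for path in glob.glob("clips/*.mp4"):
        condition = path.split("_")[-1].rsplit(".", 1)[0]
        cap = cv2.VideoCapture(path)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
            totals[condition] += 1
            if len(faces):
                hits[condition] += 1
                # Crude anonymization: blur each detected face region in
                # place (the paper morphs faces; blurring is only a stand-in).
                for (x, y, w, h) in faces:
                    frame[y:y + h, x:x + w] = cv2.GaussianBlur(
                        frame[y:y + h, x:x + w], (51, 51), 0
                    )
        cap.release()

    for condition in sorted(totals):
        print(f"{condition}: face detected in "
              f"{hits[condition] / totals[condition]:.1%} of frames")

A real analysis would additionally need ground-truth gaze codes to compute per-condition classification accuracy, which the detection-rate proxy above does not capture.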

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11362184
DOI: http://dx.doi.org/10.3758/s13428-024-02424-1

Publication Analysis

Top Keywords

gaze coding (12); lighting source (12); automated gaze (8); online experiments (8); online testing (8); gaze classification (8); noise factors (8); coding accuracy (8); distance camera (8); gaze (6)

Similar Publications

Directional judgments of an arrow became slower when the direction and location were incongruent in a spatial Stroop task (i.e., a standard congruency effect).

When people discuss something that they can both see, their attention becomes increasingly coupled. Previous studies have found that this coupling is temporally asymmetric (e.g., …).

Social attention in the wild - fixations to the eyes and autistic traits during a naturalistic interaction in a healthy sample.

Sci Rep

December 2024

Department of Clinical and Health Psychology, Faculty of Psychology, University of Vienna, Liebiggasse 5, 1010, Vienna, Austria.

Article Synopsis
  • Attention to social stimuli is crucial for developing social skills, but most studies have used static images or videos that don’t fully reflect real-life interactions.
  • This research utilized mobile eye-tracking during face-to-face interviews to analyze gaze behavior in 62 participants and found that those with higher autistic traits tended to gaze less frequently at the eye area.
  • The study also examined the impact of different types of interview questions on gaze patterns and discussed potential future directions and limitations of their experimental setup.

Neural integration of egocentric and allocentric visual cues in the gaze system.

J Neurophysiol

November 2024

York Centre for Vision Research and Centre for Integrative and Applied Neuroscience, York University, Toronto, Ontario, Canada.

A fundamental question in neuroscience is how the brain integrates egocentric (body-centered) and allocentric (landmark-centered) visual cues, but for many years this question was ignored in sensorimotor studies. This changed with recent behavioral experiments, but the underlying physiology of ego/allocentric integration remained largely unstudied. The specific goal of this review is to explain how prefrontal neurons integrate eye-centered and landmark-centered visual codes for optimal gaze behavior.

Eye movements in daily life occur in rapid succession and often without a predefined goal. Using a free-viewing task, we examined how fixation duration prior to a saccade correlates with visual saliency and neuronal activity in the superior colliculus (SC) at the saccade goal. Rhesus monkeys (three male) watched videos of natural, dynamic scenes while eye movements were tracked and, simultaneously, neurons were recorded in the superficial and intermediate layers of the superior colliculus (SCs and SCi, respectively), a midbrain structure closely associated with gaze, attention, and saliency coding.
