Visual scanning, achieved through coordinated head motion and gaze movement, supports visual information acquisition and cognitive processing and plays a critical role in common sensorimotor tasks such as driving. Eye-head coordination is therefore a key contributor to goal-directed visual scanning and to sensorimotor behavior while driving. In this paper, we investigate the two most common eye-head coordination patterns: "head motion earlier than eye movement" and "eye movement earlier than head motion". We use bidirectional transfer entropies between head motion and eye movement to establish the existence of these two patterns, and we propose a unidirectional information difference to assess which pattern predominates. We further find a significant correlation between the normalized unidirectional information difference and driving performance. This result not only indicates, from a computational perspective, the influence of eye-head coordination on driving behavior, but also validates the practical value of using transfer entropy to quantify eye-head coordination.
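
As a rough illustration of the approach described above, the sketch below estimates transfer entropy in both directions between a head-motion signal and an eye-movement signal and forms a normalized difference of the two. It is not the paper's implementation: the equal-width binning, the one-step histories, the plug-in probability estimates, the normalization by the sum of the two entropies, and the toy signals (head, eye) are all assumptions made purely for illustration.

```python
# Minimal sketch of bidirectional transfer entropy and a normalized
# unidirectional information difference. Assumptions (not from the paper):
# binned signals, history length 1, plug-in probability estimates.
import numpy as np
from collections import Counter

def transfer_entropy(source, target, bins=8):
    """Estimate T(source -> target) with 1-step histories, in bits."""
    # Discretize both signals into equal-width bins.
    s = np.digitize(source, np.histogram_bin_edges(source, bins=bins))
    t = np.digitize(target, np.histogram_bin_edges(target, bins=bins))

    # Joint counts over (target_next, target_past, source_past).
    triples = Counter(zip(t[1:], t[:-1], s[:-1]))
    pairs_tt = Counter(zip(t[1:], t[:-1]))      # (target_next, target_past)
    pairs_ts = Counter(zip(t[:-1], s[:-1]))     # (target_past, source_past)
    singles_t = Counter(t[:-1])
    n = len(t) - 1

    te = 0.0
    for (tn, tp, sp), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_ts[(tp, sp)]             # p(t_next | t_past, s_past)
        p_cond_self = pairs_tt[(tn, tp)] / singles_t[tp]  # p(t_next | t_past)
        te += p_joint * np.log2(p_cond_full / p_cond_self)
    return te

# Hypothetical head-yaw and gaze-azimuth time series (same length and rate);
# here the eye signal loosely follows the head signal with a short lag.
head = np.random.randn(5000).cumsum()
eye = np.roll(head, 3) + np.random.randn(5000)

t_h2e = transfer_entropy(head, eye)   # "head motion earlier than eye movement"
t_e2h = transfer_entropy(eye, head)   # "eye movement earlier than head motion"

# One plausible normalized unidirectional information difference.
d_norm = (t_h2e - t_e2h) / (t_h2e + t_e2h + 1e-12)
print(t_h2e, t_e2h, d_norm)
```

A positive d_norm in this sketch would suggest that head motion carries more predictive information about subsequent eye movement than the reverse; the sign convention and normalization used in the paper may differ.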


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11154336
DOI: http://dx.doi.org/10.3390/e26010003

Publication Analysis

Top Keywords

eye-head coordination (16); head motion (12); transfer entropies (8); entropies head (8); motion eye (8); visual scanning (8); unidirectional difference (8); coordination (6); head (5); driving (5)

Similar Publications

Most research on visual search has used simple tasks presented on a computer screen. However, in natural situations visual search almost always involves eye, head, and body movements in a three-dimensional (3D) environment. The different constraints imposed by these two types of search tasks might explain some of the discrepancies in our understanding concerning the use of memory resources and the role of contextual objects during search.


Human eye gaze plays a significant role in many virtual and augmented reality (VR/AR) applications, such as gaze-contingent rendering, gaze-based interaction, or eye-based activity recognition. However, prior works on gaze analysis and prediction have only explored eye-head coordination and were limited to human-object interactions. We first report a comprehensive analysis of eye-body coordination in various human-object and human-human interaction activities based on four public datasets collected in real-world (MoGaze), VR (ADT), and AR (GIMO and EgoBody) environments.


Visual motion information plays an important role in the control of movements in sports. Skilled ball players are thought to acquire accurate visual information by using an effective visual search strategy with eye and head movements. However, differences in catching ability and gaze movements due to sports experience and expertise have not been clarified.


The posterior parietal cortex (PPC) integrates multisensory and motor-related information for generating and updating body representations and movement plans. We used retrograde transneuronal transfer of rabies virus combined with a conventional tracer in macaque monkeys to identify direct and disynaptic pathways to the arm-related rostral medial intraparietal area (MIP), the ventral lateral intraparietal area (LIPv), belonging to the parietal eye field, and the pursuit-related lateral subdivision of the medial superior temporal area (MSTl). We found that these areas receive major disynaptic pathways via the thalamus from the nucleus of the optic tract (NOT) and the superior colliculus (SC), mainly ipsilaterally.


Motor "laziness" constrains fixation selection in real-world tasks.

Proc Natl Acad Sci U S A

March 2024

Reality Labs Research, Meta Platforms Inc., Redmond, WA 98052.

Humans coordinate their eye, head, and body movements to gather information from a dynamic environment while maximizing reward and minimizing biomechanical and energetic costs. However, such natural behavior is not possible in traditional experiments employing head/body restraints and artificial, static stimuli. Therefore, it is unclear to what extent mechanisms of fixation selection discovered in lab studies, such as inhibition-of-return (IOR), influence everyday behavior.
