Our understanding of how vision functions as primates actively navigate the real world is remarkably sparse. Because most data come from chaired, typically head-restrained animals, the synergistic interactions among the motor actions and plans inherent to active sensing - eyes, head, posture, locomotion - and visual perception are largely unknown. To address this considerable gap in knowledge, we developed an innovative wireless head-mounted eye-tracking system for small mammals, such as marmoset monkeys. Our system performs chair-free eye recording using backpack-mounted microcontrollers. Because eye illumination and environmental lighting change continuously in natural contexts, we developed a segmentation artificial neural network that performs robust pupil tracking under these conditions. Leveraging this innovative system to investigate active vision, we demonstrate that although freely moving marmosets exhibit frequent compensatory eye movements, as do other primates including humans, the predictability of their visual behavior (gaze) is higher when the animals are freely moving than when they are head-fixed. Moreover, despite increased eye and head motion during locomotion, gaze stabilization remains steady because vestibulo-ocular reflex (VOR) gain increases during locomotion. These results demonstrate the efficient, dynamic visuo-motor mechanisms and related behaviors that enable stable, high-resolution foveal vision in primates as they explore the natural world.
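VOR gain is conventionally the ratio of compensatory eye velocity to head velocity, with a gain near 1.0 indicating near-perfect gaze stabilization. A minimal sketch of how such a gain could be estimated from paired velocity traces is below; the regression-based estimator and the synthetic data are illustrative assumptions, not the paper's actual analysis pipeline:

```python
import numpy as np

def vor_gain(head_vel, eye_vel):
    """Estimate VOR gain as the negative slope of a least-squares
    regression of eye velocity onto head velocity (both in deg/s).
    The eye counter-rotates against the head, so a perfectly
    compensatory reflex gives a slope of -1 and a gain of 1.0."""
    head_vel = np.asarray(head_vel, dtype=float)
    eye_vel = np.asarray(eye_vel, dtype=float)
    slope = np.polyfit(head_vel, eye_vel, 1)[0]
    return -slope

# Synthetic example: eye counter-rotates at 95% of head velocity
rng = np.random.default_rng(0)
head = rng.normal(0.0, 20.0, 1000)           # head velocity, deg/s
eye = -0.95 * head + rng.normal(0.0, 1.0, 1000)  # compensatory eye velocity
print(round(vor_gain(head, eye), 2))
```

Comparing such a gain between stationary and locomoting epochs is one simple way to quantify the stabilization increase the abstract reports.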

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11100783
DOI: http://dx.doi.org/10.1101/2024.05.11.593707
