We investigated the metrics and kinematics of human eye-head gaze shifts using the anti-gaze shift task. Surprisingly, no systematic difference was found between the peak gaze velocities of large pro- and anti-gaze shifts. In a follow-up experiment that equated perceived stimulus luminance across multiple eccentricities, pro-gaze shifts were consistently faster than anti-gaze shifts. In both experiments, we observed no head-only errors, in which initial head motion dissociates from gaze direction, even though many subjects generated such movements in other paradigms. These experiments confirm the influence of stimulus luminance on comparative movement velocity, and demonstrate that the behavioural set assumed in this task discourages head-only errors.
DOI: http://dx.doi.org/10.1016/j.visres.2007.11.014
J Vis
January 2025
Magic Leap Switzerland GmbH, Zürich, Switzerland.
When rendering the visual scene for near-eye head-mounted displays, accurate knowledge of the geometry of the displays, scene objects, and eyes is required for the correct generation of the binocular images. Despite possible design and calibration efforts, these quantities are subject to positional and measurement errors, resulting in some misalignment of the images projected to each eye. Previous research investigated the effects in virtual reality (VR) setups that triggered such symptoms as eye strain and nausea.
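The misalignment described above follows from simple viewing geometry: a small positional error in a rendered image point subtends an angular error at the eye that shrinks with virtual image distance. The sketch below is a back-of-the-envelope illustration of that relationship only; the function name and the numbers are ours, not taken from the article.

```python
import math

def angular_misalignment_arcmin(offset_mm: float, distance_mm: float) -> float:
    """Angular error (in arcminutes) seen by one eye when a rendered image
    point is displaced by `offset_mm` on a virtual plane `distance_mm` away.
    Small-error geometry: angle = atan(offset / distance)."""
    return math.degrees(math.atan2(offset_mm, distance_mm)) * 60.0

# Illustrative case: a 1 mm display/calibration error at a 1 m virtual
# image distance yields roughly 3.4 arcmin of image misalignment.
err = angular_misalignment_arcmin(1.0, 1000.0)
```

For a fixed physical offset, doubling the virtual image distance roughly halves the angular misalignment, which is why near-eye displays with close virtual image planes are especially sensitive to calibration error.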
Toxicol Rep
December 2024
Bioinformatics and Molecular Medicine Research Group, Dow Fly Research Lab and Stock Centre, Dow College of Biotechnology, Dow University of Health Sciences, Karachi 75280, Pakistan.
Antibiotics are the major therapeutic arsenal against bacterial infections. Yet, beneath this medical triumph lies an under-investigated challenge: the potential teratological and toxicological impacts associated with the use of antibiotics. In the present study, we explored the teratogenic potential of five commonly used antibiotics (streptomycin, metronidazole, tigecycline, doxycycline and norfloxacin) on the Drosophila melanogaster Oregon-R strain.
J Vis
September 2024
Center for Perceptual Systems, The University of Texas at Austin, Austin, TX, USA.
Most research on visual search has used simple tasks presented on a computer screen. However, in natural situations visual search almost always involves eye, head, and body movements in a three-dimensional (3D) environment. The different constraints imposed by these two types of search tasks might explain some of the discrepancies in our understanding concerning the use of memory resources and the role of contextual objects during search.
IEEE Trans Vis Comput Graph
June 2024
Human eye gaze plays a significant role in many virtual and augmented reality (VR/AR) applications, such as gaze-contingent rendering, gaze-based interaction, and eye-based activity recognition. However, prior work on gaze analysis and prediction has explored only eye-head coordination and was limited to human-object interactions. We first report a comprehensive analysis of eye-body coordination in various human-object and human-human interaction activities based on four public datasets collected in real-world (MoGaze), VR (ADT), and AR (GIMO and EgoBody) environments.
bioRxiv
November 2024
Cortical Systems & Behavior Lab, University of California San Diego, San Diego, California, USA.
Our understanding of how vision functions as primates actively navigate the real world is remarkably sparse. As most data have been limited to chaired and typically head-restrained animals, the synergistic interactions of different motor actions/plans inherent to active sensing - e.g.