3D-ARM-Gaze is a public dataset providing natural arm movements together with visual and gaze information recorded while reaching objects across a wide reachable space from a precisely controlled, comfortably seated posture. Participants picked and placed objects at various positions and orientations in a virtual environment; a dedicated procedure maximized the workspace explored while ensuring a consistent seated posture, guiding participants back to a predetermined neutral posture via visual feedback on trunk and shoulder position. This setup made it possible to capture natural arm movements with high success rates (median >98% of objects reached) and minimal compensatory movements. The dataset comprises more than 2.5 million samples recorded from 20 healthy participants performing 14,000 single pick-and-place movements (700 per participant). While initially designed to explore novel prosthesis control strategies based on natural eye-hand and arm coordination, this dataset will also be useful to researchers interested in core sensorimotor control, humanoid robotics, and human-robot interaction, as well as for the development and testing of gaze-guided computer vision solutions.
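For readers who want a quick sanity check of the published counts (20 participants x 700 trials = 14,000 movements), the sketch below walks a hypothetical local copy of the dataset and tallies trials per participant. The directory layout, file names, and extension used here are assumptions for illustration only; consult the dataset's own documentation for the actual format.

```python
# Minimal sketch: verify the nominal structure of a local dataset copy.
# ASSUMED layout (hypothetical): 3D-ARM-Gaze/P01/trial_0001.csv ... P20/trial_0700.csv
from pathlib import Path

N_PARTICIPANTS = 20           # reported in the paper
TRIALS_PER_PARTICIPANT = 700  # reported in the paper

def count_trials(root: Path) -> dict[str, int]:
    """Count trial files per participant directory under the assumed layout."""
    counts = {}
    for participant_dir in sorted(root.glob("P*")):
        counts[participant_dir.name] = len(list(participant_dir.glob("trial_*.csv")))
    return counts

if __name__ == "__main__":
    counts = count_trials(Path("3D-ARM-Gaze"))  # path to an assumed local copy
    total = sum(counts.values())
    print(f"participants found: {len(counts)} (expected {N_PARTICIPANTS})")
    print(f"movements found: {total} (expected {N_PARTICIPANTS * TRIALS_PER_PARTICIPANT})")
```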
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11364860
DOI: http://dx.doi.org/10.1038/s41597-024-03765-4
Sci Data
August 2024
Univ. Bordeaux, CNRS, INCIA, UMR 5287, F-33000, Bordeaux, France.