3D-ARM-Gaze is a public dataset providing natural arm movements together with visual and gaze information recorded while participants reached for objects across a wide reachable space from a precisely controlled, comfortably seated posture. Participants picked and placed objects at various positions and orientations in a virtual environment; a dedicated procedure maximized the workspace explored while ensuring a consistent seated posture by guiding participants back to a predetermined neutral posture via visual feedback on trunk and shoulder position. These experimental settings enabled the capture of natural arm movements with high median success rates (>98% of objects reached) and minimal compensatory movements.