3D-ARM-Gaze is a public dataset providing natural arm movements together with visual and gaze information recorded while reaching for objects across a wide reachable space from a precisely controlled, comfortably seated posture. Participants picked and placed objects in various positions and orientations in a virtual environment; a dedicated procedure maximized the workspace explored while ensuring a consistent seated posture by guiding participants back to a predetermined neutral posture via visual feedback on trunk and shoulder position. These experimental settings made it possible to capture natural arm movements with high median success rates (>98% of objects reached) and minimal compensatory movements. The dataset comprises more than 2.5 million samples recorded from 20 healthy participants performing 14,000 single pick-and-place movements (700 per participant). While initially designed to explore novel prosthesis control strategies based on natural eye-hand and arm coordination, this dataset will also be useful to researchers interested in core sensorimotor control, humanoid robotics, and human-robot interaction, as well as for the development and testing of associated solutions in gaze-guided computer vision.
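The scale reported in the abstract can be sanity-checked directly: 20 participants each performing 700 movements yields the stated 14,000 movements, and with more than 2.5 million samples overall each movement spans roughly 180 recorded samples on average. The sketch below is only an illustration of this arithmetic; the variable names are assumptions, not part of the dataset's actual API.

```python
# Back-of-the-envelope check of the dataset's reported scale
# (all names here are illustrative assumptions).

N_PARTICIPANTS = 20
MOVEMENTS_PER_PARTICIPANT = 700
TOTAL_SAMPLES = 2_500_000  # lower bound: ">2.5 million samples"

total_movements = N_PARTICIPANTS * MOVEMENTS_PER_PARTICIPANT
print(total_movements)  # 14000, matching the abstract

# Average number of recorded samples per pick-and-place movement
avg_samples_per_movement = TOTAL_SAMPLES / total_movements
print(round(avg_samples_per_movement, 1))  # 178.6
```

This average is a lower bound, since the abstract states more than 2.5 million samples.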

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11364860
DOI: http://dx.doi.org/10.1038/s41597-024-03765-4

