Human teams can easily perform collaborative manipulation tasks. However, simultaneous manipulation of a large extended object by a robot and a human is difficult due to the inherent ambiguity in the desired motion. Our approach in this paper is to leverage data from human-human dyad experiments to determine motion intent for a physical human-robot co-manipulation task. We do this by showing that the human-human dyad data exhibits distinct torque triggers for lateral movement. As an alternative intent-estimation method, we also develop a deep neural network, trained on motion data from human-human trials, to predict future trajectories from past object motion. We then show how force and motion data can be used to determine robot control in a human-robot dyad. Finally, we compare human-human dyad performance to that of two controllers we developed for human-robot co-manipulation. We evaluate these controllers in three-degree-of-freedom planar motion where it is ambiguous whether the task involves rotation or translation.
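The abstract describes two complementary intent-estimation strategies: a torque trigger extracted from human-human dyad data, and a neural network that predicts future object trajectories from past motion. As a rough illustration of the first idea only, the sketch below thresholds the recent torque applied about the object's vertical axis; the threshold value, window length, and function name are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

# Hypothetical values chosen for illustration; the paper's actual trigger
# levels and window lengths are not reproduced here.
TORQUE_THRESHOLD_NM = 2.0   # assumed trigger level about the vertical axis
WINDOW_SAMPLES = 50         # assumed smoothing window (samples)

def lateral_intent(torque_z_history):
    """Classify recent torque about the object's vertical (z) axis as a
    lateral-motion trigger: +1 (one direction), -1 (the other), 0 (none)."""
    recent = np.asarray(torque_z_history, dtype=float)[-WINDOW_SAMPLES:]
    mean_torque = recent.mean()
    if mean_torque > TORQUE_THRESHOLD_NM:
        return 1
    if mean_torque < -TORQUE_THRESHOLD_NM:
        return -1
    return 0
```

In a controller of this kind, a sustained torque in one direction would be read as an intent cue for lateral translation rather than rotation; the learned trajectory predictor described in the abstract would serve as an alternative source of the same intent signal.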
| Download full-text PDF | Source |
| --- | --- |
| http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10894988 | PMC |
| http://dx.doi.org/10.3389/fnbot.2024.1291694 | DOI Listing |
Front Neurorobot
February 2024
Robotics and Dynamics Laboratory, Brigham Young University, Mechanical Engineering, Provo, UT, United States.
IEEE Trans Neural Syst Rehabil Eng
October 2023
When treating sensorimotor impairments, a therapist may provide physical assistance by guiding the patient's limb to teach a desired movement. In this scenario, a key aspect is the compliance of the interaction, as the therapist can either provide subtle cues or impose a movement as a demonstration. One approach to studying these interactions involves haptically connecting two individuals through robotic interfaces.
J Neuroeng Rehabil
December 2021
Department of Mechanical Engineering, McCormick School of Engineering, Northwestern University, 60208, Evanston, IL, USA.
Background: Human-human (HH) interaction mediated by machines (e.g., robots or passive sensorized devices), which we call human-machine-human (HMH) interaction, has been studied with increasing interest in the last decade.
Annu Int Conf IEEE Eng Med Biol Soc
November 2021
Dyads are pairs of collaborating humans who perform a task together while mechanically connected by a robot. As shown in several studies [1], [2], haptic interaction can benefit motor performance, such that the dyad outperforms a subject executing the task alone. These improvements are hypothesized to result from the haptic communication engaged between the subjects, which triggers internal forward models.
Annu Int Conf IEEE Eng Med Biol Soc
November 2021
Little is known about how two people physically coupled together (a dyad) can accomplish tasks. In a pilot study, we tested how healthy inexperienced and experienced dyads learn to repeatedly reach to a target and stop while challenged with a 30-degree visuomotor rotation. We employed the Pantograph investigational device, which haptically couples partners' movements while providing cursor feedback, and we measured the amount and speed of learning to test a prevailing hypothesis: dyads with no experience learn faster than an experienced person coupled with a novice.
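For context on the perturbation used in this pilot study, a visuomotor rotation remaps hand motion to cursor motion by rotating the displacement from the start point by a fixed angle. The sketch below applies such a rotation for cursor feedback; the function name and frame conventions are illustrative assumptions, not the investigational device's actual software.

```python
import numpy as np

def rotated_cursor(hand_xy, start_xy, angle_deg=30.0):
    """Return cursor feedback under a visuomotor rotation: the hand's
    displacement from the start point is rotated by angle_deg."""
    theta = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    return np.asarray(start_xy, dtype=float) + rot @ (
        np.asarray(hand_xy, dtype=float) - np.asarray(start_xy, dtype=float))
```

Under this remapping, participants must adapt their reaches by counter-rotating their hand paths so that the cursor still travels straight to the target.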