A human-in-the-loop system is proposed to enable collaborative manipulation tasks for persons with physical disabilities. Studies show that the subject's cognitive burden decreases as the autonomy of the assistive system increases. Our framework obtains high-level intent from the user to specify manipulation tasks. The system processes sensor input to interpret the user's environment. Augmented reality glasses provide ego-centric visual feedback of this interpretation and summarize the available robot affordances in a menu. A tongue drive system serves as the input modality for triggering a robotic arm to execute the selected tasks. Assistance experiments compare the system to Cartesian control and to state-of-the-art approaches. Our system achieves competitive results with faster completion times by simplifying the manipulation tasks.
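
The control cycle implied by the abstract (interpret the environment from sensor input, summarize affordances on the AR display, let the user select one via the tongue drive, then have the robot arm execute it autonomously) can be sketched as below. This is a minimal illustration under assumed interfaces only; every class and function name is a hypothetical placeholder, not the authors' published code.

```python
# Hypothetical sketch of the human-in-the-loop assistance cycle described above.
# All names (Affordance, perceive_environment, render_menu, etc.) are illustrative
# placeholders, not the authors' actual implementation.

from dataclasses import dataclass
from typing import List


@dataclass
class Affordance:
    """A manipulation action the robot can offer for a detected object."""
    label: str          # e.g. "pick up cup"
    target_object: str  # identifier produced by the perception step


def perceive_environment(sensor_frame) -> List[Affordance]:
    """Interpret sensor input and return the currently available affordances.
    (Placeholder: a real system would run object detection / pose estimation.)"""
    raise NotImplementedError


def render_menu(affordances: List[Affordance]) -> None:
    """Show the scene interpretation and an affordance menu on the AR glasses."""
    ...


def read_tongue_drive_selection(num_options: int) -> int:
    """Return the menu index the user chose with the tongue drive system."""
    ...


def execute_task(affordance: Affordance) -> None:
    """Trigger the robotic arm to autonomously carry out the selected task."""
    ...


def assistance_loop(sensor_stream) -> None:
    """Human-in-the-loop cycle: perceive -> display -> select -> execute."""
    for frame in sensor_stream:
        affordances = perceive_environment(frame)
        if not affordances:
            continue
        render_menu(affordances)
        choice = read_tongue_drive_selection(len(affordances))
        execute_task(affordances[choice])
```

The user only issues a high-level menu selection per task; low-level motion planning and execution are left to the robot, which is how the framework reduces cognitive burden relative to direct Cartesian control.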

Source: http://dx.doi.org/10.1109/EMBC.2018.8512668

