Most robots are programmed to carry out specific tasks repetitively with only minor variations. However, a growing number of applications from SMEs require robots to work alongside human coworkers. To smooth the collaborative task flow and improve collaboration efficiency, the robot should infer what kind of assistance a human coworker needs and naturally take the right action at the right time. This paper proposes a prediction-based human-robot collaboration model for assembly scenarios. An embedded learning-from-demonstration technique enables the robot to understand various task descriptions and customized working preferences. A state-enhanced convolutional long short-term memory (ConvLSTM)-based framework extracts high-level spatiotemporal features from the shared workspace and predicts future actions to facilitate fluent task transitions. This model allows the robot to adapt to predicted human actions and to provide proactive assistance during collaboration. We applied the model to a seat-assembly experiment for a scale-model vehicle: it infers a human worker's intentions, predicts the coworker's future actions, and supplies assembly parts accordingly. Compared with state-of-the-art methods without prediction awareness, the proposed framework yields higher smoothness, shorter idle times, and accommodates more working styles.
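The core of a ConvLSTM is an LSTM cell whose gate computations are convolutions rather than dense matrix products, so the hidden and cell states retain the spatial layout of the workspace frames. The sketch below is a minimal single-channel ConvLSTM cell in NumPy, written to illustrate the gating mechanism only; it is not the paper's implementation, and all kernel sizes, initializations, and names (`ConvLSTMCell`, `conv2d_same`) are this sketch's own assumptions.

```python
import numpy as np

def conv2d_same(x, w):
    """'Same'-padded 2D convolution for one channel.
    x: (H, W) frame, w: (k, k) kernel with odd k."""
    k = w.shape[0]
    p = k // 2
    xp = np.pad(x, p)
    H, W = x.shape
    out = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(xp[i:i + k, j:j + k] * w)
    return out

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ConvLSTMCell:
    """Minimal single-channel ConvLSTM cell: the four gates
    (input, forget, output, candidate) are convolutions over the
    current frame x and the previous hidden state h."""
    def __init__(self, k=3, seed=0):
        rng = np.random.default_rng(seed)
        self.Wx = rng.normal(0.0, 0.1, size=(4, k, k))  # input-path kernels
        self.Wh = rng.normal(0.0, 0.1, size=(4, k, k))  # hidden-path kernels
        self.b = np.zeros(4)                            # per-gate biases

    def step(self, x, h, c):
        # One pre-activation map per gate.
        z = [conv2d_same(x, self.Wx[g]) + conv2d_same(h, self.Wh[g]) + self.b[g]
             for g in range(4)]
        i, f, o = sigmoid(z[0]), sigmoid(z[1]), sigmoid(z[2])
        g = np.tanh(z[3])
        c_next = f * c + i * g         # cell state keeps the spatial grid
        h_next = o * np.tanh(c_next)   # hidden state = gated spatial memory
        return h_next, c_next
```

Running `step` over a sequence of workspace frames accumulates spatiotemporal context in `h` and `c`; a prediction head (e.g., a classifier over the final hidden state) would then map that context to the next human action, which is the role the state-enhanced framework plays in the paper.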


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9185262
DOI: http://dx.doi.org/10.3390/s22114279

Publication Analysis

Top Keywords (frequency)
- prediction-based human-robot (8)
- human-robot collaboration (8)
- learning demonstration (8)
- future actions (8)
- collaboration (5)
- model (5)
- assembly (4)
- collaboration assembly (4)
- assembly tasks (4)
- tasks learning (4)

