Improving prosthetic hand functionality is critical to reducing abandonment rates and improving amputees' quality of life. Techniques such as joint force estimation and gesture recognition from myoelectric signals could enable more realistic control of a prosthetic hand. To accelerate the translation of these advanced control strategies from the lab to the clinic, we created a virtual prosthetic control environment that enables rich user interaction and dexterity evaluation. The virtual environment consists of two parts: a Unity scene for rendering and user interaction, and a Python back-end that provides accurate physics simulation and communication with control algorithms. By using the built-in tracking capabilities of a virtual reality headset, the user can visualize and manipulate a virtual hand without an additional motion tracking setup. In the virtual environment, we demonstrate actuation of the prosthetic hand through streaming of decoded EMG signals, hand tracking, and a VR controller. By providing a flexible platform for investigating different control modalities, we believe our virtual environment will allow faster experimentation and further progress toward clinical translation.
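To illustrate the described architecture, the sketch below shows one way a Python back-end might stream decoded joint angles to a Unity scene. This is a minimal, hypothetical example assuming a JSON-over-UDP interface; the host, port, message format, and placeholder EMG decoder are illustrative assumptions, not the system's actual API.

import json
import socket
import time

import numpy as np

UNITY_HOST = "127.0.0.1"   # assumed address of the Unity rendering scene
UNITY_PORT = 9000          # assumed port the Unity scene listens on
NUM_JOINTS = 20            # assumed number of actuated hand joints


def decode_emg(window: np.ndarray) -> np.ndarray:
    """Placeholder EMG decoder: map a window of EMG samples to joint angles.

    A real system would run a trained regression or classification model here.
    """
    # Mean-absolute-value feature per channel, crudely scaled to a flexion angle in degrees.
    mav = np.mean(np.abs(window), axis=0)
    flexion = np.clip(mav * 90.0, 0.0, 90.0)
    return np.resize(flexion, NUM_JOINTS)


def main() -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rng = np.random.default_rng(0)
    while True:
        # Stand-in for a 100-sample, 8-channel EMG window from an acquisition device.
        emg_window = rng.standard_normal((100, 8))
        angles = decode_emg(emg_window)
        # Send joint angles as JSON; the Unity scene would apply them to the hand model.
        packet = json.dumps({"joint_angles": angles.tolist()}).encode("utf-8")
        sock.sendto(packet, (UNITY_HOST, UNITY_PORT))
        time.sleep(0.02)   # ~50 Hz update rate


if __name__ == "__main__":
    main()

In this arrangement the decoder could be swapped for hand-tracking or VR-controller input without changing the Unity-facing message format, which is one way the flexibility described above could be realized.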

Source: http://dx.doi.org/10.1109/EMBC46164.2021.9630555
