In recent years, advances in Artificial Intelligence (AI) have played an important role in human well-being, in particular by enabling novel forms of human-computer interaction for people with disabilities. In this paper, we propose an sEMG-controlled 3D game that leverages a deep learning-based architecture for real-time gesture recognition. The 3D game experience developed in this study focuses on rehabilitation exercises, allowing individuals with certain disabilities to control the game using low-cost sEMG sensors. For this purpose, we acquired a novel dataset of seven gestures with the Myo armband device and used it to train the proposed deep learning model. The captured signals were fed into a Conv-GRU architecture to classify the gestures. Further, we ran a live system with different participants and analyzed the neural network's hand-gesture classifications. Finally, we evaluated the system over 20 rounds with new participants and analyzed the results in a user study.
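For readers unfamiliar with the Conv-GRU pattern mentioned above, the sketch below illustrates one plausible form such a classifier could take. It is not the authors' published model: the layer sizes, window length (a ~1 s window of the Myo armband's 8 sEMG channels sampled at 200 Hz), and the `ConvGRUClassifier` class itself are assumptions made for illustration, with only the seven-gesture output taken from the abstract.

```python
# Hypothetical Conv-GRU gesture classifier (illustrative only, not the paper's model).
# Assumed input: windows of 8 sEMG channels x 200 samples from the Myo armband.
import torch
import torch.nn as nn

class ConvGRUClassifier(nn.Module):
    def __init__(self, in_channels=8, num_classes=7, hidden_size=64):
        super().__init__()
        # 1D convolutions extract local temporal features from the raw sEMG window
        self.conv = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # GRU models how the convolutional features evolve over the window
        self.gru = nn.GRU(input_size=64, hidden_size=hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, num_classes)

    def forward(self, x):
        # x: (batch, channels=8, time=200)
        feats = self.conv(x)            # (batch, 64, time/4)
        feats = feats.permute(0, 2, 1)  # (batch, time/4, 64) for the GRU
        _, h = self.gru(feats)          # h: (1, batch, hidden_size)
        return self.fc(h.squeeze(0))    # (batch, num_classes) logits

# Usage: classify a small batch of dummy 1-second sEMG windows
model = ConvGRUClassifier()
windows = torch.randn(4, 8, 200)
pred = model(windows).argmax(dim=1)     # predicted gesture index per window
```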

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7696342
DOI: http://dx.doi.org/10.3390/s20226451
