The ultimate goal of machine learning-based myoelectric control is simultaneous and independent control of multiple degrees of freedom (DOFs), including wrist and digit artificial joints. For prosthetic finger control, regression-based methods are typically used to reconstruct position/velocity trajectories from surface electromyogram (EMG) signals. Unfortunately, such methods have thus far met with limited success. In this work, we propose action decoding, a paradigm-shifting approach for independent, multi-digit movement intent prediction based on multi-output, multi-class classification. At each moment in time, our algorithm decodes movement intent for each available DOF into one of three classes: open, close, or stall (i.e., no movement). Despite using a classifier as the decoder, arbitrary hand postures are possible with our approach. We analyse a public dataset previously recorded and published by us, comprising measurements from 10 able-bodied and two transradial amputee participants. We demonstrate the feasibility of using our proposed action decoding paradigm to predict movement action for all five digits as well as rotation of the thumb. We perform a systematic offline analysis by investigating the effect of various algorithmic parameters on decoding performance, such as feature selection and choice of classification algorithm and multi-output strategy. The outcomes of the offline analysis presented in this study will be used to inform the real-time implementation of our algorithm. In the future, we will further evaluate its efficacy with real-time control experiments involving upper-limb amputees.
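
For intuition only, the sketch below frames the action-decoding idea described above as one three-class problem per degree of freedom, using scikit-learn's MultiOutputClassifier. It is not the authors' pipeline: the six DOF names, the {0: stall, 1: open, 2: close} label coding, the LDA base classifier, and the random stand-ins for windowed EMG features are all illustrative assumptions.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.multioutput import MultiOutputClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical DOF layout: five digits plus thumb rotation (six outputs).
DOFS = ["thumb_flex", "thumb_rot", "index", "middle", "ring", "little"]

rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 64))              # stand-in for windowed EMG features
Y = rng.integers(0, 3, size=(2000, len(DOFS)))   # per-DOF labels: 0=stall, 1=open, 2=close

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.25, random_state=0)

# One independent three-class classifier per DOF (the simplest multi-output strategy).
decoder = MultiOutputClassifier(LinearDiscriminantAnalysis()).fit(X_tr, Y_tr)

Y_hat = decoder.predict(X_te)                    # shape: (n_windows, n_DOFs)
for d, name in enumerate(DOFS):
    print(f"{name}: per-window accuracy = {accuracy_score(Y_te[:, d], Y_hat[:, d]):.2f}")

Because each DOF is decoded independently into one of three actions at every time step, six DOFs span 3^6 combined actions per step, which is how arbitrary hand postures remain reachable even though the decoder is a classifier rather than a regressor.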

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7547112
DOI: http://dx.doi.org/10.1038/s41598-020-72574-7

Publication Analysis

Top Keywords

action decoding (12); offline analysis (12); multi-output multi-class (8); multi-class classification (8); movement intent (8); myoelectric digit (4); action (4); digit action (4); decoding (4); decoding multi-output (4)

Similar Publications

Background: Empathy is a complex behavior enabling individuals to recognize and sense the emotional situation of others. Empathy requires cognitive, emotional, and learning abilities to understand and react to the suffering of others. The current study evaluates the effect of Amyloid-Beta (Aβ), an aggregated peptide involved in Alzheimer's disease, on empathy-like behavior.

Animacy perception, the ability to discern living from non-living entities, is crucial for survival and social interaction, as it includes recognizing abstract concepts such as movement, purpose, and intentions. This process involves interpreting cues that may suggest the intentions or actions of others. It engages the temporal cortex (TC), particularly the superior temporal sulcus (STS) and the adjacent region of the inferior temporal cortex (ITC), as well as the dorsomedial prefrontal cortex (dmPFC).

Fungal lectins show differential antiproliferative activity against cancer cell lines.

Int J Biol Macromol

December 2024

BioLab, Instituto Universitario de Bio-Orgánica "Antonio González", University of La Laguna, La Laguna, Spain.

Glycosylation patterns represent an important signature of cancer cells that can be decoded by glycan-binding proteins, i.e., lectins.

Human motion similarity evaluation based on deep metric learning.

Sci Rep

December 2024

College of Sports, Beihua University, Jilin, 132000, China.

To eliminate the influence of camera viewpoint and differences in human skeleton proportions on action similarity evaluation, and to enable similarity evaluation across different viewpoints, this article proposes a method based on deep metric learning. The method trains an encoder-decoder deep neural network on a custom synthetic dataset, mapping sequences of 2D human skeletal key points extracted from motion videos into three low-dimensional dense latent spaces. Action feature vectors that are independent of camera viewpoint and skeleton structure are extracted in these spaces, and motion similarity is measured on these features, effectively removing the effects of camera viewpoint and skeleton size differences from the evaluation.
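
A minimal sketch of this embed-and-compare pattern is given below; it is not the paper's model. The GRU encoder, the embedding size, the triplet margin loss, and the dummy 17-keypoint sequences are assumptions for illustration, and the paper's three latent spaces and synthetic training data are not reproduced here.

import torch
import torch.nn as nn
import torch.nn.functional as F

class PoseSequenceEncoder(nn.Module):
    """Maps a sequence of 2D keypoints to a unit-length embedding (illustrative)."""
    def __init__(self, n_keypoints=17, embed_dim=64):
        super().__init__()
        self.gru = nn.GRU(input_size=2 * n_keypoints, hidden_size=embed_dim,
                          batch_first=True)

    def forward(self, x):                         # x: (batch, time, 2 * n_keypoints)
        _, h = self.gru(x)                        # h: (1, batch, embed_dim)
        return F.normalize(h.squeeze(0), dim=-1)  # unit-length embeddings

encoder = PoseSequenceEncoder()
triplet = nn.TripletMarginLoss(margin=0.2)        # a common deep-metric-learning objective

# Dummy anchor/positive/negative clips: 8 sequences, 30 frames, 17 keypoints (x, y).
a, p, n = (torch.randn(8, 30, 34) for _ in range(3))
loss = triplet(encoder(a), encoder(p), encoder(n))          # training signal
similarity = F.cosine_similarity(encoder(a), encoder(p))    # per-pair similarity score
print(loss.item(), similarity.shape)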

Unraveling EEG correlates of unimanual finger movements: insights from non-repetitive flexion and extension tasks.

J Neuroeng Rehabil

December 2024

Laboratory for Neuro- & Psychophysiology, Department of Neurosciences, KU Leuven, Leuven, Belgium.

Background: The loss of finger control in individuals with neuromuscular disorders significantly impacts their quality of life. Electroencephalography (EEG)-based brain-computer interfaces that actuate neuroprostheses directly via decoded motor intentions can help restore lost finger mobility. However, the extent to which finger movements exhibit distinct and decodable EEG correlates remains unresolved.
