Decoding micro-electrocorticographic signals by using explainable 3D convolutional neural network to predict finger movements.

J Neurosci Methods

Department of Neurological Surgery, University of Washington, Seattle, WA, USA; Center for Neurotechnology, University of Washington, Seattle, WA, USA; Department of Surgery, Seattle Children's Hospital, Seattle, WA, USA.

Published: November 2024

Background: Electroencephalography (EEG) and electrocorticography (ECoG) recordings have been used to decode finger movements from brain activity. Traditional methods relied on power changes within a single frequency band and on machine learning models that require manual feature extraction.

New Method: This study introduces a 3D convolutional neural network (3D-CNN) model to decode finger movements using ECoG data. The model employs adaptive, explainable AI (xAI) techniques to interpret the physiological relevance of brain signals. ECoG signals from epilepsy patients during awake craniotomy were processed to extract power spectral density across multiple frequency bands. These data formed a 3D matrix used to train the 3D-CNN to predict finger trajectories.
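As a rough illustration of the preprocessing described above, the sketch below computes per-band power spectral density from windowed multichannel signals and stacks the results into a 3D feature matrix (channels × bands × time windows). All parameters (channel count, sampling rate, window length, band edges) are illustrative assumptions, not values taken from the paper, and random noise stands in for real ECoG data.

```python
import numpy as np

# Hypothetical parameters (not from the paper): 16 ECoG channels,
# 1 kHz sampling, 500-sample analysis windows, 20 windows.
rng = np.random.default_rng(0)
fs = 1000
n_channels, n_windows, win_len = 16, 20, 500
ecog = rng.standard_normal((n_channels, n_windows, win_len))

# Frequency bands; the high-gamma (HG) range here is illustrative.
bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30),
         "gamma": (30, 70), "high_gamma": (70, 170)}

def band_power(segment, fs, lo, hi):
    """Mean power spectral density of each window within [lo, hi) Hz."""
    freqs = np.fft.rfftfreq(segment.shape[-1], d=1 / fs)
    psd = np.abs(np.fft.rfft(segment)) ** 2 / (fs * segment.shape[-1])
    mask = (freqs >= lo) & (freqs < hi)
    return psd[..., mask].mean(axis=-1)

# 3D feature matrix: channels x bands x time windows.
features = np.stack(
    [band_power(ecog, fs, lo, hi) for lo, hi in bands.values()], axis=1)
print(features.shape)  # (16, 5, 20)
```

A matrix of this shape is the kind of input a 3D-CNN can convolve over jointly in the spatial (channel), spectral (band), and temporal (window) dimensions.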

Results: The 3D-CNN model predicted finger movements accurately, with root-mean-square error (RMSE) values of 0.26-0.38 for single finger movements and 0.20-0.24 for combined movements. Explainable AI techniques, Grad-CAM and SHAP, identified the high gamma (HG) band as the most informative for movement prediction and revealed distinct cortical regions involved in different finger movements. These findings underscore the physiological significance of the HG band in motor control.
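For reference, the RMSE metric reported above is computed as the square root of the mean squared difference between predicted and actual trajectories. The trajectory values in this sketch are made up for illustration, not results from the study.

```python
import numpy as np

# Hypothetical predicted vs. actual normalized finger trajectories.
actual = np.array([0.1, 0.4, 0.8, 0.5, 0.2])
predicted = np.array([0.15, 0.35, 0.70, 0.55, 0.25])

# Root-mean-square error between the two trajectories.
rmse = np.sqrt(np.mean((predicted - actual) ** 2))
print(round(rmse, 3))  # 0.063
```

Lower RMSE on trajectories normalized to [0, 1] indicates closer tracking of the true finger position over time.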

Comparison With Existing Methods: The 3D-CNN model outperformed traditional machine learning approaches by effectively capturing spatial and temporal patterns in ECoG data. The use of xAI techniques provided clearer insights into the model's decision-making process, unlike the "black box" nature of standard deep learning models.

Conclusions: The proposed 3D-CNN model, combined with xAI methods, enhances the decoding accuracy of finger movements from ECoG data. This approach offers a more efficient and interpretable solution for brain-computer interface (BCI) applications, emphasizing the HG band's role in motor control.

DOI: http://dx.doi.org/10.1016/j.jneumeth.2024.110251

