Multi-Level Attention Recognition of EEG Based on Feature Selection.

Int J Environ Res Public Health

School of Communication and Information Engineering, Nanjing University of Posts and Telecommunications, No. 66, XinMofan Road, Gulou District, Nanjing 210003, China.

Published: February 2023

Because current attention-recognition studies mostly address a single attention level, this paper proposes a multi-level attention-recognition method based on feature selection. Four experimental scenarios are designed to induce high, medium, low, and non-externally directed attention states. A total of 10 features, including time-domain measures, sample entropy, and frequency-band energy ratios, are extracted from each of 10 electroencephalogram (EEG) channels. Using all extracted features, a recognition accuracy of 88.7% is achieved when classifying the four attention states with a support vector machine (SVM) classifier. Sequential forward selection (SFS) is then employed to select an optimal feature subset with high discriminating power from the original feature set. Experimental results show that the classification accuracy improves to 94.1% with the selected feature subset. In addition, the average recognition accuracy for single-subject classification improves from 90.03% to 92.00%. These results indicate the effectiveness of feature selection in improving the performance of multi-level attention-recognition tasks.
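
As a rough illustration of the kind of pipeline the abstract describes, the sketch below computes simple per-channel features (time-domain statistics and frequency-band energy ratios) and wraps an SVM in scikit-learn's forward sequential feature selector. It is not the authors' code: the sampling rate, band edges, number of selected features, and SVM settings are assumptions, and the sample-entropy feature used in the paper is omitted for brevity.

```python
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score

FS = 250                                                       # sampling rate in Hz (assumed)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # assumed band edges (Hz)

def channel_features(x):
    """Time-domain statistics and band-energy ratios for one EEG channel."""
    feats = [x.mean(), x.std(), np.abs(np.diff(x)).mean()]     # simple time-domain measures
    freqs, psd = welch(x, fs=FS, nperseg=min(len(x), 2 * FS))  # power spectral density
    total_power = psd.sum()
    for lo, hi in BANDS.values():
        band = (freqs >= lo) & (freqs < hi)
        feats.append(psd[band].sum() / total_power)            # band energy ratio
    return feats

def epoch_features(epoch):
    """Stack per-channel features for one (n_channels, n_samples) epoch."""
    return np.concatenate([channel_features(ch) for ch in epoch])

def attention_accuracy(epochs, labels, n_select=20):
    """Forward feature selection with an SVM, then cross-validated accuracy."""
    X = np.array([epoch_features(e) for e in epochs])
    svm = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    selector = SequentialFeatureSelector(svm, n_features_to_select=n_select,
                                         direction="forward", cv=5)
    X_selected = selector.fit_transform(X, labels)
    return cross_val_score(svm, X_selected, labels, cv=5).mean()
```

In a real evaluation the feature subset would be chosen inside the cross-validation loop (or on a held-out training split) to avoid optimistic bias; the sketch keeps selection and scoring as separate steps only for readability.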

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9958593
DOI: http://dx.doi.org/10.3390/ijerph20043487
