Tactual exploration of objects produces specific patterns in the human brain, so objects can be recognized by analyzing the brain signals recorded during tactile exploration. The present work analyzes EEG signals online to recognize embossed texts explored by touch. EEG signals are acquired from the parietal region over the somatosensory cortex of blindfolded healthy subjects while they tactually explore embossed texts, including symbols, numbers, and letters. Supervised classifiers are trained on an extracted EEG feature space comprising three feature types, namely adaptive autoregressive parameters, Hurst exponents, and power spectral density, to recognize the respective texts. The pre-trained classifiers then classify the EEG data online to identify the texts, and the recognized text is displayed on the computer screen for communication. Online classification of two, four, and six classes of embossed texts is achieved with overall average recognition rates of 76.62%, 72.31%, and 67.62%, respectively, with a computational time of less than 2 s in each case. Over all experiments, the maximum information transfer rate and utility of the system are 0.7187 and 2.0529 bits/s, respectively. This study demonstrates the feasibility of classifying 3D letters from tactually evoked EEG. In the future, it may help the BCI community design stimuli for better tactile augmentation and also opens new research directions toward making 3D letters accessible to visually impaired persons. Further, 3D maps could be generated to aid tactual BCI in teleoperation.
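The pipeline summarized above can be sketched in code. The following Python illustration is a minimal, hypothetical sketch, not the authors' implementation: ordinary least-squares AR coefficients stand in for the adaptive autoregressive (AAR) parameters, the Hurst exponent is estimated by rescaled-range (R/S) analysis, PSD is computed with Welch's method, scikit-learn's LDA serves as a generic supervised classifier, and the standard Wolpaw information transfer rate is evaluated; the sampling rate, model order, and frequency band are placeholder assumptions, and the paper's exact ITR and utility definitions may differ.

```python
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 256        # assumed sampling rate in Hz (not stated in the abstract)
AR_ORDER = 6    # assumed autoregressive model order


def ar_coefficients(x, order=AR_ORDER):
    """Ordinary least-squares AR(p) fit, a simple stand-in for AAR parameters."""
    X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs


def hurst_rs(x):
    """Hurst exponent estimated by rescaled-range (R/S) analysis."""
    n = len(x)
    sizes = [int(s) for s in 2 ** np.arange(4, int(np.log2(n)) + 1) if s <= n]
    rs = []
    for s in sizes:
        chunks = x[:n - n % s].reshape(-1, s)
        dev = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
        r = dev.max(axis=1) - dev.min(axis=1)
        sd = chunks.std(axis=1)
        rs.append(np.mean(r[sd > 0] / sd[sd > 0]))
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope


def features(epoch):
    """Concatenate AR coefficients, Hurst exponent, and 8-30 Hz PSD bins."""
    f, pxx = welch(epoch, fs=FS, nperseg=min(256, len(epoch)))
    band = pxx[(f >= 8) & (f <= 30)]          # assumed band of interest
    return np.concatenate([ar_coefficients(epoch), [hurst_rs(epoch)], band])


def wolpaw_itr(p, n_classes, trial_time_s):
    """Standard Wolpaw ITR in bits/s; the paper's exact ITR/utility may differ."""
    if p >= 1.0:
        bits = np.log2(n_classes)
    else:
        bits = (np.log2(n_classes) + p * np.log2(p)
                + (1 - p) * np.log2((1 - p) / (n_classes - 1)))
    return bits / trial_time_s


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-in for tactually evoked EEG epochs: 2 classes, 20 epochs each
    epochs = rng.standard_normal((40, 2 * FS))
    labels = np.repeat([0, 1], 20)
    X = np.array([features(e) for e in epochs])
    clf = LinearDiscriminantAnalysis().fit(X, labels)
    acc = clf.score(X, labels)
    print(f"accuracy: {acc:.3f}, ITR at 2 s/trial: {wolpaw_itr(acc, 2, 2.0):.3f} bits/s")
```

In the online setting described in the abstract, the same feature function would be applied to each incoming epoch and the pre-trained classifier's prediction mapped to the corresponding embossed text for display.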
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5670090
DOI: http://dx.doi.org/10.1007/s11571-017-9452-2