Severe motor impairments can affect the ability to communicate. The ability to see has a decisive influence on which augmentative and alternative communication (AAC) systems are available to the user. To better understand the initial impressions users form of AAC systems, we asked naïve healthy participants to compare two visual systems (a visual P300 brain-computer interface (BCI) and an eye-tracker) and two non-visual systems (an auditory and a tactile P300 BCI). Eleven healthy participants performed 20 selections in a five-choice task with each system. The visual P300 BCI used face stimuli, the auditory P300 BCI used Japanese Hiragana syllables, and the tactile P300 BCI used stimulators on the left little finger, left middle finger, right thumb, right middle finger and right little finger. The eye-tracker required a dwell time of 3 s on the target for selection. We calculated accuracies and information-transfer rates (ITRs) for each control method, using the selection time that yielded the highest ITR at an accuracy above 70% for each system. Accuracies of 88% were achieved with the visual P300 BCI (4.8 s selection time, 20.9 bits/min), 70% with the auditory BCI (19.9 s, 3.3 bits/min), 71% with the tactile BCI (18 s, 3.4 bits/min) and 100% with the eye-tracker (5.1 s, 28.2 bits/min). Performance with the eye-tracker and the visual BCI correlated strongly, whereas the correlation between tactile and auditory BCI performance was lower. Our data showed no ITR advantage for either non-visual system, but the lower correlation of performance suggests that choosing the system that suits a particular user is more important for non-visual systems than for visual systems.
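The reported ITRs follow from the number of choices, the accuracy and the selection time. A minimal sketch, assuming the standard Wolpaw ITR definition (the abstract does not spell out the exact formula); because the accuracies and selection times above are rounded, the computed values only approximately reproduce the reported bits/min:

```python
import math

def wolpaw_itr(n_choices: int, accuracy: float, selection_time_s: float) -> float:
    """Information-transfer rate in bits/min under the Wolpaw definition."""
    p, n = accuracy, n_choices
    if p <= 1.0 / n:          # at or below chance level the formula is not positive
        return 0.0
    bits = math.log2(n)
    if p < 1.0:               # the correction terms vanish at 100% accuracy
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * 60.0 / selection_time_s

# Rounded values from the abstract (five-choice task).
for label, acc, t in [("visual P300 BCI", 0.88, 4.8),
                      ("auditory P300 BCI", 0.70, 19.9),
                      ("tactile P300 BCI", 0.71, 18.0),
                      ("eye-tracker", 1.00, 5.1)]:
    print(f"{label}: {wolpaw_itr(5, acc, t):.1f} bits/min")
```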
Full text:
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5997833
DOI: http://dx.doi.org/10.3389/fnhum.2018.00228
J Neural Eng
January 2025
Department of Biomedical Engineering, The University of Melbourne, Parkville, Melbourne, Victoria 3010, Australia.
Multiple Sclerosis (MS) is a heterogeneous autoimmune-mediated disorder affecting the central nervous system, commonly manifesting as fatigue and progressive limb impairment. This can significantly impact quality of life due to weakness or paralysis in the upper and lower limbs. A Brain-Computer Interface (BCI) aims to restore quality of life through control of an external device, such as a wheelchair.
J Biomed Phys Eng
December 2024
Medical Image and Signal Processing Research Center, School of Advanced Technologies in Medicine, Isfahan University of Medical Sciences, Isfahan, Iran.
Background: The P300 signal, an endogenous component of event-related potentials, is extracted from an electroencephalography signal and employed in Brain-computer Interface (BCI) devices.
Objective: The current study aimed to address challenges in extracting useful features from P300 components and detecting P300 through a hybrid unsupervised manner based on Convolutional Neural Network (CNN) and Long Short-term Memory (LSTM).
Material and Methods: In this cross-sectional study, the CNN, a method well suited to the P300 classification task, emphasizes the spatial characteristics of the data.
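The snippet above is truncated and does not describe the network in detail. As a rough illustration of the general CNN-LSTM layering it refers to, here is a minimal sketch in PyTorch; the channel count, filter sizes and the supervised classification head are assumptions for the example, not the authors' (unsupervised) design:

```python
import torch
import torch.nn as nn

class P300ConvLSTM(nn.Module):
    """Illustrative hybrid: a 1-D convolution mixes the EEG channels and local
    temporal structure, an LSTM models the remaining temporal dynamics, and a
    linear layer scores P300 vs. non-P300 epochs."""
    def __init__(self, n_channels: int = 8, n_filters: int = 16, hidden: int = 32):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, n_filters, kernel_size=7, padding=3),
            nn.BatchNorm1d(n_filters),
            nn.ReLU(),
            nn.MaxPool1d(4),            # downsample the time axis
        )
        self.lstm = nn.LSTM(n_filters, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)

    def forward(self, x):               # x: (batch, channels, time)
        x = self.conv(x)                # (batch, filters, time // 4)
        x = x.transpose(1, 2)           # (batch, time // 4, filters) for the LSTM
        _, (h, _) = self.lstm(x)
        return self.head(h[-1])         # logits: (batch, 2)

# Example: 0.8 s epochs sampled at 250 Hz from 8 electrodes.
logits = P300ConvLSTM()(torch.randn(4, 8, 200))
```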
J Neural Eng
December 2024
Ulsan National Institute of Science and Technology, 50 UNIST-gil, Eonyang-eup, Ulju-gun, Ulsan 44919, Republic of Korea.
Objective: In the pursuit of refining P300-based brain-computer interfaces (BCIs), our research aims to propose a novel stimulus design focused on selective attention and task relevance to address the challenges of P300-based BCIs, including the necessity of repetitive stimulus presentations, accuracy improvement, user variability, and calibration demands.
Approach: In the oddball task for P300-based BCIs, we develop a stimulus design involving task-relevant dynamic stimuli implemented as finger-tapping to enhance the elicitation and consistency of event-related potentials (ERPs). We further improve the performance of P300-based BCIs by optimizing ERP feature extraction and classification in offline analyses.
Front Hum Neurosci
November 2024
Graduate School of Engineering and Science, Shibaura Institute of Technology, Tokyo, Japan.
Introduction: The ASME (Auditory Stream segregation Multiclass ERP) paradigm is proposed and used for an auditory brain-computer interface (BCI). In this paradigm, sound sequences that are perceived as multiple auditory streams are presented simultaneously, and each stream is an oddball sequence. Users are asked to attend selectively to deviant stimuli in one of the streams, and the target of the user's attention is detected by decoding event-related potentials (ERPs).
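To make the paradigm concrete, here is a minimal sketch of how two concurrent oddball sequences might be generated; the stream count, deviant rate and labels are placeholders for illustration, not the parameters of the ASME study:

```python
import random

def oddball_sequence(n_stimuli: int, deviant_rate: float = 0.2, seed: int = 0):
    """One oddball sequence: mostly 'standard' events with rare 'deviant' events."""
    rng = random.Random(seed)
    return ["deviant" if rng.random() < deviant_rate else "standard"
            for _ in range(n_stimuli)]

# Two streams presented simultaneously but distinguishable (e.g. by pitch) so
# that they segregate perceptually; the user attends to deviants in one stream.
streams = {
    "attended stream": oddball_sequence(40, seed=1),
    "ignored stream": oddball_sequence(40, seed=2),
}
for name, seq in streams.items():
    print(name, seq[:10])
```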
Assist Technol
October 2024
Department of Special Education and Communication Disorders, University of Nebraska-Lincoln, Lincoln, Nebraska, USA.
Augmentative and alternative communication (AAC) supports offer communication aids for individuals with severe speech and physical impairments. This study presents the development and proof of concept for an iPad application designed to evaluate the design preferences of both adults and children for AAC scanning and emerging P300-brain-computer interface access to AAC (BCI-AAC), both of which utilize item highlighting. Developed through a multidisciplinary and iterative process, the application incorporates customizable highlighting methods and display options for spelling-based and pictorial symbol interfaces.