Brain-computer interfaces (BCIs) augment human capabilities by translating brain signals into commands that operate external devices. However, BCI development faces several obstacles, such as the low classification accuracy of brain signals and tedious user-training procedures. To address these problems, we propose controlling a BCI with signals associated with eye saccades and blinks. Because these physiological eye signals already exist, the user does not need to adapt his or her brain waves to the device; moreover, using saccade signals to control an external device leaves the limbs free for other tasks. In this study, we placed two electrodes above the left and right ears of thirteen participants and applied Independent Component Analysis (ICA) to extract the EEG components associated with eye movements. A sliding-window technique was used to collect relevant features, which we then classified as horizontal saccades or blinks using k-nearest neighbors (KNN) and support vector machine (SVM) classifiers, achieving a mean classification accuracy of about 97%. The two electrodes were then integrated with off-the-shelf earbuds to control a wheelchair. The earbuds generate voice cues indicating when to rotate the eyes to a given side (left or right) or blink, so that the user can select directional commands to drive the wheelchair. In addition, by properly designing the contents of the voice menus, we can generate many more commands than the limited number of identified eye-movement states would otherwise allow.
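The abstract names the full signal chain (two ear electrodes, ICA unmixing, sliding-window feature extraction, KNN/SVM classification) without implementation detail. Below is a minimal Python sketch of that chain using scikit-learn; the 250 Hz sampling rate, the 1 s windows with 50% overlap, the specific features, and the classifier settings are assumptions for illustration, not values reported in the paper.

import numpy as np
from sklearn.decomposition import FastICA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 250            # sampling rate in Hz (assumed, not from the paper)
WIN = FS            # 1 s sliding window (assumed)
STEP = WIN // 2     # 50% window overlap (assumed)

def ocular_component(eeg):
    """Unmix the two ear-electrode channels with ICA and return the
    component with the larger peak-to-peak swing, which eye saccades
    and blinks tend to dominate."""
    sources = FastICA(n_components=2, random_state=0).fit_transform(eeg.T).T
    ptp = sources.max(axis=1) - sources.min(axis=1)
    return sources[np.argmax(ptp)]

def windows(signal):
    """Slide a fixed-length window over the extracted component."""
    return np.stack([signal[i:i + WIN]
                     for i in range(0, len(signal) - WIN + 1, STEP)])

def features(wins):
    """Per-window features (our choice, not the paper's): peak-to-peak
    amplitude, variance, and the sign of the dominant deflection,
    which left vs. right saccades flip."""
    ptp = wins.max(axis=1) - wins.min(axis=1)
    var = wins.var(axis=1)
    sign = np.sign(wins[np.arange(len(wins)), np.abs(wins).argmax(axis=1)])
    return np.column_stack([ptp, var, sign])

# eeg: (2, n_samples) array from the two ear electrodes; y: one label
# per window (0=left, 1=right, 2=blink) aligned to the cue schedule.
# Both are synthetic placeholders here.
eeg = np.random.randn(2, 10 * FS)
X = features(windows(ocular_component(eeg)))
y = np.resize([0, 1, 2], len(X))

# The paper reports ~97% mean accuracy; on placeholder data these
# scores are meaningless and only demonstrate the pipeline runs.
for clf in (KNeighborsClassifier(n_neighbors=5), SVC(kernel="rbf")):
    print(type(clf).__name__, cross_val_score(clf, X, y, cv=3).mean())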
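The closing point, that a small set of detectable eye-movement states can drive a much larger command set, comes down to menu depth: d sequential selections over k states can address up to k^d commands, so three states reach nine commands after two cues and twenty-seven after three. The sketch below illustrates the idea with a hypothetical two-level voice menu; the actual menu contents and wheelchair commands are not specified in the abstract.

# Hypothetical two-level voice menu: each cue offers three options,
# one per detectable eye state, so depth-2 navigation yields 3**2
# distinct commands from only three states.
MENU = {
    "left":  {"left": "turn left", "right": "turn right", "blink": "go back"},
    "right": {"left": "forward", "right": "reverse", "blink": "go back"},
    "blink": {"left": "speed up", "right": "slow down", "blink": "stop"},
}

def select(menu, states):
    """Walk the menu with a sequence of classified eye states
    ('left', 'right', or 'blink') until a command string is reached."""
    node = menu
    for s in states:
        node = node[s]
        if isinstance(node, str):
            return node
    return None

print(select(MENU, ["right", "left"]))  # -> "forward"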
DOI: http://dx.doi.org/10.1109/ICORR.2017.8009392