Objective. Assistive robots can be developed to restore or increase autonomy for individuals with motor impairments. In particular, power wheelchairs can compensate for lower-limb impairments, while robotic manipulators can compensate for upper-limb impairments. Recent studies have shown that Brain-Computer Interfaces (BCI) can be used to operate such devices. However, activities of daily living and long-term use in real-life contexts such as the home require robustness and adaptability to complex, changing and cluttered environments, which can be problematic because neural signals do not always allow safe and efficient use.

Approach. This article describes assist-as-needed, sensor-based shared control methods that blend BCI control with depth-sensor-based control. The proposed assistance targets the BCI teleoperation of effectors for tasks that address mobility and manipulation needs in an at-home context.

Main Results. The assistance provided by the proposed methods was evaluated through wheelchair mobility and reach-and-grasp experiments in a controlled laboratory environment, as part of a clinical trial with a quadriplegic patient implanted with WIMAGINE, a wireless 64-channel ElectroCorticoGram (ECoG) recording implant. The results showed that the proposed methods can assist BCI users in both tasks: the time to perform the tasks and the number of switches between mental tasks were reduced, and unwanted actions, such as wheelchair collisions with the environment and gripper openings that could drop the grasped object, were avoided.

Significance. The proposed methods are a step toward at-home use of BCI-teleoperated assistive robots, as the proposed shared control methods improved the performance of both assistive devices. Clinical trial registration number: NCT02550522.
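To make the assist-as-needed idea concrete, the sketch below illustrates one generic way to blend a decoded BCI command with a depth-sensor-derived assistive command, where the assistance weight increases as obstacles get closer. This is not the method reported in the article; the function name, the linear weighting scheme and all numeric thresholds are illustrative assumptions only.

```python
import numpy as np

def blend_commands(bci_cmd, assist_cmd, obstacle_distance,
                   d_safe=0.3, d_free=1.5):
    """Blend a decoded BCI command with a sensor-based assistive command.

    The assistance weight alpha grows as obstacles get closer
    (assist-as-needed): alpha = 0 in free space, alpha = 1 near obstacles.
    The thresholds d_safe and d_free are illustrative values, not
    parameters from the study.
    """
    # Linear assistance level between the free-space and safety distances
    alpha = np.clip((d_free - obstacle_distance) / (d_free - d_safe), 0.0, 1.0)
    # Shared command: the user keeps authority when alpha is low,
    # the sensor-based controller takes over as alpha approaches 1
    return (1.0 - alpha) * np.asarray(bci_cmd) + alpha * np.asarray(assist_cmd)

# Hypothetical example: a forward wheelchair command corrected near a wall
user_velocity = [0.4, 0.0]        # decoded linear / angular velocity
avoidance_velocity = [0.1, -0.5]  # depth-sensor obstacle-avoidance command
print(blend_commands(user_velocity, avoidance_velocity, obstacle_distance=0.6))
```

With an obstacle 0.6 m away, this sketch gives the assistive command 75% of the authority, steering the blended velocity away from the obstacle while retaining part of the user's intent.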
DOI: http://dx.doi.org/10.1088/1741-2552/adae36