Publications by authors named "Gunar Schirner"

For transradial amputees, robotic prosthetic hands promise to restore the capability to perform activities of daily living. Current control methods based on physiological signals such as electromyography (EMG) are prone to poor inference outcomes due to motion artifacts, muscle fatigue, and other confounding factors. Vision sensors are a major source of information about the environment state and can play a vital role in inferring feasible and intended gestures.

Electromyography (EMG) data has been extensively adopted as an intuitive interface for instructing human-robot collaboration. A major challenge in real-time detection of human grasp intent is identifying the dynamic EMG produced during hand movements. Previous studies predominantly applied steady-state EMG classification with a small number of grasp patterns to dynamic situations, which is insufficient for generating differentiated control that accounts for the variation of muscular activity encountered in practice.
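
The steady-state baseline that this line of work sets out to improve is, in broad strokes, a sliding-window feature classifier trained on a few grasp patterns. The sketch below only illustrates that conventional pipeline; the sampling rate, window and hop sizes, channel count, feature choice, and synthetic data are my assumptions, not details from the paper.

```python
# Minimal sketch (not the authors' method) of conventional steady-state
# EMG classification: sliding-window features fed to a linear classifier.
# All parameters and data below are illustrative assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 1000      # assumed EMG sampling rate (Hz)
WIN = 200      # 200 ms analysis window
STEP = 50      # 50 ms hop between windows
N_CH = 8       # assumed number of EMG channels

def window_features(emg: np.ndarray) -> np.ndarray:
    """Mean-absolute-value and RMS per channel for each sliding window.

    emg has shape (n_samples, n_channels); the result has shape
    (n_windows, 2 * n_channels).
    """
    feats = []
    for start in range(0, emg.shape[0] - WIN + 1, STEP):
        w = emg[start:start + WIN]
        mav = np.mean(np.abs(w), axis=0)
        rms = np.sqrt(np.mean(w ** 2, axis=0))
        feats.append(np.concatenate([mav, rms]))
    return np.asarray(feats)

# Synthetic stand-in for steady-state recordings of two grasp patterns.
rng = np.random.default_rng(0)
rest = window_features(rng.normal(0.0, 0.05, (5 * FS, N_CH)))   # low activation
grasp = window_features(rng.normal(0.0, 0.20, (5 * FS, N_CH)))  # higher activation

X = np.vstack([rest, grasp])
y = np.array([0] * len(rest) + [1] * len(grasp))

clf = LinearDiscriminantAnalysis().fit(X, y)
print("training accuracy:", clf.score(X, y))
```

Because such a classifier is trained on steady-state windows, its features drift during transient hand movements, which is the gap the dynamic-EMG work described above targets.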

Electromyography (EMG) signals have been widely utilized in human-robot interaction for extracting user hand/arm motion instructions. A major challenge of online interaction with robots is reliable EMG recognition from real-time data. However, previous studies mainly focused on classifying steady-state EMG signals with a small number of grasp patterns, which is insufficient for generating robust control under the dynamic variation of muscular activity encountered in practice.

Upper-limb and hand functionality is critical to many activities of daily living, and amputation can lead to significant loss of function for an individual. From this perspective, advanced prosthetic hands of the future are anticipated to benefit from improved shared control between the robotic hand and its human user and, more importantly, from an improved capability to infer human intent from multimodal sensor data, giving the robotic hand perception of its operational context. Such multimodal sensor data may include environment sensors such as vision, as well as human physiology and behavior sensors such as electromyography (EMG) and inertial measurement units (IMUs).
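
One simple way to read "infer human intent from multimodal sensor data" is feature-level fusion: features from each modality are concatenated and classified together. The sketch below illustrates only that idea; the feature dimensions, the concatenation strategy, and the logistic-regression classifier are placeholders of my own, not the architecture used in the paper.

```python
# Minimal sketch of feature-level fusion across vision, EMG, and IMU inputs,
# assuming per-modality feature extractors already exist. Dimensions, data,
# and the classifier are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_samples, n_classes = 300, 4

# Stand-ins for features produced by modality-specific front ends.
vision_feat = rng.normal(size=(n_samples, 128))   # e.g. an embedding of the scene
emg_feat    = rng.normal(size=(n_samples, 16))    # e.g. windowed EMG statistics
imu_feat    = rng.normal(size=(n_samples, 12))    # e.g. accelerometer/gyro statistics
labels      = rng.integers(0, n_classes, size=n_samples)

# Feature-level fusion: concatenate modalities into one vector per time step.
fused = np.concatenate([vision_feat, emg_feat, imu_feat], axis=1)

clf = LogisticRegression(max_iter=1000).fit(fused, labels)
print("predicted intent for first sample:", clf.predict(fused[:1])[0])
```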

New medical procedures promise continuous patient monitoring and drug delivery through implanted sensors and actuators. When over-the-air radio-frequency (OTA-RF) links are used for intra-body implant communication, the network incurs heavy energy costs owing to absorption within human tissue. With this motivation, we explore an alternative form of intra-body communication that relies on weak electrical signals instead of OTA-RF.
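
For intuition about why the link energy budget dominates implant lifetime, a back-of-the-envelope energy-per-bit comparison can be written down. Every operating point below is a hypothetical placeholder, not a measurement from the paper; the sketch only shows the bookkeeping (energy per bit = transmit power / data rate).

```python
# Back-of-the-envelope sketch: energy per bit for two hypothetical links.
# All numbers are placeholders chosen for illustration, not paper results.
def energy_per_bit(tx_power_mw: float, data_rate_kbps: float) -> float:
    """Energy per bit in microjoules for a given transmit power and data rate."""
    return (tx_power_mw / 1e3) / (data_rate_kbps * 1e3) * 1e6

# Hypothetical operating points for comparison at the same data rate.
rf_uj_per_bit       = energy_per_bit(tx_power_mw=10.0, data_rate_kbps=250.0)
galvanic_uj_per_bit = energy_per_bit(tx_power_mw=0.5,  data_rate_kbps=250.0)

print(f"OTA-RF (hypothetical):               {rf_uj_per_bit:.3f} uJ/bit")
print(f"Weak-signal coupling (hypothetical): {galvanic_uj_per_bit:.3f} uJ/bit")
```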
