Developing natural, intuitive, and human-centric input systems for mobile human-machine interaction (HMI) poses significant challenges. Existing gaze or gesture-based interaction systems are often constrained by their dependence on continuous visual engagement, limited interaction surfaces, or cumbersome hardware. To address these challenges, we propose MetaSkin, a novel neurohaptic interface that uniquely integrates neural signals with on-skin interaction for bare-handed, eyes-free interaction by exploiting humans' natural proprioceptive capabilities. To support the interface, we developed a deep learning framework that employs multiscale temporal-spectral feature representation and selective feature attention to effectively decode neural signals generated by on-skin touch and motion gestures. In experiments with 12 participants, our method achieved offline accuracies of 81.95% for touch location discrimination, 71.00% for motion type identification, and 46.08% for 10-class touch-motion classification. In pseudo-online settings, accuracies reached 99.43% for touch onset detection, and 80.34% and 67.02% for classification of touch location and motion type, respectively. Neurophysiological analyses revealed distinct neural activation patterns in the sensorimotor cortex, underscoring the efficacy of our multiscale approach in capturing rich temporal and spectral dynamics. Future work will focus on optimizing the system for diverse user populations and dynamic environments, with the long-term goal of advancing human-centered, neuroadaptive interfaces for next-generation HMI systems. This work represents a significant step toward a paradigm shift in the design of brain-computer interfaces, bridging sensory and motor paradigms to build more sophisticated systems.
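The abstract does not describe the decoder's architecture in detail; the following is a minimal PyTorch sketch of one way such a pipeline could look, combining multiscale temporal convolutions with a spectral branch and a squeeze-and-excitation-style gate as a stand-in for "selective feature attention." The class name, layer sizes, kernel lengths, and the 64-channel, 2-second input window are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumption): multiscale temporal convolutions plus a spectral
# branch, fused with a squeeze-and-excitation style "selective feature attention"
# gate before classification. Names and sizes are illustrative, not from the paper.
import torch
import torch.nn as nn


class MultiScaleEEGClassifier(nn.Module):
    def __init__(self, n_channels=64, n_classes=10):
        super().__init__()
        # Temporal branches with different kernel lengths capture fast and slow dynamics.
        self.temporal_branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv1d(n_channels, 16, kernel_size=k, padding=k // 2),
                nn.BatchNorm1d(16),
                nn.ELU(),
                nn.AdaptiveAvgPool1d(1),
            )
            for k in (7, 25, 75)  # assumed short / medium / long temporal scales
        ])
        # Spectral branch: log magnitude of the FFT, averaged over frequency bins.
        self.spectral_proj = nn.Linear(n_channels, 16)
        feat_dim = 16 * 3 + 16
        # Selective feature attention: a gate that reweights the fused feature vector.
        self.attention = nn.Sequential(
            nn.Linear(feat_dim, feat_dim // 4),
            nn.ReLU(),
            nn.Linear(feat_dim // 4, feat_dim),
            nn.Sigmoid(),
        )
        self.classifier = nn.Linear(feat_dim, n_classes)

    def forward(self, x):  # x: (batch, n_channels, n_samples)
        temporal = [branch(x).squeeze(-1) for branch in self.temporal_branches]
        spectral = torch.fft.rfft(x, dim=-1).abs().clamp_min(1e-6).log().mean(dim=-1)
        feats = torch.cat(temporal + [self.spectral_proj(spectral)], dim=1)
        return self.classifier(feats * self.attention(feats))


# Example: classify a batch of 2-second EEG windows into the 10 touch-motion classes.
model = MultiScaleEEGClassifier(n_channels=64, n_classes=10)
logits = model(torch.randn(8, 64, 500))
print(logits.shape)  # torch.Size([8, 10])
```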

Source: http://dx.doi.org/10.1109/TCYB.2025.3533088

Publication Analysis

Top Keywords (occurrences)

human-machine interaction (8)
neural signals (8)
touch location (8)
motion type (8)
interaction (6)
developing brain-based (4)
brain-based bare-handed (4)
bare-handed human-machine (4)
interaction on-skin (4)
on-skin input (4)

Similar Publications

The keyboard, a staple tool for information entry and human-machine interaction, faces demands for enhanced information security due to evolving internet technologies. This study introduces a self-powered flexible intelligent keyboard (SFIK) that harnesses the giant magnetoelastic effect to convert the mechanical pressure from key presses into electrical signals. The sensor boasts a wide sensing range (35 to 600 kPa) and a rapid response time (∼300 ms), allowing it to record and recognize individual keystroke dynamics.
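As an illustration of how keystroke dynamics from such a pressure-style signal could feed a per-user classifier, here is a hedged Python sketch; the threshold, sampling rate, feature set, and random-forest model are assumptions for illustration and are not taken from the SFIK paper.

```python
# Minimal sketch (assumption): turning a pressure-style keystroke trace into
# per-keystroke features and a per-user classifier. Threshold, sampling rate,
# feature set, and model are illustrative, not taken from the SFIK paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier


def keystroke_features(signal, threshold=0.2, fs=1000):
    """Return one [peak, dwell time, inter-key gap] row per detected keystroke."""
    above = signal > threshold
    edges = np.diff(above.astype(int))
    presses = np.where(edges == 1)[0]    # rising edges: key press
    releases = np.where(edges == -1)[0]  # falling edges: key release
    feats, prev_release = [], None
    for p, r in zip(presses, releases):
        if r <= p:
            continue
        dwell = (r - p) / fs
        gap = (p - prev_release) / fs if prev_release is not None else 0.0
        feats.append([signal[p:r].max(), dwell, gap])
        prev_release = r
    return np.asarray(feats)


# Toy usage: two synthetic keystrokes, then a classifier over labeled feature rows.
trace = np.zeros(3000)
trace[200:400], trace[900:1150] = 0.8, 0.6
print(keystroke_features(trace))

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))           # stand-in for real keystroke features
y = rng.integers(0, 2, size=200)        # user identity labels
clf = RandomForestClassifier(n_estimators=100).fit(X, y)
print(clf.predict(X[:5]))
```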

Steady state visually evoked potential (SSVEP)-based brain-computer interfaces (BCIs), which are widely used in rehabilitation and disability assistance, can benefit from real-time emotion recognition to enhance human-machine interaction. However, the learned discriminative latent representations in SSVEP-BCIs may generalize in an unintended direction, which can lead to reduced accuracy in detecting emotional states. In this paper, we introduce a Valence-Arousal Disentangled Representation Learning (VADL) method, drawing inspiration from the classical two-dimensional emotional model, to enhance the performance and generalization of emotion recognition within SSVEP-BCIs.
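The snippet does not detail VADL's architecture; below is a hedged PyTorch sketch of the general idea of disentangling valence and arousal into separate latent sub-spaces, with a simple decorrelation penalty to keep them from sharing information. All module names, dimensions, and the penalty itself are illustrative assumptions rather than the paper's method.

```python
# Minimal sketch (assumption): valence and arousal mapped to separate latent
# sub-spaces with their own heads. Sizes and layout are illustrative only.
import torch
import torch.nn as nn


class DisentangledEmotionNet(nn.Module):
    def __init__(self, in_dim=310, latent_dim=32, n_levels=3):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU())
        # Separate projections keep valence and arousal factors in distinct sub-spaces.
        self.to_valence = nn.Linear(128, latent_dim)
        self.to_arousal = nn.Linear(128, latent_dim)
        self.valence_head = nn.Linear(latent_dim, n_levels)
        self.arousal_head = nn.Linear(latent_dim, n_levels)

    def forward(self, x):
        h = self.shared(x)
        z_v, z_a = self.to_valence(h), self.to_arousal(h)
        return self.valence_head(z_v), self.arousal_head(z_a), z_v, z_a


def decorrelation_loss(z_v, z_a):
    """Penalize cross-covariance so the two latents carry complementary factors."""
    z_v = z_v - z_v.mean(dim=0)
    z_a = z_a - z_a.mean(dim=0)
    cov = z_v.T @ z_a / (z_v.shape[0] - 1)
    return (cov ** 2).mean()


# Example: a batch of 16 feature vectors, three levels each for valence and arousal.
model = DisentangledEmotionNet()
v_logits, a_logits, z_v, z_a = model(torch.randn(16, 310))
print(v_logits.shape, a_logits.shape, decorrelation_loss(z_v, z_a).item() >= 0)
```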

Aim: Virtual reality (VR) can be analgesic through intercortical modulation. This study investigated neural activities and correlates during different interactive modes.

Methods: Fifteen healthy participants (4M, 11F, age 21.

Machine learning-assisted wearable sensing systems for speech recognition and interaction.

Nat Commun

March 2025

Key Laboratory of Optoelectronic Technology & Systems of Ministry of Education, International R & D Center of Micro-nano Systems and New Materials Technology, Chongqing University, Chongqing, 400044, China.

The human voice stands out for its rich information transmission capabilities. However, voice communication is susceptible to interference from noisy environments and obstacles. Here, we propose a wearable wireless flexible skin-attached acoustic sensor (SAAS) capable of capturing the vibrations of vocal organs and skin movements, thereby enabling voice recognition and human-machine interaction (HMI) in harsh acoustic environments.
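For a concrete sense of the machine-learning side of such a system, here is a hedged PyTorch sketch that classifies short vibration clips with a spectrogram front end and a small CNN; the sampling rate, STFT settings, word classes, and network layout are assumptions for illustration, not the SAAS pipeline.

```python
# Minimal sketch (assumption): word classification from skin-attached vibration
# sensor clips via a log-spectrogram and a small CNN. All settings are illustrative.
import torch
import torch.nn as nn


class VibrationWordClassifier(nn.Module):
    def __init__(self, n_words=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(16, n_words),
        )

    def forward(self, wave):  # wave: (batch, n_samples) raw sensor trace
        spec = torch.stft(wave, n_fft=256, hop_length=128,
                          window=torch.hann_window(256), return_complex=True)
        logspec = spec.abs().clamp_min(1e-6).log().unsqueeze(1)  # (batch, 1, freq, time)
        return self.net(logspec)


# Example: one-second clips at an assumed 8 kHz sampling rate, 10 candidate words.
model = VibrationWordClassifier(n_words=10)
print(model(torch.randn(4, 8000)).shape)  # torch.Size([4, 10])
```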

DPHC from Alpinia officinarum Hance specifically modulates the function of CENPU in the cell cycle and apoptosis to ameliorate hepatocellular carcinoma.

J Ethnopharmacol

March 2025

Hepatobiliary and Liver Transplantation Department of Hainan Digestive Disease Center, The Second Affiliated Hospital of Hainan Medical University, Haikou, Hainan, 570311, China; Key Laboratory of Emergency and Trauma of Ministry of Education, The First Affiliated Hospital of Hainan Medical University, Haikou, Hainan, China. Electronic address:

Ethnopharmacological Relevance: Alpinia officinarum Hance (A. officinarum), a perennial herb used in the treatment of digestive system cancers, holds significant value for the Li people of Hainan as a traditional Chinese medicine. (R)-5-hydroxy-1,7-diphenyl-3-heptanone (DPHC), a diarylheptanoid component, is derived from A. officinarum.
