Sports-ACtrans Net: research on multimodal robotic sports action recognition driven via ST-GCN.

Front Neurorobot

Physical Education Institute, Henan Polytechnic University, Jiaozuo, Henan, China.

Published: October 2024

Introduction: Accurately recognizing and understanding human motion is a key challenge in the development of intelligent sports robots. Traditional methods often suffer from significant drawbacks, such as high computational cost and poor real-time performance. To address these limitations, this study proposes a novel approach called Sports-ACtrans Net.

Methods: In this approach, the Swin Transformer processes visual data to extract spatial features, while the Spatio-Temporal Graph Convolutional Network (ST-GCN) models human motion as graphs to handle skeleton data. By combining these outputs, a comprehensive representation of motion actions is created. Reinforcement learning is employed to optimize the action recognition process, framing it as a sequential decision-making problem. Deep Q-learning is utilized to learn the optimal policy, thereby enhancing the robot's ability to accurately recognize and engage in motion.
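
To make the two-stream design concrete, here is a minimal PyTorch sketch of how the visual and skeleton branches could be combined. It is illustrative rather than the authors' implementation: it assumes torchvision's swin_t as the visual backbone, stands in a single simplified graph-convolution block (SimpleSTGCNBlock) for the full ST-GCN, and fuses the streams by concatenation; all layer sizes and feature dimensions are assumptions.

import torch
import torch.nn as nn
from torchvision.models import swin_t

class SimpleSTGCNBlock(nn.Module):
    """One simplified spatio-temporal graph convolution: aggregate joint
    features over a fixed skeleton adjacency, then convolve along time."""
    def __init__(self, in_ch, out_ch, adjacency):
        super().__init__()
        self.register_buffer("A", adjacency)  # (V, V) normalized adjacency
        self.spatial = nn.Linear(in_ch, out_ch)
        self.temporal = nn.Conv2d(out_ch, out_ch, kernel_size=(9, 1), padding=(4, 0))

    def forward(self, x):  # x: (N, C, T, V) joint coordinates over time
        x = torch.einsum("nctv,vw->nctw", x, self.A)       # spatial aggregation
        x = self.spatial(x.permute(0, 2, 3, 1)).permute(0, 3, 1, 2)
        return torch.relu(self.temporal(x))                # temporal convolution

class FusionNet(nn.Module):
    """Concatenate Swin Transformer visual features with pooled skeleton
    features and classify the action."""
    def __init__(self, adjacency, num_actions, skel_ch=3):
        super().__init__()
        self.visual = swin_t(weights=None)
        self.visual.head = nn.Identity()      # expose the 768-dim features
        self.skeleton = SimpleSTGCNBlock(skel_ch, 64, adjacency)
        self.classifier = nn.Linear(768 + 64, num_actions)

    def forward(self, frames, skeletons):
        v = self.visual(frames)                        # (N, 768)
        s = self.skeleton(skeletons).mean(dim=(2, 3))  # pool over time/joints
        return self.classifier(torch.cat([v, s], dim=1))

Concatenation is only one plausible fusion choice; the abstract does not specify how the visual and skeleton representations are combined.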

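The reinforcement-learning stage can be sketched in the same spirit. In the minimal version below, each fused feature vector is treated as a state and each candidate action label as an action; the Q-network width, discount factor, and epsilon value are assumed hyperparameters, not the paper's reported settings, and q_target is the usual slowly updated copy of the online network.

import random
import torch
import torch.nn as nn

class QNet(nn.Module):
    """Q-value head over fused state features (width is an assumption)."""
    def __init__(self, state_dim, num_actions):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 256), nn.ReLU(),
            nn.Linear(256, num_actions))

    def forward(self, s):
        return self.net(s)

def epsilon_greedy(q, state, num_actions, eps=0.1):
    """Explore with probability eps, otherwise act greedily on Q-values."""
    if random.random() < eps:
        return random.randrange(num_actions)
    with torch.no_grad():
        return q(state.unsqueeze(0)).argmax(dim=1).item()

def dqn_update(q, q_target, optimizer, batch, gamma=0.99):
    """One Bellman backup: y = r + gamma * max_a' Q_target(s', a')."""
    s, a, r, s_next, done = batch  # tensors; done is 0/1 floats
    q_sa = q(s).gather(1, a.unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        y = r + gamma * (1.0 - done) * q_target(s_next).max(dim=1).values
    loss = nn.functional.smooth_l1_loss(q_sa, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

In a full training loop, transitions would come from a replay buffer and q_target would be synchronized with q every fixed number of updates, following standard deep Q-learning practice.
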
Results And Discussion: Experiments demonstrate significant improvements over state-of-the-art methods. This research advances neural computation, computer vision, and neuroscience, and supports the development of intelligent robotic systems capable of understanding and participating in sports activities.

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11502397
DOI: http://dx.doi.org/10.3389/fnbot.2024.1443432

Publication Analysis

Top Keywords

  Keyword                   Frequency
  action recognition        8
  human motion              8
  motion actions            8
  development intelligent   8
  sports-actrans net        4
  net multimodal            4
  multimodal robotic        4
  robotic sports            4
  sports action             4
  recognition driven        4
