Bridging the lab-to-field gap using machine learning: a narrative review.

Sports Biomech

Tech & Policy Lab, The University of Western Australia, Crawley, WA, Australia.

Published: April 2023

This paper summarises recent advancements in applications of machine learning in sports biomechanics to bridge the lab-to-field gap, as presented in the Hans Gros Emerging Researcher Award lecture at the 2022 annual conference of the International Society of Biomechanics in Sports. One major challenge in machine learning applications is the need for large, high-quality datasets. Currently, most datasets containing kinematic and kinetic information were collected using traditional laboratory-based motion capture, even though wearable inertial sensors and standard video cameras are the hardware capable of on-field analysis; for both technologies, no high-quality large-scale databases exist. A second challenge is the lack of guidelines on how to use machine learning in biomechanics, where mostly small datasets collected from a particular population are available. This paper summarises methods to re-purpose motion capture data for machine learning applications towards on-field motion analysis and gives an overview of current applications, in an attempt to derive guidelines on the most appropriate algorithm to use, an appropriate dataset size, suitable input data for estimating motion kinematics or kinetics, and how much variability the dataset should contain. This information will allow research to progress towards bridging the lab-to-field gap.

DOI: http://dx.doi.org/10.1080/14763141.2023.2200749

Publication Analysis

Top Keywords (frequency): machine learning (20), lab-to-field gap (12), bridging lab-to-field (8), learning applications (8), motion capture (8), machine (5), learning (5), gap machine (4), learning narrative (4), narrative review (4)
