Personally curated content in short-form video formats provides added value for participants and spectators but is often disregarded in lower-level events because it is too labor-intensive to create or is not recorded at all. Our smart sensor-driven tripod supplies a unified sensor and video solution that captures personalized highlights for participants in various sporting events at low computational and hardware cost. The relevant parts of the video for each participant are determined automatically using the timestamps of that participant's received sensor data. This is achieved through a customizable clipping mechanism that processes and optimizes both video and sensor data. The clipping mechanism is driven by sensing nearby signals of Adaptive Network Topology (ANT+) capable devices worn by the athletes, which provide both locality information and identification. The device was deployed and tested in an amateur-level cycling race, in which it provided clips with a detection rate of 92.9%. The associated sensor data were used to automatically extract peloton passages and to report riders' positions on the course, as well as which participants were grouped together. Insights derived from sensor signals can be processed and published in real time, and an upload optimization scheme is proposed that can deliver video clips for each rider at most 5 min after the passage if video upload is enabled.
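The timestamp-driven clipping and grouping idea described above can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the gap and padding thresholds, function names, and the simple overlap-based peloton grouping are all assumptions made for the example.

```python
from collections import defaultdict

# Illustrative thresholds (assumptions, not values from the paper):
GAP_S = 10.0   # detections closer than this belong to the same passage
PAD_S = 5.0    # padding before/after a passage when cutting the clip

def passages(detections):
    """Merge per-rider ANT+ detection timestamps into (start, end) windows."""
    by_rider = defaultdict(list)
    for rider, t in detections:
        by_rider[rider].append(t)
    windows = {}
    for rider, ts in by_rider.items():
        ts.sort()
        merged = [[ts[0], ts[0]]]
        for t in ts[1:]:
            if t - merged[-1][1] <= GAP_S:
                merged[-1][1] = t          # extend the current passage
            else:
                merged.append([t, t])      # start a new passage
        windows[rider] = [(s, e) for s, e in merged]
    return windows

def clips(windows):
    """Pad each passage window into a video clip interval."""
    return {r: [(max(0.0, s - PAD_S), e + PAD_S) for s, e in ws]
            for r, ws in windows.items()}

def pelotons(windows):
    """Group riders whose passage windows overlap in time."""
    events = sorted((s, e, r) for r, ws in windows.items() for s, e in ws)
    groups, current, cur_end = [], set(), None
    for s, e, r in events:
        if current and s > cur_end:
            groups.append(current)         # gap in time: close the group
            current, cur_end = set(), None
        current.add(r)
        cur_end = e if cur_end is None else max(cur_end, e)
    if current:
        groups.append(current)
    return groups
```

Feeding the clip intervals to a video trimmer would then produce one short highlight per rider, while the grouping output supports the real-time peloton reporting mentioned in the abstract.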


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10857372
DOI: http://dx.doi.org/10.3390/s24030736

Publication Analysis

Top Keywords (with occurrence counts): sensor data (12); sporting events (8); clipping mechanism (8); video (6); sensor (5); fully automatic (4); automatic camera (4); camera personalized (4); personalized highlight (4); highlight generation (4)
