Human trajectories can be tracked by the internal processing of a camera acting as an edge device. This work aims to match people's trajectories obtained from cameras to sensor data, such as acceleration and angular velocity, obtained from wearable devices. Because human trajectories and sensor data differ in modality, matching them is not straightforward. Furthermore, complete trajectory information is unavailable, so it is difficult to determine which trajectory fragments belong to whom. To solve this problem, we newly propose a method for computing the similarity between a unit-period trajectory and the corresponding sensor data. We also propose a mechanism that systematically updates this similarity and integrates it over time while taking the other trajectories into account. We confirmed that the proposed method can match human trajectories and sensor data with an accuracy, a sensitivity, and an F1 score of 0.725. Our models also achieved decent results on the UEA dataset.
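The idea of the abstract can be sketched in code: derive acceleration from each camera trajectory, score it against each wearable's acceleration series per unit period, accumulate the scores, and assign trajectories to sensors while considering all candidates. This is a minimal illustration only; the window length, the similarity measure (negative mean absolute difference), and the greedy assignment below are assumptions, not the paper's actual formulation.

```python
def accel_magnitude(positions, dt=1.0):
    """Approximate acceleration magnitudes from 2-D positions via second differences."""
    acc = []
    for i in range(1, len(positions) - 1):
        ax = (positions[i + 1][0] - 2 * positions[i][0] + positions[i - 1][0]) / dt**2
        ay = (positions[i + 1][1] - 2 * positions[i][1] + positions[i - 1][1]) / dt**2
        acc.append((ax**2 + ay**2) ** 0.5)
    return acc

def similarity(a, b):
    """Negative mean absolute difference between two acceleration series (higher = more similar)."""
    n = min(len(a), len(b))
    return -sum(abs(x - y) for x, y in zip(a, b)) / n

def match_trajectories(trajectories, imu_series, dt=1.0):
    """Score every trajectory against every sensor, then greedily assign
    each trajectory to its best still-unclaimed sensor."""
    scores = {
        (t, s): similarity(accel_magnitude(traj, dt), imu)
        for t, traj in trajectories.items()
        for s, imu in imu_series.items()
    }
    assignment, used = {}, set()
    for (t, s), _ in sorted(scores.items(), key=lambda kv: -kv[1]):
        if t not in assignment and s not in used:
            assignment[t] = s
            used.add(s)
    return assignment
```

For example, a stationary trajectory should pair with a near-zero IMU acceleration series, while a uniformly accelerating one (positions 0, 1, 4, 9, 16) should pair with a constant-acceleration series. A full implementation would repeat this per unit period and integrate the scores over time, as the abstract describes.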
Full text: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11175260 (PMC)
DOI: http://dx.doi.org/10.3390/s24113680