Footstep recognition is an emerging biometric that identifies or verifies users based on footstep pressure patterns obtained while walking. However, the impact of covariates on footstep recordings is not well understood, unlike for more established biometric traits such as fingerprints and faces. Therefore, this study used unsupervised hierarchical cluster analysis (HCA) to examine the influence of internal and external covariates on spatial and temporal footstep features of twenty individuals. Using 22 cluster validity indices, a robust HCA technique identified two distinct clusters in spatial representations (i.e., peak pressure images) and temporal representations (i.e., ground reaction force (GRF) and center of pressure (COP) time series) of the gait patterns. The clusters identified in both feature domains were distinguishable by body weight, age, race, and shoe type. Interestingly, trends related to sex and walking speed appeared only in the temporal domain. These findings have dual implications for footstep biometric systems, which may either leverage covariate information as soft biometrics to improve user recognition, or require covariate mitigation to limit model bias and improve generalization to new users and conditions.
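
As a rough illustration of the clustering pipeline described in the abstract, the Python sketch below applies agglomerative (Ward-linkage) hierarchical clustering to a synthetic feature matrix and selects a cluster count with a single validity index (the silhouette score). It is a minimal sketch under assumed data: the feature matrix, its dimensions, and the candidate cluster counts are hypothetical, and the study itself reports a consensus over 22 validity indices rather than the single index used here.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))  # hypothetical footstep features (rows = footsteps)

Z = linkage(X, method="ward")  # agglomerative hierarchical clustering

# Score candidate cluster counts with one validity index (silhouette);
# the study reports a consensus over 22 such indices.
scores = {k: silhouette_score(X, fcluster(Z, t=k, criterion="maxclust"))
          for k in range(2, 7)}
best_k = max(scores, key=scores.get)
print("cluster count favored by silhouette:", best_k)

Ward linkage is one common choice for continuous pressure-derived features; the abstract does not specify which linkage, distance metric, or index-consensus rule was actually used.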

Source: http://dx.doi.org/10.1109/EMBC53108.2024.10781859

Publication Analysis

Top Keywords

footstep recognition (8); unsupervised hierarchical (8); hierarchical clustering (8); footstep (6); covariate analysis (4); analysis footstep (4); recognition (4); recognition unsupervised (4); clustering footstep (4); recognition emerging (4)

Similar Publications

Pressure recordings of footsteps during walking can offer a convenient biometric recognition method for applications in security, forensic analysis, and health monitoring. However, footsteps can exhibit high variability due to a complex interplay of internal and external factors, posing a challenge for recognition systems. To address this issue, this study employed generative adversarial networks with a second discriminator and triplet loss to extract features from high-resolution foot pressure images.
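
The triplet loss mentioned in that snippet is a standard metric-learning objective; the following is a minimal sketch of it in PyTorch, not the paper's implementation. The batch size, embedding dimension, and margin value are assumptions for illustration, and the random tensors stand in for embeddings that the paper would obtain from its GAN-based feature extractor.

import torch
import torch.nn.functional as F

def triplet_loss(anchor, positive, negative, margin=0.2):
    # Pull same-person footstep embeddings together; push different-person ones apart.
    d_pos = F.pairwise_distance(anchor, positive)
    d_neg = F.pairwise_distance(anchor, negative)
    return F.relu(d_pos - d_neg + margin).mean()

# Hypothetical 128-d embeddings for a batch of 32 footstep samples.
anchor, positive, negative = (torch.randn(32, 128) for _ in range(3))
print(triplet_loss(anchor, positive, negative))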

Dr. Carolyn Meltzer is an extraordinary radiologist, researcher, mentor, and distinguished leader who deserves recognition for her immense impact on the discipline of radiology. This article serves to acknowledge and celebrate Dr. Meltzer.

Article Synopsis
  • Results showed that native English speakers (L1) demonstrated stronger priming effects with both transparent and opaque compound words compared to a control, while bilinguals (L2) did not show significant differences across compound types.
  • The research suggests that native speakers are more attuned to the morphological structure of words in early processing phases, while bilinguals, especially those less proficient in English, tend to analyze compounds based on their form rather than their meaning.

A footstep detection and recognition method based on a distributed optical fiber sensor and a double-YOLO detector is proposed. The sound of footsteps is detected by phase-sensitive optical time-domain reflectometry (Φ-OTDR), and the footsteps are located and identified by the double-YOLO method. The Φ-OTDR can cover a much larger sensing range than traditional sensors.
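
The method above pairs Φ-OTDR sensing with a double-YOLO detector. As a deliberately simplified stand-in (not the paper's method), the Python sketch below localizes footstep-like bursts in a synthetic space-time Φ-OTDR matrix by plain energy thresholding; the array shapes, injected event, and threshold are all assumptions for illustration.

import numpy as np

rng = np.random.default_rng(1)
# Synthetic phase-OTDR data: rows = time samples, columns = fiber positions.
traces = rng.normal(scale=0.1, size=(1000, 500))
traces[400:420, 250:255] += 1.0  # injected "footstep" burst

# Pool signal energy over windows of 10 time samples, then threshold.
energy = (traces ** 2).reshape(100, 10, 500).sum(axis=1)
hits = np.argwhere(energy > energy.mean() + 5 * energy.std())
print("(time window, fiber position) detections:", hits[:5])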

