Virtual and mixed-reality (XR) technology has advanced significantly in the last few years and will enable the future of work, education, socialization, and entertainment. Eye-tracking data is required for supporting novel modes of interaction, animating virtual avatars, and implementing rendering or streaming optimizations. While eye tracking enables many beneficial applications in XR, it also introduces a risk to privacy by enabling re-identification of users. We applied the privacy definitions of k-anonymity and plausible deniability (PD) to datasets of eye-tracking samples and evaluated them against the state-of-the-art differential privacy (DP) approach. Two VR datasets were processed to reduce identification rates while minimizing the impact on the performance of trained machine-learning models. Our results suggest that both the PD and DP mechanisms produced practical privacy-utility trade-offs with respect to re-identification and activity classification accuracy, while k-anonymity performed best at retaining utility for gaze prediction.
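As an illustration of the kind of perturbation a DP mechanism applies to gaze data, the sketch below adds Laplace noise to a stream of normalized gaze samples. This is a generic textbook mechanism under assumed parameters (per-sample L1 sensitivity, a chosen epsilon), not the specific mechanism evaluated in the paper:

```python
import numpy as np

def laplace_mechanism(gaze, sensitivity, epsilon, rng=None):
    """Perturb gaze samples with Laplace noise scaled to sensitivity/epsilon.

    Illustrative only: `gaze` is an (n, 2) array of normalized (x, y) gaze
    coordinates, and `sensitivity` is an assumed L1 sensitivity of releasing
    a single sample. Smaller epsilon means stronger privacy and more noise.
    """
    rng = rng if rng is not None else np.random.default_rng()
    scale = sensitivity / epsilon
    return gaze + rng.laplace(loc=0.0, scale=scale, size=gaze.shape)

# Example: privatize 100 synthetic gaze points in [0, 1] x [0, 1].
gaze = np.random.default_rng(0).uniform(0.0, 1.0, size=(100, 2))
private_gaze = laplace_mechanism(gaze, sensitivity=1.0, epsilon=0.5)
print(private_gaze.shape)  # (100, 2)
```

The privacy-utility trade-off the abstract describes corresponds to the choice of epsilon here: lowering it suppresses the identifying structure in the signal but also degrades downstream tasks such as activity classification or gaze prediction.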
DOI: http://dx.doi.org/10.1109/TVCG.2023.3247048