In cinematic VR applications, haptic feedback can significantly enhance the sense of reality and immersion for users. The increasing availability of emerging haptic devices opens up possibilities for future cinematic VR applications that allow users to receive haptic feedback while they are watching videos. However, automatically rendering haptic cues from real-time video content, particularly from video motion, is a technically challenging task. In this article, we propose a novel framework called "Video2Haptics" that leverages the emerging bio-inspired event camera to capture event signals as a lightweight representation of video motion. We then propose efficient event-based visual processing methods to estimate force or intensity from video motion in the event domain, rather than the pixel domain. To demonstrate the application of Video2Haptics, we convert the estimated force or intensity to dynamic vibrotactile feedback on emerging haptic gloves, synchronized with the corresponding video motion. As a result, Video2Haptics allows users not only to view the video but also to perceive the video motion concurrently. Our experimental results show that the proposed event-based processing methods for force and intensity estimation are one to two orders of magnitude faster than conventional methods. Our user study results confirm that the proposed Video2Haptics framework can considerably enhance the users' video experience.
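The abstract does not give implementation details, but the core idea of estimating motion intensity directly in the event domain can be illustrated with a minimal sketch. The event format (x, y, timestamp, polarity), the grid size, and the linear count-to-amplitude mapping below are assumptions for demonstration only, not the authors' Video2Haptics method.

```python
# Illustrative sketch only: accumulate event-camera output over a short time
# window and map per-region activity to a vibrotactile amplitude in [0, 1].
# The event layout (x, y, t, polarity), sensor/grid sizes, and the linear
# normalization are assumptions, not the published implementation.
import numpy as np

def events_to_intensity(events, sensor_shape=(260, 346), grid=(4, 4),
                        max_events_per_cell=200):
    """Return a grid of normalized motion intensities from one batch of events.

    events: array of shape (N, 4) with columns (x, y, t, polarity).
    """
    h, w = sensor_shape
    gh, gw = grid
    counts = np.zeros(grid, dtype=np.float32)
    if len(events):
        xs = np.clip(events[:, 0].astype(int), 0, w - 1)
        ys = np.clip(events[:, 1].astype(int), 0, h - 1)
        # Bin each event into a coarse spatial cell and count its activity;
        # more events in a cell means more apparent motion there.
        np.add.at(counts, (ys * gh // h, xs * gw // w), 1.0)
    # Normalize so a fully active cell saturates the vibration amplitude.
    return np.clip(counts / max_events_per_cell, 0.0, 1.0)

# Usage: process one window of events per video frame and drive each glove
# actuator with the amplitude of its corresponding grid cell.
rng = np.random.default_rng(0)
fake_events = np.column_stack([
    rng.integers(0, 346, 500),   # x
    rng.integers(0, 260, 500),   # y
    rng.uniform(0, 0.01, 500),   # timestamp (s)
    rng.integers(0, 2, 500),     # polarity
])
amplitudes = events_to_intensity(fake_events)
print(amplitudes.round(2))
```

Because this operates on sparse event counts rather than full pixel frames, it suggests how an event-domain pipeline can avoid per-pixel motion estimation, which is consistent with the speedups the abstract reports.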


Source: http://dx.doi.org/10.1109/TVCG.2024.3360468

Publication Analysis

Top Keywords: video motion (24); haptic feedback (12); force intensity (12); video (9); bio-inspired event (8); cinematic applications (8); emerging haptic (8); processing methods (8); motion (6); haptic (6)
