A surface texture is perceived through both the sounds and vibrations produced as our fingers explore it. Because they share a common origin, the two modalities strongly influence each other, particularly above 60 Hz, where vibrotactile perception and pitch perception share common neural processes. However, whether the sensation of rhythm is shared between audio and haptic perception remains an open question. In this study, we show striking similarities between the audio and haptic perception of rhythmic changes, and we demonstrate that the two modalities interact below 60 Hz. Using a new surface-haptic device to synthesize arbitrary audio-haptic textures, our psychophysical experiments demonstrate that the perception-threshold curves of audio and haptic rhythmic gradients are identical. Moreover, multimodal integration occurs when audio and haptic rhythmic gradients are congruent. We propose a multimodal model of rhythm perception to explain these observations. These findings suggest that audio and haptic signals are likely processed by common neural mechanisms for rhythm perception as well. They provide a framework for audio-haptic stimulus generation that can benefit nonverbal communication and modern human-machine interfaces.
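To make the notion of a "rhythmic gradient" concrete, the sketch below generates a vibrotactile-style signal whose pulse rate ramps linearly within the sub-60 Hz rhythm range, riding on a higher-frequency carrier. This is only a minimal illustration of the stimulus concept; the function name, parameter values, and synthesis approach are assumptions, not the authors' actual method or device output.

```python
import numpy as np

def rhythmic_gradient(duration_s=2.0, fs=1000, f_start=10.0, f_end=30.0,
                      carrier_hz=250.0):
    """Amplitude-modulated pulse train whose rhythm (pulse rate) ramps
    linearly from f_start to f_end Hz -- a simple 'rhythmic gradient'.
    Rhythm frequencies stay below 60 Hz; the carrier sits above it."""
    t = np.arange(0.0, duration_s, 1.0 / fs)
    # Instantaneous rhythm frequency ramps linearly over the sweep.
    f_inst = f_start + (f_end - f_start) * t / duration_s
    # Integrate the instantaneous frequency to obtain the modulator phase.
    phase = 2 * np.pi * np.cumsum(f_inst) / fs
    envelope = 0.5 * (1.0 + np.cos(phase))   # smooth pulses in [0, 1]
    carrier = np.sin(2 * np.pi * carrier_hz * t)
    return envelope * carrier

sig = rhythmic_gradient()  # 2 s of signal at 1 kHz sampling rate
```

The same envelope could drive an audio channel and a surface-haptic actuator simultaneously, which is the kind of congruent audio-haptic pairing the study manipulates.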
Full text: PMC (http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8907191) | DOI (http://dx.doi.org/10.1038/s41598-022-08152-w)
JMIR Rehabil Assist Technol
November 2024
Department of Biomedical Engineering, New York University Tandon School of Engineering, Brooklyn, NY, United States.
Background: Visual disability is a growing problem for many middle-aged and older adults. Conventional mobility aids, such as white canes and guide dogs, have notable limitations that have led to increasing interest in electronic travel aids (ETAs). Despite remarkable progress, current ETAs lack empirical validation in realistic testing environments and often focus on substituting or augmenting a single sense.
Neuroscience
December 2024
Unit for visually impaired (UVIP), Italian Institute of Technology, Genova, Italy.
Previous research has shown that visual impairment results in reduced auditory, tactile, and proprioceptive abilities. One hypothesis is that these issues arise from inaccurate body representations. Few studies have investigated metric body representations in a visually impaired population.
Sci Rep
September 2024
School of Electrical Engineering and Computer Science, The University of Queensland, Brisbane, QLD, 4072, Australia.
In Virtual Reality (VR), a higher level of presence positively influences a user's experience and engagement. Several parameters are responsible for generating different levels of presence in VR, including, but not limited to, graphical fidelity, multi-sensory stimuli, and embodiment. However, standard methods of measuring presence, including self-reported questionnaires, are biased.
IEEE Trans Vis Comput Graph
September 2024
The concept of an intelligent augmented reality (AR) assistant has significant, wide-ranging applications, with potential uses in the medical, military, and mechanical domains. Such an assistant must be able to perceive the environment and actions, reason about the environment state in relation to a given task, and seamlessly interact with the task performer. These interactions typically involve an AR headset equipped with sensors that capture video, audio, and haptic feedback.
PLoS One
September 2024
Chair of Acoustics and Haptics, Technische Universität Dresden, Dresden, Germany.
The study of perceived affective qualities (PAQs) in soundscape assessments has increased in recent years, with methods ranging from in-situ to laboratory settings. Through technological advances, virtual reality (VR) has facilitated the evaluation of multiple locations within the same experiment. In this paper, VR reproductions of different urban sites were presented in online and laboratory environments, testing three locations in Greater Manchester ('Park', 'Plaza', and pedestrian 'Street') at two population densities (empty and busy) using the ISO/TS 12913-2 (2018) soundscape PAQs.