Object manipulation produces characteristic sounds and causes specific haptic sensations that facilitate the recognition of the manipulated object. To identify the neural correlates of audio-haptic binding of object features, healthy volunteers underwent functional magnetic resonance imaging while they matched a target object to a sample object within and across audition and touch. By introducing a delay between the presentation of sample and target stimuli, it was possible to dissociate haptic-to-auditory and auditory-to-haptic matching. We hypothesized that only semantically coherent auditory and haptic object features activate cortical regions that host unified conceptual object representations. The left fusiform gyrus (FG) and posterior superior temporal sulcus (pSTS) showed increased activation during crossmodal matching of semantically congruent but not incongruent object stimuli. In the FG, this effect was found for haptic-to-auditory and auditory-to-haptic matching, whereas the pSTS only displayed a crossmodal matching effect for congruent auditory targets. Auditory and somatosensory association cortices showed increased activity during crossmodal object matching which was, however, independent of semantic congruency. Together, the results show multisensory interactions at different hierarchical stages of auditory and haptic object processing. Object-specific crossmodal interactions culminate in the left FG, which may provide a higher order convergence zone for conceptual object knowledge.
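As an illustrative sketch only (not taken from the paper), the delayed match-to-sample design described above implies a 2 x 2 factorial structure: matching direction (haptic-to-auditory vs. auditory-to-haptic) crossed with semantic congruency. The condition labels and contrast weights below are assumptions chosen to show how such a design might be coded for a first-level analysis, not the authors' actual model.

    # Hypothetical condition coding for the 2 x 2 design implied by the abstract.
    # Labels and contrast weights are illustrative assumptions.
    from itertools import product

    directions = ["haptic_to_auditory", "auditory_to_haptic"]
    congruency = ["congruent", "incongruent"]
    conditions = [f"{d}_{c}" for d, c in product(directions, congruency)]

    # A contrast isolating the congruency effect collapsed across matching direction,
    # analogous to the crossmodal matching effect reported for the fusiform gyrus.
    congruent_gt_incongruent = {
        cond: (1 if cond.endswith("_congruent") else -1) for cond in conditions
    }
    print(congruent_gt_incongruent)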


Source: http://dx.doi.org/10.1093/cercor/bhs076

Publication Analysis

Top Keywords

auditory haptic: 12
object: 12
haptic object: 12
multisensory interactions: 8
object features: 8
haptic-to-auditory auditory-to-haptic: 8
auditory-to-haptic matching: 8
conceptual object: 8
crossmodal matching: 8
auditory: 5

Similar Publications

Recent research has highlighted a notable confidence bias in the haptic sense, yet its impact on learning relative to other senses remains unexplored. This online study investigated learning behaviour across visual, auditory, and haptic modalities using a probabilistic selection task on computers and mobile devices, employing dynamic and ecologically valid stimuli to enhance generalisability. We analysed reaction time as an indicator of confidence, alongside learning speed and task accuracy.
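For readers unfamiliar with the paradigm, the following minimal sketch shows what a single trial of a generic probabilistic selection task might look like; the stimulus pairs, reward probabilities, and response interface are illustrative assumptions and are not taken from the cited study.

    # Generic probabilistic selection trial: a stimulus pair is shown, a choice is made,
    # and feedback is delivered probabilistically. Reaction time serves as a simple
    # confidence proxy. All values here are made up for illustration.
    import random, time

    PAIRS = {("A", "B"): 0.8, ("C", "D"): 0.7, ("E", "F"): 0.6}  # P(first item rewarded)

    def run_trial(pair, p_first_rewarded, choose):
        t0 = time.perf_counter()
        choice = choose(pair)                   # participant (or model) picks one item
        rt = time.perf_counter() - t0           # reaction time
        p_reward = p_first_rewarded if choice == pair[0] else 1 - p_first_rewarded
        rewarded = random.random() < p_reward   # probabilistic, not deterministic, feedback
        return choice, rt, rewarded

    pair = ("A", "B")
    print(run_trial(pair, PAIRS[pair], lambda p: random.choice(p)))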


Research into new solutions for wearable assistive devices for the visually impaired is an important area of assistive technology (AT). Such devices play a crucial role in improving the functionality and independence of visually impaired people, helping them to participate fully in daily life and in community activities. This study presents a bibliometric analysis of the literature on wearable assistive devices for the visually impaired published over the last decade, retrieved from the Web of Science Core Collection (WoSCC) and analysed with CiteSpace, to provide an overview of the current state of research, trends, and hotspots in the field.


The multisensory control of sequential actions.

Exp Brain Res

December 2024

Department of Medical and Translational Biology, Umeå University, S-901 87, Umeå, Sweden.

Many motor tasks consist of sequentially linked action phases, as when reaching for, lifting, transporting, and replacing a cup of coffee. During such tasks, discrete visual, auditory, and/or haptic feedback signals are typically associated with mechanical events at the completion of each action phase, as when breaking and subsequently making contact between the cup and the table. An emerging concept is that important sensorimotor control operations, which affect subsequent action phases, are centred on these discrete multisensory events.


Next-generation Robotics in Otology: The HEARO Procedure.

J Craniofac Surg

January 2025

Department of Otorhinolaryngology, Head and Neck Surgery, University Hospital UZ Brussel, Vrije Universiteit Brussel, Brussels Health Campus, Brussels, Belgium.

Article Synopsis
  • Cochlear implantation is a leading treatment for severe hearing loss, and the study focuses on improving the process through Robotically Assisted Cochlear Implantation Surgery (RACIS) using a method named HEARO for precise inner ear access.
  • A preclinical study using cadavers showed high accuracy in the surgical technique, with a successful insertion rate of 94% and no damage to critical structures like the facial nerve.
  • Future developments for RACIS will enhance the surgical process with features like haptic feedback, automated trajectory planning, and improved imaging technology for better outcomes.

To provide deeper immersion in virtual environments, force and torque feedback are required in addition to visual and auditory cues. In this paper, we develop a novel propeller-based Ungrounded Handheld Haptic Device (UHHD) that delivers both force and torque feedback in a single device, helping the user experience a realistic sensation of immersion in three-dimensional (3D) space. The proposed UHHD uses only a pair of propellers and a set of sliders to continuously generate the desired force and torque feedback up to 15 N and 1 N.
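As a rough sketch of the underlying mechanics (not the authors' control scheme), the net force on such a device is the vector sum of the propeller thrusts, and the net torque about the handle is the sum of r x F over the propellers; the propeller positions and thrust values below are invented example numbers.

    # Net force and torque from two propellers mounted on a handheld device.
    # Positions (m, relative to the handle) and thrusts (N) are illustrative values only.
    import numpy as np

    positions = np.array([[0.10, 0.0, 0.0], [-0.10, 0.0, 0.0]])
    thrusts   = np.array([[0.0, 0.0, 6.0],  [0.0, 0.0, -2.0]])

    net_force  = thrusts.sum(axis=0)                       # resultant force on the hand
    net_torque = np.cross(positions, thrusts).sum(axis=0)  # resultant torque about the handle
    print(net_force, net_torque)                           # [0. 0. 4.] N and [0. -0.8 0.] N*m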

