Representation of Auditory Motion Directions and Sound Source Locations in the Human Planum Temporale.

J Neurosci

Crossmodal Perception and Plasticity Laboratory, Center of Mind/Brain Sciences, University of Trento, I-38122 Trento, Italy.

Published: March 2019

The ability to compute the location and direction of sounds is a crucial perceptual skill for interacting efficiently with dynamic environments. How the human brain implements spatial hearing is, however, poorly understood. In our study, we used fMRI to characterize the brain activity of male and female humans listening to sounds moving left, right, up, and down, as well as to static sounds. Whole-brain univariate results contrasting moving and static sounds varying in their location revealed a robust functional preference for auditory motion in bilateral human planum temporale (hPT). Using independently localized hPT, we show that this region contains information about auditory motion directions and, to a lesser extent, sound source locations. Moreover, hPT showed an axis-of-motion organization reminiscent of the functional organization of the middle-temporal cortex (hMT+/V5) for vision. Importantly, whereas motion direction and location rely on partially shared pattern geometries in hPT, as demonstrated by successful cross-condition decoding, the responses elicited by static and moving sounds were nonetheless significantly distinct. Altogether, our results demonstrate that hPT codes for auditory motion and location, but that the underlying neural computation linked to motion processing is more reliable than, and partially distinct from, the one supporting sound source location.

Compared with what we know about visual motion, little is known about how the brain implements spatial hearing. Our study reveals that motion directions and sound source locations can be reliably decoded in the human planum temporale (hPT) and that they rely on partially shared pattern geometries. Our study therefore sheds important new light on how computing the location or direction of sounds is implemented in the human auditory cortex, by showing that those two computations rely on partially shared neural codes. Furthermore, our results show that the neural representation of moving sounds in hPT follows a "preferred axis of motion" organization, reminiscent of the coding mechanisms typically observed in the occipital middle-temporal cortex (hMT+/V5) region for computing visual motion.


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6433766
DOI: http://dx.doi.org/10.1523/JNEUROSCI.2289-18.2018


Similar Publications

Music can evoke powerful emotions in listeners. However, the role that instrumental music (music without any vocal part) plays in conveying extra-musical meaning, above and beyond emotions, is still a debated question. We conducted a study wherein participants (N = 121) listened to twenty 15-second-long excerpts of polyphonic instrumental soundtrack music and reported (i) perceived emotions (…).


Motion-Onset Visually Evoked Potentials (VEPs) are Amplified in The Deaf.

J Neurophysiol

January 2025

Department of Physiology, McGill University, Montreal, Quebec H3G 1Y6 Canada.

The loss of a sensory modality triggers a phenomenon known as cross-modal plasticity, where areas of the brain responsible for the lost sensory modality are reorganized and repurposed to the benefit of the remaining modalities. After perinatal or congenital deafness, superior visual motion detection abilities have been psychophysically identified in both humans and cats, and this advantage has been causally demonstrated to be mediated by reorganized auditory cortex. In our study, we investigated visually evoked potentials (VEPs) in response to motion-onset stimuli of varying speeds in both hearing and perinatally deafened cats under light anesthesia.


Background: Restoring pre-injury normal gait following Anterior Cruciate Ligament Reconstruction (ACLR) is a critical challenge. The purpose of this study was to compare spatiotemporal parameters in athletes following ACL reconstruction with healthy athletes when cognitive load and speed were manipulated.

Methods: Twenty male soccer players with an ACLR history and 20 healthy matched individuals completed walking tasks under four conditions: with and without a cognitive load (auditory Stroop task), and at preferred speed as well as high speed (20% higher than the individual's preferred speed).


Former studies have established that individuals with a cochlear implant (CI) for treating single-sided deafness experience improved speech processing after implantation. However, it is not clear how each ear contributes separately to improve speech perception over time at the behavioural and neural level. In this longitudinal EEG study with four different time points, we measured neural activity in response to various temporally and spectrally degraded spoken words presented monaurally to the CI and non-CI ears (5 left and 5 right ears) in 10 single-sided CI users and 10 age- and sex-matched individuals with normal hearing.


The role of attention in eliciting a musically induced visual motion aftereffect.

Atten Percept Psychophys

January 2025

Department of Psychology, Huron University College at Western: London, 1349 Western Road, London, ON, N6G 1H3, Canada.

Previous studies have reported visual motion aftereffects (MAEs) following prolonged exposure to auditory stimuli depicting motion, such as ascending or descending musical scales. The role of attention in modulating these cross-modal MAEs, however, remains unclear. The present study manipulated the level of attention directed to musical scales depicting motion and assessed subsequent changes in MAE strength.

