Aural localization of silent objects by active human biosonar: neural representations of virtual echo-acoustic space.

Eur J Neurosci

Graduate School of Systemic Neuroscience, Ludwig-Maximilians-Universität München, Großhaderner Str. 2, 82152, Planegg-Martinsried, Germany; German Center for Vertigo and Balance Disorders, Ludwig-Maximilians-Universität München, Munich, Germany; Division of Neurobiology, Department Biologie II, Ludwig-Maximilians-Universität München, Großhaderner Str. 2, 82152, Planegg-Martinsried, Germany.

Published: March 2015

AI Article Synopsis

  • Some blind individuals can use tongue clicks to detect and locate objects, showing increased activity in their 'visual' cortex while doing so.
  • A new virtualization technique let participants perform echolocation tasks with self-generated sounds inside an MRI scanner while their brain activity was recorded.
  • Results indicated that blind echolocation experts had enhanced brain activation in response to sound reflections, while sighted individuals employed different brain areas for the same tasks.

Article Abstract

Some blind humans have developed the remarkable ability to detect and localize objects through the auditory analysis of self-generated tongue clicks. These echolocation experts show a corresponding increase in 'visual' cortex activity when listening to echo-acoustic sounds. Echolocation in real-life settings involves multiple reflections as well as active sound production, neither of which has been systematically addressed. We developed a virtualization technique that allows participants to actively perform such biosonar tasks in virtual echo-acoustic space during magnetic resonance imaging (MRI). Tongue clicks, emitted in the MRI scanner, are picked up by a microphone, convolved in real time with the binaural impulse responses of a virtual space, and presented via headphones as virtual echoes. In this manner, we investigated the brain activity during active echo-acoustic localization tasks. Our data show that, in blind echolocation experts, activations in the calcarine cortex are dramatically enhanced when a single reflector is introduced into otherwise anechoic virtual space. A pattern-classification analysis revealed that, in the blind, calcarine cortex activation patterns could discriminate left-side from right-side reflectors. This was found in both blind experts, but the effect was significant for only one of them. In sighted controls, 'visual' cortex activations were insignificant, but activation patterns in the planum temporale were sufficient to discriminate left-side from right-side reflectors. Our data suggest that blind and echolocation-trained, sighted subjects may recruit different neural substrates for the same active-echolocation task.
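The virtualization pipeline described in the abstract (click in, binaural echoes out) can be sketched in a few lines. The Python below is a minimal offline illustration under assumed parameters: a synthetic click stands in for the recorded tongue click, toy single-reflection impulse responses stand in for measured binaural impulse responses, and the 48 kHz sampling rate is an assumption. It shows the signal path, not the authors' real-time implementation.

```python
import numpy as np
from scipy.signal import fftconvolve

FS = 48_000  # sampling rate in Hz (assumed)

def make_click(fs=FS, dur=0.003):
    """Synthetic broadband transient standing in for a recorded tongue click."""
    t = np.arange(int(fs * dur)) / fs
    rng = np.random.default_rng(0)
    return np.exp(-t / 0.0005) * rng.standard_normal(t.size)

def single_reflection_ir(delay_s, gain, n, fs=FS):
    """Toy impulse response: one attenuated spike at the echo's round-trip delay."""
    h = np.zeros(n)
    h[int(delay_s * fs)] = gain
    return h

# Hypothetical binaural impulse responses for a reflector on the listener's left:
# the echo reaches the left ear slightly earlier and louder than the right ear.
n_ir = FS // 10                                 # 100 ms impulse responses
h_l = single_reflection_ir(0.0050, 0.30, n_ir)  # left ear: 5.0 ms, stronger
h_r = single_reflection_ir(0.0056, 0.18, n_ir)  # right ear: later and weaker

click = make_click()
# In the scanner the convolution ran in real time on microphone input; offline
# fftconvolve is enough to show the path: click -> impulse response -> echo.
out = np.stack([fftconvolve(click, h_l), fftconvolve(click, h_r)], axis=1)
# 'out' is the 2-channel signal that would be presented over headphones.
```

Updating the impulse responses block by block as the virtual reflector moves would turn this offline sketch into the real-time scheme the abstract describes.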


Source: http://dx.doi.org/10.1111/ejn.12843

Publication Analysis

Top Keywords (frequency)

virtual echo-acoustic (8)
echo-acoustic space (8)
tongue clicks (8)
echolocation experts (8)
'visual' cortex (8)
virtual space (8)
data blind (8)
calcarine cortex (8)
activation patterns (8)
discriminate left-side (8)

Similar Publications

Navigation and perception of spatial layout in virtual echo-acoustic space.

Cognition

April 2020

Department of Psychology, Durham University, Durham DH1 3LE, UK. Electronic address:

Successful navigation involves finding the way, planning routes, and avoiding collisions. Whilst previous research has shown that people can navigate using non-visual cues, it is not clear to what degree learned non-visual navigational abilities generalise to 'new' environments. Furthermore, the ability to successfully avoid collisions has not been investigated separately from the ability to perceive spatial layout or to orient oneself in space.


Flutter sensitivity in FM bats. Part I: delay modulation.

J Comp Physiol A Neuroethol Sens Neural Behav Physiol

November 2018

Department Biology II, Ludwig Maximilians University Munich, Großhaderner Str. 2, 82152, Martinsried, Germany.

Echolocating bats measure target distance by the time delay between call and echo. Target movement such as the flutter of insect wings induces delay modulations. Perception of delay modulations has been studied extensively in bats, but only concerning how well bats discriminate flutter frequencies, never with regard to flutter magnitude.
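For a sense of scale (a back-of-the-envelope sketch; the numbers below are illustrative and not taken from the study): the call-echo delay maps to target distance via half the round trip, and a fluttering wing modulates that delay by twice its excursion divided by the speed of sound.

```python
# Toy conversion between echo delay and target distance, and the delay
# modulation produced by a sinusoidally fluttering target (assumed values).
import numpy as np

C = 343.0  # speed of sound in air, m/s (at roughly 20 degrees C)

def distance_from_delay(delay_s, c=C):
    """Sound travels to the target and back, so distance is half the round trip."""
    return c * delay_s / 2.0

print(distance_from_delay(0.005))  # 5 ms echo delay -> ~0.86 m

# A wing fluttering with amplitude a (m) at frequency f (Hz) sweeps the
# round-trip delay around the carrier delay for a target at ~0.86 m:
a, f = 0.005, 50.0            # 5 mm flutter amplitude, 50 Hz wingbeat (assumed)
t = np.linspace(0, 0.1, 1000)
delay = 2 * (0.86 + a * np.sin(2 * np.pi * f * t)) / C  # time-varying delay
print(delay.max() - delay.min())  # peak-to-peak modulation ~ 4*a/C, about 58 us
```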


Echolocating bats use echoes of their sonar emissions to determine the position and distance of objects or prey. Target distance is represented as a map of echo delay in the auditory cortex (AC) of bats. During a bat's flight through a natural complex environment, echo streams are reflected from multiple objects along its flight path.


Self-motion facilitates echo-acoustic orientation in humans.

R Soc Open Sci

November 2014

Division of Neurobiology, Department Biologie II, Ludwig-Maximilians-Universität München, Großhadernerstr. 2, 82152 Planegg-Martinsried, Germany; Graduate School of Systemic Neurosciences, Ludwig-Maximilians-Universität München, Großhadernerstr. 2, 82152 Planegg-Martinsried, Germany.

The ability of blind humans to navigate complex environments through echolocation has received rapidly increasing scientific interest. However, technical limitations have precluded a formal quantification of the interplay between echolocation and self-motion. Here, we use a novel virtual echo-acoustic space technique to formally quantify the influence of self-motion on echo-acoustic orientation.
