Coding of vocalizations by single neurons in ventrolateral prefrontal cortex.

Hear Res

Dept. Neurobiology & Anatomy, Univ. of Rochester, Box 603, Rochester, NY 14642, USA.

Published: November 2013

AI Article Synopsis

  • Neurons in the ventrolateral prefrontal cortex (VLPFC) of non-human primates respond to species-specific vocalizations, and many also respond to simultaneously presented faces, linking vocal and visual information in this region.
  • In this study, about 19% of VLPFC neurons encoded vocalization call type, with fewer cells encoding caller identity, and classification performance averaged about 42% across the population.
  • The study suggests that combining vocalizations with facial cues may enhance the VLPFC's ability to process communication calls, as behavioral evidence shows better recognition and memory when both types of stimuli are presented together.

Article Abstract

Neuronal activity in single prefrontal neurons has been correlated with behavioral responses, rules, task variables and stimulus features. In the non-human primate, neurons recorded in ventrolateral prefrontal cortex (VLPFC) have been found to respond to species-specific vocalizations. Previous studies have found multisensory neurons which respond to simultaneously presented faces and vocalizations in this region. Behavioral data suggest that face and vocal information are inextricably linked in animals and humans and therefore may also be tightly linked in the coding of communication calls in prefrontal neurons. In this study we therefore examined the role of VLPFC in encoding vocalization call type information. Specifically, we examined previously recorded single-unit responses from the VLPFC of awake, behaving rhesus macaques to 3 types of species-specific vocalizations made by 3 individual callers. Analysis of responses by vocalization call type and caller identity showed that ∼19% of cells had a main effect of call type, with fewer cells encoding caller. Classification performance of VLPFC neurons was ∼42% averaged across the population. When assessed at discrete time bins, classification performance reached 70% for coos in the first 300 ms and remained above chance for the duration of the response period, though performance was lower for other call types. In light of the sub-optimal classification performance of the majority of VLPFC neurons when only vocal information is present, and the recent evidence that most VLPFC neurons are multisensory, the potential enhancement of classification with the addition of accompanying face information is discussed and additional studies are recommended. Behavioral and neuronal evidence has shown a considerable benefit in recognition and memory performance when faces and voices are presented simultaneously. In the natural environment both facial and vocal information are present simultaneously, and neural systems no doubt evolved to integrate multisensory stimuli during recognition. This article is part of a Special Issue entitled "Communication Sounds and the Brain: New Directions and Perspectives".
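
The abstract reports per-time-bin classification performance but does not describe the decoding procedure itself. Purely as an illustration of what "classification performance assessed at discrete time bins" can look like, the sketch below runs a leave-one-out nearest-centroid decoder over simulated binned spike counts; the classifier choice, trial counts, and bin width are assumptions for demonstration and are not taken from the study's methods.

```python
# Hypothetical sketch (not the study's analysis): decoding call type from
# binned spike counts with a leave-one-out nearest-centroid classifier.
# Call-type/caller/trial counts and bin width are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_call_types, n_callers, n_trials = 3, 3, 10   # e.g. coo, grunt, scream from 3 callers
n_bins = 5                                     # e.g. five 300 ms bins

# Simulated spike counts, shape (trials, bins); labels index call type.
X = rng.poisson(lam=5.0,
                size=(n_call_types * n_callers * n_trials, n_bins)).astype(float)
y = np.repeat(np.arange(n_call_types), n_callers * n_trials)
X += y[:, None] * 2.0   # inject a call-type-dependent rate difference

def decode_bin(counts, labels):
    """Leave-one-out nearest-centroid accuracy for one time bin."""
    correct = 0
    for i in range(len(counts)):
        train = np.arange(len(counts)) != i
        centroids = [counts[train & (labels == c)].mean() for c in np.unique(labels)]
        pred = int(np.argmin([abs(counts[i] - m) for m in centroids]))
        correct += int(pred == labels[i])
    return correct / len(counts)

# Per-bin accuracy, compared against the 1-in-3 chance level.
for b in range(n_bins):
    print(f"bin {b}: accuracy = {decode_bin(X[:, b], y):.2f}  (chance = 0.33)")
```

In this toy setup, accuracy in each time bin can be compared against the one-in-three chance level, analogous to the above-chance, time-resolved performance the study reports for coos.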

Download full-text PDF

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3979279
DOI: http://dx.doi.org/10.1016/j.heares.2013.07.011

Publication Analysis

Top Keywords

call type (12)
classification performance (12)
vlpfc neurons (12)
neurons (8)
ventrolateral prefrontal (8)
prefrontal cortex (8)
prefrontal neurons (8)
species-specific vocalizations (8)
vocalization call (8)
vlpfc (6)

