Evidence for a supra-modal representation of emotion from cross-modal adaptation.

Cognition

School of Psychology, Bangor University, Gwynedd LL57 2AS, UK.

Published: January 2015

Successful social interaction hinges on accurate perception of emotional signals. These signals are typically conveyed multi-modally by the face and voice. Previous research has demonstrated uni-modal contrastive aftereffects for emotionally expressive faces or voices. Here we were interested in whether these aftereffects transfer across modality as theoretical models predict. We show that adaptation to facial expressions elicits significant auditory aftereffects. Adaptation to angry facial expressions caused ambiguous vocal stimuli drawn from an anger-fear morphed continuum to be perceived as less angry and more fearful relative to adaptation to fearful faces. In a second experiment, we demonstrate that these aftereffects are not dependent on learned face-voice congruence, i.e. adaptation to one facial identity transferred to an unmatched voice identity. Taken together, our findings provide support for a supra-modal representation of emotion and suggest further that identity and emotion may be processed independently from one another, at least at the supra-modal level of the processing hierarchy.


Source
http://dx.doi.org/10.1016/j.cognition.2014.11.001

Publication Analysis

Top Keywords (frequency): supra-modal representation (8), representation emotion (8), adaptation facial (8), facial expressions (8), adaptation (5), evidence supra-modal (4), emotion cross-modal (4), cross-modal adaptation (4), adaptation successful (4), successful social (4)

Similar Publications

Growing evidence suggests that conceptual knowledge influences emotion perception, yet the neural mechanisms underlying this effect are not fully understood. Recent studies have shown that brain representations of facial emotion categories in visual-perceptual areas are predicted by conceptual knowledge, but it remains to be seen if auditory regions are similarly affected. Moreover, it is not fully clear whether these conceptual influences operate at a modality-independent level.


Spontaneous supra-modal encoding of number in the infant brain.

Curr Biol

May 2023

Cognitive Neuroimaging Unit U992, Institut National de la Santé et de la Recherche Médicale, Commissariat à l'Énergie Atomique et aux Énergies Alternatives, Direction de la Recherche Fondamentale/Institut Joliot, Centre National de la Recherche Scientifique ERL9003, NeuroSpin Center, Université Paris-Saclay, 91191 Gif-sur-Yvette, France.

The core knowledge hypothesis postulates that infants automatically analyze their environment along abstract dimensions, including numbers. According to this view, approximate numbers should be encoded quickly, pre-attentively, and in a supra-modal manner by the infant brain. Here, we directly tested this idea by submitting the neural responses of sleeping 3-month-old infants, measured with high-density electroencephalography (EEG), to decoders designed to disentangle numerical and non-numerical information.


To achieve visual space constancy, our brain remaps eye-centered projections of visual objects across saccades. Here, we measured saccade trajectory curvature following the presentation of visual, auditory, and audiovisual distractors in a double-step saccade task to investigate if this stability mechanism also accounts for localized sounds. We found that saccade trajectories systematically curved away from the position at which either a light or a sound was presented, suggesting that both modalities are represented in eye-centered oculomotor centers.


Spatial attention and the spatial representation of time are strictly linked in the human brain. In young adults, a leftward shift of spatial attention induced by prismatic adaptation (PA) is associated with an underestimation of time, whereas a rightward shift is associated with an overestimation, for both visual and auditory stimuli. These results suggest a supra-modal, left-to-right oriented representation of time that is modulated by bilateral attentional shifts.


The distinction between nouns and verbs is a language universal. Yet functional neuroimaging studies comparing noun and verb processing have yielded inconsistent findings, ranging from a complete frontal (verb) versus temporal (noun) dichotomy to a complete overlap in activation patterns. The current study addressed the debate about neural distinctions between nouns and verbs by conducting an activation likelihood estimation (ALE) meta-analysis based on probabilistic cytoarchitectonic maps.

