Seeing articulatory movements influences perception of auditory speech. This is often reflected in a shortened latency of auditory event-related potentials (ERPs) generated in the auditory cortex. The present study addressed whether this early neural correlate of audiovisual interaction is modulated by attention.
fMRI was performed with dynamic facial expressions of fear and happiness to detect differences in valence processing between 25 subjects with autism spectrum disorders (ASDs) and 27 typically developing controls. Valence scaling was abnormal in ASDs.
To take a step towards real-life-like experimental setups, we simultaneously recorded magnetoencephalographic (MEG) signals and subjects' gaze direction during audiovisual speech perception. The stimuli were utterances of /apa/ dubbed onto two side-by-side female faces articulating /apa/ (congruent) and /aka/ (incongruent) in synchrony, repeated once every 3 s. Subjects (N = 10) were free to decide which face they viewed, and responses were averaged into two categories according to the gaze direction.
We suggest a unified concept of consciousness and emotion, based on the systemic cognitive neuroscience perspective that regards organisms as active and goal-directed. We criticize the idea that consciousness and emotion are psychological phenomena with quite different neurophysiological mechanisms. We argue that both characterize a unified systemic organization of behavior, but at different levels.