Trait inferences from first impressions are drawn rapidly and spontaneously. However, the COVID-19 pandemic forced interactions online, introducing different factors that influence first impressions. Despite this, there is an absence of research investigating the effect of video backgrounds on impression formation during videoconferencing.
Previous work has shown that different sensory channels are prioritized across the life course, with children preferentially responding to auditory information. The aim of the current study was to investigate whether the mechanism that drives this auditory dominance in children occurs at the level of encoding (overshadowing) or when the information is integrated to form a response (response competition). Given that response competition is dependent on a modality integration attempt, a combination of stimuli that could not be integrated was used so that if children's auditory dominance persisted, this would provide evidence for the overshadowing over the response competition mechanism.
The rise of the novel COVID-19 virus has made face masks commonplace items around the globe. Recent research found that face masks significantly impair emotion recognition on isolated faces. However, faces are rarely seen in isolation, and the body is also a key cue for emotional portrayal.
Effective emotion recognition is imperative to successfully navigating social situations. Research suggests differing developmental trajectories for the recognition of bodily and vocal emotion, but emotions are usually studied in isolation and rarely considered as multimodal stimuli in the literature. When adults are presented with basic multimodal sensory stimuli, the Colavita effect suggests that they have a visual dominance, whereas more recent research finds that an auditory sensory dominance may be present in children under 8 years of age.
Recent models of emotion recognition suggest that when people perceive an emotional expression, they partially activate the respective emotion in themselves, providing a basis for the recognition of that emotion. Much of the focus of these models and of their evidential basis has been on sensorimotor simulation as a basis for facial expression recognition - the idea, in short, that coming to know what another feels involves simulating in your brain the motor plans and associated sensory representations engaged by the other person's brain in producing the facial expression that you see. In this review article, we argue that simulation accounts of emotion recognition would benefit from three key extensions.
Functional localizers allow the definition of regions of interest in the human brain that cannot be delineated by anatomical markers alone. To date, when localizing the body-selective areas of the visual cortex using fMRI, researchers have used static images of bodies and objects. However, there are other relevant brain areas involved in the processing of moving bodies and action interpretation that are missed by these techniques.
Emotions are strongly conveyed by the human body, and the ability to recognize emotions from body posture or movement is still developing through childhood and adolescence. To date, very few studies have explored how these behavioural observations are paralleled by functional brain development. Furthermore, no studies have yet explored the development of emotion modulation in these areas.
Converging evidence demonstrates that emotion processing from facial expressions continues to improve throughout childhood and part of adolescence. Here we investigated whether this is also the case for emotions conveyed by non-linguistic vocal expressions, another key aspect of social interactions. We tested 225 children and adolescents (age 5-17) and 30 adults in a forced-choice labeling task using vocal bursts expressing four basic emotions (anger, fear, happiness and sadness).
Our ability to read other people's non-verbal signals is refined throughout childhood and adolescence. How this is paralleled by brain development has been investigated mainly with regard to face perception, showing a protracted functional development of the face-selective visual cortical areas. In view of the importance of whole-body expressions in interpersonal communication, it is also necessary to understand the development of brain areas sensitive to these social signals.