This study investigated emotional reactions to cybersecurity breaches. Based on prior research, a context-specific instrument was developed covering all five emotion components identified by the componential emotion approach.
The presence of a change in a visual scene can influence brain activity and behavior, even in the absence of full conscious report. We may be able to sense that such a change has occurred, even if we cannot specify exactly where or what it was. Despite existing evidence from electroencephalogram (EEG) and eye-tracking data, it is still unclear how this partial level of awareness relates to functional magnetic resonance imaging (fMRI) blood oxygen level dependent (BOLD) activation.
Background: With the ever-expanding interconnectedness of the internet, and especially with the recent development of the Internet of Things, people are increasingly at risk of cybersecurity breaches that can have far-reaching consequences for their personal and professional lives, including psychological and mental health ramifications.
Objective: We aimed to identify the dimensional structure of emotion processes triggered by one of the most emblematic cybersecurity breach scenarios, the hacking of one's smart security camera, and to explore which personality characteristics systematically relate to these emotion dimensions.
Methods: A total of 902 participants from the United Kingdom and the Netherlands reported their emotion processes triggered by a cybersecurity breach scenario.
Previous studies of change blindness have suggested a distinction between detection and localisation of changes in a visual scene. Using a simple paradigm with an array of coloured squares, the present study aimed to further investigate differences in event-related potentials (ERPs) between trials in which participants could detect the presence of a colour change but not identify the location of the change (sense trials), versus those where participants could both detect and localise the change (localise trials). Individual differences in performance were controlled for by adjusting the difficulty of the task in real time.
Front Hum Neurosci, October 2017
Beat perception is fundamental to how we experience music, and yet the mechanism behind this spontaneous building of the internal beat representation is largely unknown. Existing findings support links between the tempo (speed) of the beat and enhancement of electroencephalogram (EEG) activity at tempo-related frequencies, but there are no studies looking at how tempo may affect the underlying long-range interactions between EEG activity at different electrodes. The present study investigates these long-range interactions using EEG activity recorded from 21 volunteers listening to music stimuli played at 4 different tempi (50, 100, 150 and 200 beats per minute).
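The tempo-related enhancement of EEG activity described above can be illustrated with a minimal sketch. This is not the study's analysis pipeline: the single simulated channel, sampling rate, tempo, and signal model are all illustrative assumptions. The sketch simply shows how spectral power at a beat-related frequency can be compared against an unrelated control frequency.

```python
import numpy as np

def power_at_frequency(signal, fs, freq):
    """Return spectral power of `signal` (sampled at `fs` Hz) at the bin nearest `freq` Hz."""
    n = len(signal)
    spectrum = np.abs(np.fft.rfft(signal)) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    idx = np.argmin(np.abs(freqs - freq))
    return spectrum[idx] ** 2

# Simulate 10 s of "EEG" containing an entrained component at the beat frequency.
fs = 250                      # sampling rate in Hz (illustrative)
bpm = 100                     # one of the study's tempi: 100 beats per minute
beat_hz = bpm / 60.0          # ~1.67 Hz beat frequency
t = np.arange(0, 10, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = 0.5 * np.sin(2 * np.pi * beat_hz * t) + rng.normal(0, 0.5, len(t))

# Power at the beat frequency should exceed power at an unrelated frequency.
p_beat = power_at_frequency(eeg, fs, beat_hz)
p_ctrl = power_at_frequency(eeg, fs, 7.3)
print(p_beat > p_ctrl)  # → True
```

Real EEG analyses would average over trials and electrodes and correct for the 1/f spectral background; this sketch only captures the core comparison of tempo-related versus unrelated frequencies.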
The physical environment leads to a thermal sensation that is perceived and appraised by occupants. The present study focuses on the relationship between sensation and evaluation. We asked 166 people to recall a thermal event from their recent past.
We are sympathetic with Bentley et al.'s attempt to encompass the wisdom of crowds in a generative model, but posit that a successful attempt at using big data will include more sensitive measurements, more varied sources of information, and will also build from the indirect information available through technology, from ancillary technical features to data from brain-computer interfaces.
Emotional reactivity and the time taken to recover, particularly from negative, stressful events, are inextricably linked, and both are crucial for maintaining well-being. It is unclear, however, to what extent emotional reactivity during stimulus onset predicts the time course of recovery after stimulus offset. To address this question, 25 participants viewed arousing (negative and positive) and neutral pictures from the International Affective Picture System (IAPS) followed by task-relevant face targets, which were to be gender categorized.
In this article, we present FACSGen 2.0, new animation software for creating static and dynamic three-dimensional facial expressions on the basis of the Facial Action Coding System (FACS). FACSGen permits total control over the action units (AUs), which can be animated at all levels of intensity and applied alone or in combination to an infinite number of faces.
To investigate the mechanisms involved in automatic processing of facial expressions, we used the QUEST procedure to measure the display durations needed to make a gender decision on emotional faces portraying fearful, happy, or neutral facial expressions. In line with predictions of appraisal theories of emotion, our results showed greater processing priority of emotional stimuli regardless of their valence. Whereas all experimental conditions led to an averaged threshold of about 50 ms, fearful and happy facial expressions led to significantly less variability in the responses than neutral faces.
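The QUEST procedure mentioned above is a Bayesian adaptive method for estimating psychophysical thresholds. As a hedged sketch of the general adaptive-threshold idea only (not the study's actual procedure), a much simpler 1-up/1-down staircase run on a simulated observer looks like this; the observer model and every parameter here are illustrative assumptions.

```python
import random

def staircase_threshold(true_threshold_ms=50.0, start_ms=100.0,
                        step_ms=4.0, n_trials=200, seed=1):
    """Estimate the display duration at ~50% correct via a 1-up/1-down rule."""
    rng = random.Random(seed)
    duration = start_ms
    history = []
    for _ in range(n_trials):
        # Simulated observer: accuracy rises linearly with display duration.
        p_correct = min(1.0, max(0.0, 0.5 + (duration - true_threshold_ms) / 100.0))
        correct = rng.random() < p_correct
        # Shorten the display after a correct response, lengthen it after an error.
        duration += -step_ms if correct else step_ms
        duration = max(step_ms, duration)
        history.append(duration)
    # Average the final trials for a stabler threshold estimate.
    return sum(history[-50:]) / 50

est = staircase_threshold()
print(est)  # converges near the simulated 50 ms threshold
```

Averaging the final trials damps the random walk around the convergence point; QUEST instead maintains a full posterior over the threshold and places each trial at its current best estimate, which converges faster than this fixed-step rule.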
The goal of this study was to examine behavioral and electrophysiological correlates of involuntary orienting toward rapidly presented angry faces in non-anxious, healthy adults, using a dot-probe task in conjunction with high-density event-related potentials and a distributed source localization technique. Consistent with previous studies, participants showed hypervigilance toward angry faces, as indexed by facilitated response time for validly cued probes following angry faces and an enhanced P1 component. An opposite pattern was found for happy faces, suggesting that attention was directed toward the relatively more threatening stimuli within the visual field (neutral faces).
For more than half a century, emotion researchers have attempted to establish the dimensional space that most economically accounts for similarities and differences in emotional experience. Today, many researchers focus exclusively on two-dimensional models involving valence and arousal. Adopting a theoretically based approach, we show for three languages that four dimensions are needed to satisfactorily represent similarities and differences in the meaning of emotion words.