Objective: Time perception is fundamental to human experience. How the sensory modality of a stimulus (e.g., images vs. sounds) affects time judgments has long attracted researchers' attention. However, to date, no study has directly compared the effects of emotional stimuli from two different sensory modalities on time judgments.
Methods: In the present two studies, healthy participants were asked to estimate the duration of a pure sound preceded by either odors or emotional videos presented as priming stimuli (an implicit emotion-eliciting task). During the task, skin conductance (SC) was measured as an index of arousal.
Results: Olfactory stimuli resulted in an increase in SC and a constant time overestimation. Video stimuli resulted in an increase in SC (emotional arousal), which decreased linearly over time. Critically, video stimuli resulted in an initial time underestimation, which shifted progressively towards a time overestimation. These results suggest that video stimuli recruited both arousal-related and attention-related mechanisms, and that the role played by these mechanisms changed over time.
Conclusions: These pilot studies highlight the importance of comparing the effects of different kinds of emotional stimuli in temporal estimation tasks, and suggest that odors are well suited to investigating arousal-related temporal distortions, whereas videos are ideal for investigating both arousal-related and attention-related mechanisms.
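As a hedged illustration of the dependent measure described above, the sketch below computes a duration-estimation ratio (estimated/actual duration), where values above 1 indicate overestimation and below 1 underestimation. The trial durations and responses are invented for illustration only; the original studies may have scored responses differently.

```python
import numpy as np

# Hypothetical scoring of a duration-estimation task (illustrative only).
# Ratios > 1 indicate time overestimation; ratios < 1 indicate underestimation.
actual_ms = 2000  # assumed duration of the pure sound on each trial
estimates_ms = np.array([2300, 2150, 1800, 1700, 2400, 2250])  # fake responses

ratios = estimates_ms / actual_ms
print("mean estimation ratio:", ratios.mean().round(3))
print("overestimation trials:", int((ratios > 1).sum()), "of", len(ratios))
```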
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4464069
DOI: http://dx.doi.org/10.3389/fnbeh.2015.00143
PLoS One
January 2025
Department of Psychology, Theoretical Cognitive Science Group, Philipps-Universität Marburg, Marburg, Germany.
Introduction: To interact with the environment, it is crucial to distinguish between sensory information that is externally generated and inputs that are self-generated. The sensory consequences of one's own movements tend to induce attenuated behavioral and neural responses compared to externally generated inputs. We propose a computational model of sensory attenuation (SA) based on Bayesian Causal Inference, where SA occurs when an internal cause for sensory information is inferred.
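A minimal, hypothetical sketch of that idea follows: it infers the posterior probability that a sensory signal was self-generated (internal cause) and attenuates the response accordingly. This is not the authors' model; the motor prediction, noise parameters, and prior are assumptions made purely for illustration.

```python
import numpy as np

def gauss(x, mu, sigma):
    # Gaussian likelihood of observation x given mean mu and std sigma
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def p_internal_cause(observation, prediction,
                     sigma_obs=1.0, sigma_ext=3.0, prior_internal=0.5):
    """Posterior probability that the observation was self-generated.

    Under the internal-cause hypothesis the observation is expected near the
    motor-based prediction; under the external-cause hypothesis it is drawn
    from a broader distribution centered on zero. All parameters are assumed.
    """
    like_internal = gauss(observation, prediction, sigma_obs)
    like_external = gauss(observation, 0.0, sigma_ext)
    return (prior_internal * like_internal) / (
        prior_internal * like_internal + (1 - prior_internal) * like_external)

# Sensory attenuation as described in the abstract: scale down the response
# in proportion to the inferred probability of an internal (self-generated) cause.
obs, pred = 1.1, 1.0
attenuated_response = obs * (1 - p_internal_cause(obs, pred))
print(round(attenuated_response, 3))
```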
Brain Sci
January 2025
School of Health Science and Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China.
Background: Virtual reality (VR) has become a transformative technology with applications in gaming, education, healthcare, and psychotherapy. The subjective experiences in VR vary based on the virtual environment's characteristics, and electroencephalography (EEG) is instrumental in assessing these differences. By analyzing EEG signals, researchers can explore the neural mechanisms underlying cognitive and emotional responses to VR stimuli.
NeuroSci
January 2025
Psychological Neuroscience Laboratory, Psychology Research Center, School of Psychology, University of Minho, Rua da Universidade, 4710-057 Braga, Portugal.
Human point-light displays consist of luminous dots representing human articulations, thus depicting actions without pictorial information. These stimuli are widely used in action recognition experiments. Because humans excel in decoding human motion, point-light displays (PLDs) are often masked with additional moving dots (noise masks), thereby challenging stimulus recognition.
Commun Biol
January 2025
Western Institute for Neuroscience, Western University, London, ON, Canada.
Our brain seamlessly integrates distinct sensory information to form a coherent percept. However, when real-world audiovisual events are perceived, the specific brain regions and time courses involved in processing different levels of information remain underexplored. To address this, we curated naturalistic videos and recorded functional magnetic resonance imaging (fMRI) and electroencephalography (EEG) data while participants viewed the videos with accompanying sounds.
Mem Cognit
January 2025
Department of Linguistics, University of California San Diego, 9500 Gilman Drive, La Jolla, CA, 92093-0108, USA.
Research shows that insufficient language access in early childhood significantly affects language processing. While the majority of this work focuses on syntax, phonology also appears to be affected, though it is unclear exactly how. Here we investigated phonological production across age of acquisition of American Sign Language (ASL).