Depression is a severe psychological condition that affects millions of people worldwide. As depression has received more attention in recent years, it has become imperative to develop automatic methods for detecting depression. Although numerous machine learning methods have been proposed for estimating the levels of depression via audio, visual, and audiovisual emotion sensing, several challenges still exist.
Previous studies have shown that the visual attention effect can spread automatically to the task-irrelevant auditory modality through either a stimulus-driven binding process or a representation-driven priming process. Using an attentional blink paradigm, the present study investigated whether the long-latency stimulus-driven and representation-driven cross-modal spread of attention would be inhibited or facilitated when the attentional resources operating at the post-perceptual stage of processing are inadequate, while ensuring that all visual stimuli were spatially attended and that the representations of visual target object categories were activated, conditions previously thought to be the only endogenous prerequisites for triggering cross-modal spread of attention. The results demonstrated that both types of attentional spreading were completely suppressed during the attentional blink interval but were highly prominent outside it, with the stimulus-driven process being independent of, and the representation-driven process dependent on, audiovisual semantic congruency.
The present study recorded event-related potentials (ERPs) in a visual object-recognition task under the attentional blink paradigm to explore the temporal dynamics of the cross-modal boost on the attentional blink and to test whether this auditory benefit is modulated by semantic congruency between T2 and the simultaneous sound. Behaviorally, not only a semantically congruent but also a semantically incongruent sound improved T2 discrimination during the attentional blink interval, although the enhancement was larger for the congruent sound. The ERP results revealed that the behavioral improvements induced by both the semantically congruent and incongruent sounds were closely associated with an early cross-modal interaction on the occipital N195 (192-228 ms).
Two identical visual disks moving toward each other on a two-dimensional (2D) display are more likely to be perceived as "streaming through" than "bouncing off" each other after their coincidence. However, either a brief auditory tone or a visual flash presented at the moment of the disks' coincidence can strikingly increase the incidence of the bouncing percept. Although the neural substrates underlying the sound-induced bouncing effect have been widely investigated, little is known about the neural mechanisms underlying the flash-induced bouncing effect.