Publications by authors named "Baole Fu"

Emotion recognition based on electroencephalogram (EEG) signals is crucial for understanding human affective states. However, current research is limited in its ability to extract local features, and the representational capacity of those local features is insufficient to comprehensively capture emotional information.

Multimodal emotion recognition is attracting attention as researchers increasingly integrate information from different sensory modalities to improve performance. Electroencephalogram (EEG) signals are considered objective indicators of emotion and provide precise insights, although their data collection is complex. In contrast, eye movement signals are more susceptible to environmental and individual differences but are convenient to collect.

Emotion recognition is a challenging task, and multimodal fusion methods have become a common approach to it. Fusion vectors provide a more comprehensive representation of changes in the subject's emotional state, leading to more accurate recognition results; however, different fusion inputs and feature fusion methods affect the final fusion outcome in different ways.
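
To illustrate what a fusion vector can look like, the following is a minimal sketch of feature-level fusion that concatenates per-trial EEG and eye-movement feature vectors before classification. The feature dimensions, the synthetic data, and the logistic-regression classifier are assumptions for illustration only and do not represent the specific fusion method used in this work.

```python
# Minimal feature-level (concatenation) fusion sketch, assuming pre-extracted
# per-trial feature vectors for each modality. All names, shapes, and data
# below are hypothetical placeholders, not the authors' pipeline.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_trials = 200
eeg_feats = rng.normal(size=(n_trials, 310))   # e.g., 62 channels x 5 frequency bands
eye_feats = rng.normal(size=(n_trials, 33))    # e.g., pupil/fixation statistics
labels = rng.integers(0, 3, size=n_trials)     # e.g., negative/neutral/positive

# Standardize each modality separately so neither dominates the fused vector,
# then concatenate the two feature vectors into one fusion vector per trial.
fused = np.hstack([
    StandardScaler().fit_transform(eeg_feats),
    StandardScaler().fit_transform(eye_feats),
])

X_train, X_test, y_train, y_test = train_test_split(
    fused, labels, test_size=0.2, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("fused-feature accuracy:", clf.score(X_test, y_test))
```

In this simple concatenation scheme, changing either the fusion inputs (which features enter the fused vector) or the fusion method (e.g., concatenation versus a learned weighted combination) changes the resulting representation, which is the sensitivity the abstract refers to.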
