BHGAttN: A Feature-Enhanced Hierarchical Graph Attention Network for Sentiment Analysis.

Entropy (Basel)

Division of Software Convergence, Cheongju University, Cheongju 28503, Republic of Korea.

Published: November 2022

Recently, with the rise of deep learning, text classification techniques have developed rapidly. However, existing work usually treats the entire text as a single modeling object, pays little attention to the hierarchical structure within the text, and ignores the internal connections between adjacent sentences. To address these issues, this paper proposes a BERT-based hierarchical graph attention network model (BHGAttN), built on a large-scale pretrained model and a graph attention network, to model the hierarchical relationships within texts. During modeling, the semantic features are enhanced by the outputs of the intermediate layers of BERT, and a multilevel hierarchical graph network corresponding to each BERT layer is constructed using the dependencies between the whole sentence and its subsentences. The model thus attends to both layer-by-layer semantic information and the hierarchical relationships within the text. The experimental results show that the BHGAttN model exhibits significant competitive advantages over current state-of-the-art baseline models.
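The core mechanism the abstract describes is attention-based aggregation over a text graph whose nodes are a whole sentence and its subsentences. As a minimal, hedged sketch (not the actual BHGAttN architecture, which uses learned projections, multi-head attention, and one graph per BERT layer), a single graph-attention aggregation step can be written as follows; the feature vectors here are placeholders for BERT intermediate-layer outputs:

```python
import math

def gat_aggregate(sentence_vec, subsentence_vecs):
    """Aggregate subsentence node features into the sentence node.

    Uses dot-product attention scores followed by a softmax, as a
    simplified stand-in for the learned attention mechanism of a
    graph attention network. `sentence_vec` and each entry of
    `subsentence_vecs` are equal-length lists of floats.
    """
    # Attention score of each subsentence neighbor w.r.t. the sentence node.
    scores = [sum(s * n for s, n in zip(sentence_vec, nb))
              for nb in subsentence_vecs]
    # Numerically stable softmax over the neighbor scores.
    m = max(scores)
    exps = [math.exp(x - m) for x in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Weighted sum of neighbor features = updated sentence representation.
    dim = len(sentence_vec)
    return [sum(w * nb[i] for w, nb in zip(weights, subsentence_vecs))
            for i in range(dim)]

# Toy usage: one sentence node attending over two subsentence nodes.
sent = [1.0, 0.0]
subs = [[1.0, 0.0], [0.0, 1.0]]
agg = gat_aggregate(sent, subs)
```

In the full model, one such graph would be built per BERT layer, so the aggregation runs over every intermediate representation rather than a single feature vector.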

Source

PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9689255
DOI: http://dx.doi.org/10.3390/e24111691

