Sparse graphs-based dynamic attention networks.

Heliyon

Department of Automation, Xiamen University, Xiamen, 361005, China.

Published: August 2024

In previous research, the prevailing assumption was that Graph Neural Networks (GNNs) precisely depicted the interconnections among nodes within the graph's architecture. However, real-world graph datasets are often noisy, and this noise can propagate through the network and degrade the performance of downstream tasks. To cope with the complex structure of real-world graphs and the many possible sources of disturbance, we introduce Sparse Graph Dynamic Attention Networks (SDGAT) in this research. SDGAT employs a regularization technique to obtain a sparse representation of the graph structure, which removes noise and yields a more concise sparse graph. On this foundation, the model integrates a dynamic attention mechanism, allowing it to selectively focus on key nodes and edges, filter out irrelevant information, and aggregate features effectively from important neighbors. To evaluate the performance of SDGAT, we conducted experiments on three citation datasets and compared it against commonly used models. The results indicate that SDGAT excels at node classification, notably on the Cora dataset, where it reaches an accuracy of 85.29%, roughly a 3% improvement over the majority of baseline models. These experimental findings show that SDGAT performs well on all three citation datasets, underscoring the efficacy of a dynamic attention network built upon a sparse graph.
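The abstract describes two ingredients: a regularization-based sparsification of the graph structure and a dynamic attention mechanism for aggregating features from important neighbors. The paper's exact formulation is not reproduced here; the following PyTorch sketch only illustrates how these two ideas can be combined, assuming a GATv2-style dynamic attention score and an L1-type penalty on learnable per-edge gates as the sparsity-inducing regularizer (both are illustrative assumptions, not the authors' published equations).

```python
# Hedged sketch, not the published SDGAT implementation.
# (1) learnable per-edge gates penalized toward zero -> sparse graph (assumed L1 penalty)
# (2) dynamic (GATv2-style) attention that depends on both endpoints of an edge
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseDynamicAttentionLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int, num_edges: int):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)  # shared feature projection
        self.a = nn.Linear(out_dim, 1, bias=False)       # attention vector
        # One learnable gate per edge; driving gates toward zero sparsifies the graph.
        self.edge_gate = nn.Parameter(torch.ones(num_edges))

    def forward(self, x, edge_index):
        # x: [N, in_dim]; edge_index: [2, E] holding (source, target) node indices
        src, dst = edge_index
        h = self.W(x)                                     # [N, out_dim]
        # Dynamic attention: nonlinearity applied before the attention vector,
        # so the ranking of neighbors can change with the target node.
        score = self.a(F.leaky_relu(h[src] + h[dst], 0.2)).squeeze(-1)  # [E]
        # Fold the (soft) sparse mask into the scores before normalization.
        score = score + torch.log(torch.sigmoid(self.edge_gate) + 1e-9)
        # Softmax over each target node's incoming edges.
        alpha = torch.exp(score - score.max())
        denom = torch.zeros(x.size(0), device=x.device).index_add_(0, dst, alpha)
        alpha = alpha / (denom[dst] + 1e-9)
        # Attention-weighted aggregation of neighbor features.
        out = torch.zeros_like(h).index_add_(0, dst, alpha.unsqueeze(-1) * h[src])
        return out

    def sparsity_loss(self):
        # L1 penalty on the edge gates encourages a sparse graph (an assumption;
        # the abstract only states that a regularization technique is used).
        return torch.sigmoid(self.edge_gate).sum()
```

In such a setup the sparsity term would be added to the node-classification loss with a weighting coefficient, so that training jointly prunes noisy edges and learns the attention weights used for aggregation.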


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11647946
DOI: http://dx.doi.org/10.1016/j.heliyon.2024.e35938

Publication Analysis

Top Keywords

dynamic attention (16), sparse graph (12), attention networks (8), three citation (8), citation datasets (8), graph (6), sparse (5), sdgat (5), sparse graphs-based (4), dynamic (4)

