Phys Rev E
Departamento de Física da Universidade de Aveiro & I3N, Campus Universitário de Santiago, 3810-193 Aveiro, Portugal.
Published: April 2017
Message passing equations yield a sharp percolation transition in finite graphs, as an artifact of the locally treelike approximation. For an arbitrary finite, connected, undirected graph we construct an infinite tree having the same local structural properties as this finite graph, when observed by a nonbacktracking walker. Formally excluding the boundary, this infinite tree is a generalization of the Bethe lattice. We indicate an infinite, locally treelike, random network whose local structure is exactly given by this infinite tree. Message passing equations for various cooperative models on this construction are the same as for the original finite graph, but here they provide the exact solutions of the corresponding cooperative problems. These solutions are good approximations to observables for the models on the original graph when it is sufficiently large and not strongly correlated. We show how to express these solutions in the critical region in terms of the principal eigenvector components of the nonbacktracking matrix. As representative examples we formulate the problems of the random and optimal destruction of a connected graph in terms of our construction, the nonbacktracking expansion. We analyze the limitations and the accuracy of the message passing algorithms for different classes of networks and compare the complexity of the message passing calculations to that of direct numerical simulations. Notably, in a range of important cases, simulations turn out to be more efficient computationally than the message passing.
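The message passing equations the abstract refers to can be illustrated on bond percolation. In a minimal sketch (not the paper's code; the function and graph names are illustrative), each directed edge (i, j) carries a message u_{i→j}, the probability that node i is not connected to the giant cluster through that edge, satisfying u_{i→j} = 1 − p + p ∏_{k∈∂j∖i} u_{j→k}:

```python
import math

def mp_percolation(adj, p, iters=500, tol=1e-12):
    """Estimate the giant-cluster fraction for bond percolation with edge
    occupation probability p by iterating the message-passing equations.
    adj maps each node to its list of neighbours (undirected graph).
    u[(i, j)] is the probability that node i is NOT connected to the
    giant cluster through the edge (i, j)."""
    u = {(i, j): 0.5 for i in adj for j in adj[i]}
    for _ in range(iters):
        delta = 0.0
        for i in adj:
            for j in adj[i]:
                # The edge leads nowhere if it is absent (prob. 1 - p), or
                # if it is present but all of j's other edges dead-end too.
                new = 1.0 - p + p * math.prod(
                    u[(j, k)] for k in adj[j] if k != i)
                delta = max(delta, abs(new - u[(i, j)]))
                u[(i, j)] = new
        if delta < tol:
            break
    # Node i joins the giant cluster unless every incident edge dead-ends.
    s = [1.0 - math.prod(u[(i, j)] for j in adj[i]) for i in adj]
    return sum(s) / len(s)
```

On a locally treelike network these equations are exact; on a finite graph with loops they still converge and produce the sharp transition the abstract describes as an artifact. For the complete graph K5 at p = 0.5, for instance, the symmetric fixed point solves u = 0.5 + 0.5u³, giving u = (√5 − 1)/2 and a giant-cluster fraction S = 1 − u⁴ ≈ 0.854.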
DOI: http://dx.doi.org/10.1103/PhysRevE.95.042322
Neural Netw
March 2025
State Key Laboratory for Novel Software Technology, Nanjing University, Nanjing, 210023, China; Department of Computer Science and Technology, Nanjing University, Nanjing, 210023, China.
Graph Neural Networks (GNNs) have gained considerable prominence in semi-supervised learning on graph-structured data, primarily owing to their message-passing mechanism, which largely relies on the availability of clean labels. In real-world scenarios, however, node labels are inevitably noisy and sparse, significantly degrading the performance of GNNs. Building robust GNNs for semi-supervised node classification under noisy, sparse labels therefore remains a critical challenge.
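The message-passing mechanism mentioned above can be sketched in a few lines: each node aggregates its neighbours' features and transforms the result. A minimal pure-Python sketch, with illustrative names not drawn from any cited work:

```python
def gnn_layer(adj, h, W):
    """One round of neighbourhood aggregation: every node averages the
    feature vectors of itself and its neighbours (a GCN-style self-loop),
    then applies a linear map W followed by a ReLU nonlinearity.
    adj: node -> list of neighbours; h: node -> feature list;
    W: weight matrix as a list of rows."""
    out = {}
    for i in adj:
        nbrs = adj[i] + [i]                      # include the node itself
        dim = len(h[i])
        agg = [sum(h[j][d] for j in nbrs) / len(nbrs) for d in range(dim)]
        out[i] = [max(0.0, sum(w * a for w, a in zip(row, agg)))
                  for row in W]                  # linear map + ReLU
    return out
```

Stacking k such layers propagates information k hops, which is one reason label noise on a few nodes can contaminate the representations of entire neighbourhoods during training.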
Mol Inform
March 2025
Faculty of Information Technology, HUTECH University, Ho Chi Minh City, Vietnam.
Over the past decade, graph neural networks (GNNs) have emerged as a powerful neural architecture for modelling graph-structured data and for task-driven representation learning. Recent studies have highlighted the remarkable capabilities of GNNs on complex graph representation learning tasks, achieving state-of-the-art results in node/graph classification, regression, and generation. However, most traditional GNN architectures, such as GCN and GraphSAGE, still struggle to preserve multi-scale topological structures.
The hypergraph neural network (HGNN) is an emerging, powerful tool for modeling and learning complex, high-order correlations among entities on hypergraph structures. While existing HGNN-based approaches excel at modeling high-order correlations among data using hyperedges, they often have difficulty distinguishing diverse semantics (e.g.
IEEE Trans Med Imaging
February 2025
The registration of coronary artery structures from preoperative coronary computed tomography angiography to intraoperative coronary angiography is of great interest to improve guidance in percutaneous coronary interventions. However, non-rigid deformation and discrepancies in both dimensions and topology between the two imaging modalities present a challenge in the 2D/3D coronary artery registration. In this study, we address this problem by formulating it as a centerline feature matching task and propose a GNN-based vessel matching network (GVM-Net) to establish dense correspondence between different image modalities in an end-to-end manner.
IEEE Trans Neural Netw Learn Syst
February 2025
Learning on temporal graphs has attracted tremendous research interest due to its wide range of applications. Some works intuitively merge graph neural networks (GNNs) and recurrent neural networks (RNNs) to capture structural and temporal information, and recent works propose to aggregate information from neighbor nodes in local subgraphs based on message passing or random walks. These methods produce node embeddings from a global or local perspective and ignore the complementarity between them, thus facing limitations in capturing complex and entangled dynamic patterns when applied to diverse datasets or evaluated by more challenging evaluation protocols.
© LitMetric 2025. All rights reserved.