This research introduces Variational Graph Attention Dynamics (VarGATDyn), a model addressing the complexities of dynamic graph representation learning, where existing models tailored for static graphs prove inadequate. VarGATDyn melds attention mechanisms with a Markovian assumption to overcome the challenges of maintaining temporal consistency and the large dataset requirements typical of RNN-based frameworks. It harnesses the strengths of the Variational Graph Auto-Encoder (VGAE) framework, Graph Attention Networks (GAT), and Gaussian Mixture Models (GMM) to navigate the temporal and structural intricacies of dynamic graphs.
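As a rough illustration of two of the building blocks named above, GAT-style attention coefficients and the VGAE reparameterization step, the following is a minimal pure-Python sketch (a toy, not the authors' implementation; function names and the attention vector `a` are illustrative assumptions):

```python
import math
import random

def softmax(xs):
    # numerically stable softmax over a list of scores
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def gat_scores(h_i, h_neighbors, a):
    # GAT attention: e_ij = LeakyReLU(a . [h_i || h_j]),
    # alpha_ij = softmax over neighbors j of e_ij
    def leaky_relu(x, slope=0.2):
        return x if x > 0 else slope * x
    scores = []
    for h_j in h_neighbors:
        concat = h_i + h_j  # list concatenation models [h_i || h_j]
        e = sum(w * x for w, x in zip(a, concat))
        scores.append(leaky_relu(e))
    return softmax(scores)

def reparameterize(mu, logvar, rng=random):
    # VGAE latent sample: z = mu + sigma * eps, eps ~ N(0, I)
    return [m + math.exp(0.5 * lv) * rng.gauss(0.0, 1.0)
            for m, lv in zip(mu, logvar)]
```

The attention weights sum to one over each node's neighborhood, and the reparameterization keeps the latent sample differentiable with respect to `mu` and `logvar`, which is what lets the VGAE objective be trained by gradient descent.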
Although graph representation learning has been studied extensively in static graph settings, dynamic graphs are less investigated in this context. This paper proposes a novel integrated variational framework called DYnamic mixture Variational Graph Recurrent Neural Networks (DyVGRNN), which consists of extra latent random variables in structural and temporal modelling. Our proposed framework comprises an integration of Variational Graph Auto-Encoder (VGAE) and Graph Recurrent Neural Network (GRNN) by exploiting a novel attention mechanism.
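The GRNN temporal component described above can be illustrated by a GRU-style state update applied per node: the update gate decides how much of the previous hidden state survives into the next time step. This is a scalar toy sketch under assumed gate conventions, not the paper's actual parameterization:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_node_update(h_prev, x_t, wz, wr, wh):
    # One GRU step for a single node with scalar state.
    # wz, wr, wh are (input_weight, recurrent_weight) pairs.
    z = sigmoid(wz[0] * x_t + wz[1] * h_prev)       # update gate
    r = sigmoid(wr[0] * x_t + wr[1] * h_prev)       # reset gate
    h_cand = math.tanh(wh[0] * x_t + wh[1] * (r * h_prev))
    # convex combination of old state and candidate state
    return (1.0 - z) * h_prev + z * h_cand
```

In a GRNN, `x_t` would be a graph-convolved aggregation of a node's neighborhood at time t, so structural and temporal information mix at every step.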