Publications by authors named "Yaomin Chang"

Article Synopsis
  • Graph neural networks (GNNs) excel at graph representation learning but struggle to scale to real-world graph data because of their high computational demands.
  • Existing scalable GNN solutions typically sacrifice either scalability or predictive performance, and so fail to address both challenges at once.
  • The proposed KD-SGNN improves both the scalability and the effectiveness of GNNs by combining knowledge distillation with a decoupled architecture, and is validated on several real-world datasets (a generic sketch of the distillation idea follows this list).
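The synopsis does not spell out KD-SGNN's actual design, so the following is only a minimal, generic teacher-student distillation sketch under assumed choices: a graph-free student MLP is trained to match soft predictions from a stronger GNN teacher (here replaced by stand-in logits) alongside the usual supervised loss. All tensor shapes, hyperparameters, and the student architecture are illustrative, not the paper's method.

```python
# Generic knowledge-distillation sketch (hypothetical; not the actual KD-SGNN design).
import torch
import torch.nn.functional as F

num_nodes, num_feats, num_classes = 1000, 64, 7
X = torch.randn(num_nodes, num_feats)                  # toy node features
y = torch.randint(0, num_classes, (num_nodes,))        # toy ground-truth labels
teacher_logits = torch.randn(num_nodes, num_classes)   # stand-in for a GNN teacher's output

# Graph-free student: cheap to train and to run at inference time.
student = torch.nn.Sequential(
    torch.nn.Linear(num_feats, 128), torch.nn.ReLU(),
    torch.nn.Linear(128, num_classes))
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

T, alpha = 2.0, 0.5                                    # temperature and mixing weight (assumed)
for epoch in range(100):
    opt.zero_grad()
    logits = student(X)
    # Distillation term: KL divergence between temperature-scaled distributions.
    kd = F.kl_div(F.log_softmax(logits / T, dim=1),
                  F.softmax(teacher_logits / T, dim=1),
                  reduction="batchmean") * T * T
    ce = F.cross_entropy(logits, y)                    # supervised term on labeled nodes
    loss = alpha * kd + (1 - alpha) * ce
    loss.backward()
    opt.step()
```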

Graph convolutional networks (GCNs) have achieved great success in many applications and have attracted significant attention in both academia and industry. However, repeatedly applying graph convolutional layers renders the node embeddings indistinguishable. To avoid this oversmoothing, most GCN-based models are restricted to shallow architectures.
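
The oversmoothing effect can be seen numerically: repeatedly applying the symmetric-normalized propagation of a GCN layer (here without learned weights) drives node features toward a common value. The toy 4-node graph and all numbers below are illustrative only.

```python
# Minimal numerical illustration of oversmoothing (toy example, not from the paper).
import numpy as np

# Toy undirected 4-cycle graph.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(4)                        # add self-loops
D_inv_sqrt = np.diag(A_hat.sum(1) ** -0.5)   # D^{-1/2}
P = D_inv_sqrt @ A_hat @ D_inv_sqrt          # normalized propagation matrix

H = np.random.RandomState(0).randn(4, 3)     # random initial node features
for layer in range(1, 33):
    H = P @ H                                # one propagation step, no weights
    if layer in (1, 4, 16, 32):
        # Maximum pairwise distance between node embeddings shrinks toward zero.
        spread = np.max(np.linalg.norm(H[:, None] - H[None, :], axis=-1))
        print(f"layers={layer:2d}  max pairwise distance={spread:.4f}")
```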
