GPT-GNN: Generative Pre-Training of Graph Neural Networks
Aug 13, 2020
Ziniu Hu
Graph neural networks (GNNs) have been demonstrated to be powerful in modeling graph-structured data. However, training GNNs usually requires abundant task-specific labeled data, which is often arduously expensive to obtain. One effective way to reduce the labeling effort is to pre-train an expressive GNN model on unlabeled data with self-supervision and then transfer the learned model to downstream tasks with only a few labels. In this paper, we present the GPT-GNN framework to initialize GNNs by generative pre-training. GPT-GNN introduces a self-supervised attributed graph generation task to pre-train a GNN so that it can capture the structural and semantic properties of the graph. We factorize the likelihood of the graph generation into two components: 1) Attribute Generation and 2) Edge Generation. By modeling both components, GPT-GNN captures the inherent dependency between node attributes and graph structure during the generative process. Comprehensive experiments on the billion-scale Open Academic Graph and Amazon recommendation data demonstrate that GPT-GNN significantly outperforms state-of-the-art GNN models without pre-training by up to 9.1% across various downstream tasks.
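To make the two-component factorization concrete, here is a minimal sketch of what an attribute-generation loss and an edge-generation loss could look like. All function names, the MSE/contrastive loss choices, and the random toy inputs below are illustrative assumptions for exposition, not the authors' implementation (the paper's actual method uses a permutation-based autoregressive generation process over heterogeneous graphs).

```python
# Hypothetical sketch of GPT-GNN-style pre-training objectives:
# 1) reconstruct masked node attributes, 2) predict masked edges.
# Names and loss choices are assumptions, not the paper's code.
import torch
import torch.nn.functional as F

def attribute_generation_loss(node_emb, true_attrs, attr_decoder):
    """Reconstruct masked node attributes from GNN embeddings (MSE here)."""
    pred = attr_decoder(node_emb)            # (num_masked, attr_dim)
    return F.mse_loss(pred, true_attrs)

def edge_generation_loss(node_emb, pos_edges, num_neg=5):
    """Contrastive edge prediction: score true edges above random negatives."""
    src, dst = pos_edges                      # each of shape (num_edges,)
    pos_score = (node_emb[src] * node_emb[dst]).sum(-1)              # (E,)
    neg_dst = torch.randint(node_emb.size(0), (src.size(0), num_neg))
    neg_score = (node_emb[src].unsqueeze(1) * node_emb[neg_dst]).sum(-1)
    scores = torch.cat([pos_score.unsqueeze(1), neg_score], dim=1)   # (E, 1+num_neg)
    labels = torch.zeros(scores.size(0), dtype=torch.long)           # index 0 = positive
    return F.cross_entropy(scores, labels)

# Toy usage: random embeddings stand in for a GNN encoder's output.
emb = torch.randn(100, 32, requires_grad=True)
attrs = torch.randn(10, 16)
decoder = torch.nn.Linear(32, 16)
pos_edges = (torch.randint(100, (50,)), torch.randint(100, (50,)))
loss = attribute_generation_loss(emb[:10], attrs, decoder) \
     + edge_generation_loss(emb, pos_edges)
loss.backward()
```

Jointly minimizing both terms is what lets the pre-trained encoder capture the dependency between node attributes and graph structure that the abstract highlights.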
SIGKDD_2020