self-supervised pre-training + few-shot fine-tuning on downstream tasks
Earlier graph generative models such as VGAE are not well suited for pre-training:
framework:

contribution:
Unlike previous graph generation work, GPT-GNN's pre-training uses a harder graph generation task, which guides the model to learn richer semantics and structure of the input graph (a rough sketch of this generative objective is given below).
Our work differs from "Strategies for Pre-training Graph Neural Networks", as our goal is to pre-train GNNs over a single (large-scale) graph and conduct node-level transfer.
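A minimal sketch of the generative pre-training idea from the first point, assuming the task is split into attribute generation plus edge generation over a masked node set. The toy mean-aggregation encoder, layer sizes, loss form, and all names (SimpleGNNLayer, GenerativePretrainSketch, etc.) are illustrative assumptions, not the official GPT-GNN implementation (which builds on a heterogeneous GNN encoder):

```python
# Sketch only: mask some nodes' attributes/edges, encode the remaining graph,
# then (1) regenerate the masked attributes and (2) predict the held-out edges.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleGNNLayer(nn.Module):
    """One mean-aggregation message-passing layer (stand-in for the paper's encoder)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # adj: dense [N, N] float adjacency; mean-aggregate neighbours, then transform.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        return F.relu(self.lin(adj @ x / deg))

class GenerativePretrainSketch(nn.Module):
    def __init__(self, feat_dim, hid_dim=64):
        super().__init__()
        self.gnn = SimpleGNNLayer(feat_dim, hid_dim)
        self.attr_decoder = nn.Linear(hid_dim, feat_dim)     # attribute-generation head
        self.edge_scorer = nn.Bilinear(hid_dim, hid_dim, 1)  # edge-generation head

    def forward(self, x, adj, masked_nodes, pos_edges, neg_edges):
        h = self.gnn(x, adj)

        # (1) attribute generation: reconstruct the masked nodes' input features
        attr_loss = F.mse_loss(self.attr_decoder(h[masked_nodes]), x[masked_nodes])

        # (2) edge generation: score held-out true edges above sampled negatives
        pos = self.edge_scorer(h[pos_edges[0]], h[pos_edges[1]])
        neg = self.edge_scorer(h[neg_edges[0]], h[neg_edges[1]])
        edge_loss = F.binary_cross_entropy_with_logits(
            torch.cat([pos, neg]),
            torch.cat([torch.ones_like(pos), torch.zeros_like(neg)]))

        # joint self-supervised objective used for pre-training
        return attr_loss + edge_loss
```

The point of the joint attribute + edge objective (versus edge-only reconstruction as in VGAE) is that it forces the encoder to capture both node semantics and graph structure, which is what makes the pre-training task "harder" in the sense noted above.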