Overview of GCC: Graph Neural Network Pre-Training

Paper

GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training

Conference: KDD 2020

Keywords: Contrastive Learning, Instance Discrimination, Transferability, Pre-training

Recent Work

However, most representation learning work on graphs has so far focused on learning representations for a single graph or a fixed set of graphs, and very little of this work transfers to out-of-domain data and tasks.

Challenge

How to design the pre-training task such that the universal structural patterns in and across networks can be captured and further transferred?

Main idea

The idea of pre-training is to use the pre-trained model as a good initialization for fine-tuning over (different) tasks on unseen datasets. GCC realizes this with contrastive learning, which involves three critical design choices (the first two are illustrated in the sketch below):
(1) Define instances in graphs: in GCC, an instance is a subgraph, namely the r-ego network of a vertex.
(2) Define similar instance pairs in and across graphs: two subgraphs sampled from the same r-ego network by random walk with restart form a positive pair, while subgraphs of other ego networks act as negatives.
(3) Choose the proper graph encoder: GCC uses a GIN encoder trained with an InfoNCE-style objective.
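A minimal sketch of steps (1) and (2), assuming NetworkX and using the karate club graph as a toy input; the radius, walk length, and restart probability here are illustrative values, not the paper's settings.

```python
import random
import networkx as nx

def ego_network(graph: nx.Graph, vertex, r: int = 2) -> nx.Graph:
    """All vertices within r hops of `vertex`, plus the edges among them."""
    return nx.ego_graph(graph, vertex, radius=r)

def rwr_subgraph(graph: nx.Graph, start, walk_length: int = 64, restart_p: float = 0.5) -> nx.Graph:
    """Sample one view of `graph` via a random walk with restart from `start`."""
    visited = {start}
    current = start
    for _ in range(walk_length):
        if random.random() < restart_p or graph.degree(current) == 0:
            current = start  # restart at the ego vertex
        else:
            current = random.choice(list(graph.neighbors(current)))
        visited.add(current)
    return graph.subgraph(visited)

# Two independently sampled views of the same ego network form a positive pair;
# views drawn from other vertices' ego networks serve as negatives.
G = nx.karate_club_graph()
ego = ego_network(G, 0, r=2)
view_a, view_b = rwr_subgraph(ego, 0), rwr_subgraph(ego, 0)
print(view_a.number_of_nodes(), view_b.number_of_nodes())
```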

Contributions

(1) Formalize the problem of graph neural network pre-training across multiple graphs
(2) Cast pre-training as instance discrimination, so that universal and transferable structural patterns can be captured from multiple input graphs
(3) Propose the GCC framework
(4) Conduct experiments on node classification, graph classification, and similarity search, transferring the pre-trained model to unseen graphs

Requirements

(1) Structural similarity: the pre-trained model should map vertices with similar local network topologies close to each other in the vector space (the contrastive objective sketched below is what encourages this).
(2) Transferability: the pre-trained model should be compatible with vertices and graphs unseen by the pre-training algorithm.
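A minimal sketch of an InfoNCE-style instance-discrimination loss that trains for the structural-similarity requirement, assuming PyTorch; the function name, the temperature value, and the use of in-batch negatives (instead of GCC's MoCo-style key queue) are our simplifications.

```python
import torch
import torch.nn.functional as F

def info_nce_loss(q: torch.Tensor, k: torch.Tensor, temperature: float = 0.07) -> torch.Tensor:
    """q, k: [batch, dim] embeddings of two views of the same batch of subgraph instances.

    For each query q_i, the matching key k_i is the positive; every other key in the
    batch acts as a negative, so views of the same instance are pulled together and
    views of different instances are pushed apart in the embedding space.
    """
    q = F.normalize(q, dim=1)
    k = F.normalize(k, dim=1)
    logits = q @ k.t() / temperature                  # pairwise cosine similarities
    labels = torch.arange(q.size(0), device=q.device)  # positives lie on the diagonal
    return F.cross_entropy(logits, labels)

# Stand-in embeddings; in practice q and k come from a GNN encoder applied to two views.
loss = info_nce_loss(torch.randn(8, 64), torch.randn(8, 64))
```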

Steps

Ideally, given a (diverse) set of input graphs, such as the Facebook social graph and the DBLP co-author graph, we aim to pre-train a GNN on them with a self-supervised task, and then fine-tune it on different graphs for different graph learning tasks, such as node classification on the US-Airport graph.
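A runnable toy sketch of this pre-train-then-fine-tune workflow, with heavy simplifications: an MLP over random stand-in features replaces the GIN encoder over sampled ego networks, and the class count and labels are placeholders rather than the actual US-Airport data.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# A small MLP stands in for the GNN encoder; random tensors stand in for the
# features of sampled ego-network views from the pre-training graphs.
encoder = torch.nn.Sequential(
    torch.nn.Linear(32, 64), torch.nn.ReLU(), torch.nn.Linear(64, 64)
)

# ---- Pre-training: instance discrimination over the pre-training graphs ----
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)
for step in range(200):
    x = torch.randn(16, 32)                              # 16 sampled instances
    q = F.normalize(encoder(x + 0.1 * torch.randn_like(x)), dim=1)
    k = F.normalize(encoder(x + 0.1 * torch.randn_like(x)), dim=1)
    logits = q @ k.t() / 0.07                            # InfoNCE with in-batch negatives
    loss = F.cross_entropy(logits, torch.arange(16))
    opt.zero_grad(); loss.backward(); opt.step()

# ---- Fine-tuning: node classification on an unseen graph (e.g. US-Airport) ----
num_classes = 4                                          # placeholder label count
head = torch.nn.Linear(64, num_classes)
opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()), lr=1e-3)
x_nodes = torch.randn(200, 32)                           # per-node ego-network features
y_nodes = torch.randint(0, num_classes, (200,))          # placeholder labels
for step in range(200):
    loss = F.cross_entropy(head(encoder(x_nodes)), y_nodes)
    opt.zero_grad(); loss.backward(); opt.step()
```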

Compared Algorithms

RolX
Panther++
GraphWave

Some thoughts

Combine node attributes with the purely structural pre-training
BERT-style pre-training on graphs (cf. GRAPH-BERT)
Remove noisy nodes before sampling instances
Aggregate several nodes into a new instance

More Papers About Pre-Training

Strategies for Pre-training Graph Neural Networks. 2020. ICLR
GPT-GNN: Generative Pre-Training of Graph Neural Networks. 2020. KDD
Pre-Training Graph Neural Networks for Generic Structural Feature Extraction. 2020.
Self-supervised Learning: Generative or Contrastive. 2020.
Gaining insight into SARS-CoV-2 infection and COVID-19 severity using self-supervised edge features and Graph Neural Networks. 2020. ICML
When Does Self-Supervision Help Graph Convolutional Networks? 2020. ICML
Multi-Stage Self-Supervised Learning for Graph Convolutional Networks on Graphs with Few Labeled Nodes. 2020. AAAI
GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training. 2020. KDD
Self-Supervised Graph Representation Learning via Global Context Prediction. 2020.
Contrastive Multi-View Representation Learning on Graphs. 2020.
Self-supervised Training of Graph Convolutional Networks. 2020.
Self-supervised Learning on Graphs: Deep Insights and New Directions. 2020.
GRAPH-BERT: Only Attention is Needed for Learning Graph Representations. 2020.
Graph Neural Distance Metric Learning with GRAPH-BERT. 2020.
Segmented GRAPH-BERT for Graph Instance Modeling. 2020.

Reference

https://zhuanlan.zhihu.com/p/150456349


