25 Recent Papers on Deep GNNs: Interpreting the Approaches from Multiple Perspectives
This article is reposted from the WeChat public account 【AI机器学习与知识图谱】.
In computer vision, CNN models learn progressively richer feature representations as their layers deepen; stacking 64 or 128 layers is entirely normal and yields better results than shallower networks.
Graph convolutional networks (GCNs) are a deep learning method for graph-structured data, but most current GCN models are shallow: models such as GCN and GAT achieve their best results at 2 layers, and performance drops sharply as more layers are added. As a GCN deepens it suffers from the over-smoothing problem, i.e., the representations of neighboring nodes become increasingly similar as the network grows deeper, until the learned node embeddings are indistinguishable and model performance degrades.
Why make GNNs deeper, and what problems are deeper GNNs suited to? Semi-supervised node classification with few labels, and semi-supervised node classification with few features. Two key concepts are defined below.
1. Over-fitting: in CNNs, when the network architecture is overly complex or deep and the amount of data is limited, over-fitting occurs. Over-fitting means the model over-learns the training data, memorizing the training examples themselves rather than the underlying patterns, and consequently fails to predict accurately on the test set.
2. Over-smoothing: in GNNs, because nodes in a graph are inherently interconnected and graph neural networks generally learn representations through neighborhood aggregation or random walks, making the network deeper causes over-smoothing. Over-smoothing means that as the GNN deepens, the learned node representations become increasingly similar until they can no longer be distinguished, and model performance drops sharply; GNNs typically perform best at just 2 layers. The central challenge for deep GNNs is therefore to capture deeper structural information while avoiding over-smoothing.
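Over-smoothing can be observed numerically even without any learned weights. The toy sketch below (my own illustration, not from any of the papers listed) repeatedly applies the GCN-style normalized adjacency Â = D̂^{-1/2}(A + I)D̂^{-1/2} to random features on a hypothetical 5-node cycle graph and measures how spread out the node embeddings are; the spread collapses toward zero as "layers" accumulate, which is exactly the indistinguishability described above. Nonlinearities and weight matrices are deliberately omitted to isolate the propagation effect.

```python
# Minimal numerical sketch of over-smoothing: pure feature propagation
# (no weights, no nonlinearity) on a toy 5-node cycle graph.
import numpy as np

# Adjacency matrix of a 5-node cycle: 0-1-2-3-4-0.
A = np.array([
    [0, 1, 0, 0, 1],
    [1, 0, 1, 0, 0],
    [0, 1, 0, 1, 0],
    [0, 0, 1, 0, 1],
    [1, 0, 0, 1, 0],
], dtype=float)

# GCN-style normalized adjacency: A_hat = D^{-1/2} (A + I) D^{-1/2}.
A_tilde = A + np.eye(5)
d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))
A_hat = d_inv_sqrt[:, None] * A_tilde * d_inv_sqrt[None, :]

rng = np.random.default_rng(0)
H = rng.standard_normal((5, 8))  # random initial node features

def node_spread(H):
    """Mean distance of node embeddings from their centroid; 0 => indistinguishable."""
    return np.linalg.norm(H - H.mean(axis=0), axis=1).mean()

spread_by_layer = {}
for layer in range(1, 65):
    H = A_hat @ H  # one propagation step = one "layer" without weights
    if layer in (2, 16, 64):
        spread_by_layer[layer] = node_spread(H)
        print(f"layer {layer:2d}: spread = {spread_by_layer[layer]:.2e}")
```

Because repeated multiplication by Â drives every feature column toward Â's dominant eigenvector, the spread shrinks geometrically with depth: it is already small at 16 layers and essentially zero at 64. Methods like APPNP and GCNII in the list below counteract exactly this collapse by mixing the initial features back in at every propagation step.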
▌4 Must-Read Papers
【GCNII】Simple and Deep Graph Convolutional Networks [ICML 2020]
【GRAND】Graph Random Neural Networks for Semi-Supervised Learning on Graphs [NeurIPS 2020]
【DAGNN】Towards Deeper Graph Neural Networks [KDD 2020]
【APPNP】Predict then Propagate: Graph Neural Networks meet Personalized PageRank [ICLR 2019]
▌3 Papers by Guohao Li
Guohao Li has been consistently pursuing deep GNN research; his work is essential reading.
【Guohao Li】Homepage: https://ghli.org/
【Guohao Li】DeepGCNs: Can GCNs Go as Deep as CNNs? [ICCV 2019]
【Guohao Li】DeeperGCN: All You Need to Train Deeper GCNs [arXiv 2020]
【Guohao Li】Training Graph Neural Networks with 1000 Layers [ICML 2021]
▌Latest Recommendations from 2021
Adaptive Universal Generalized PageRank Graph Neural Network [ICLR 2021]
Graph Neural Networks Inspired by Classical Iterative Algorithms [ICML 2021]
AdaGCN: Adaboosting Graph Convolutional Networks into Deep Models [ICLR 2021]
▌10 Recommended Papers from 2020
【DropEdge】Towards Deep Graph Convolutional Networks on Node Classification [ICLR 2020]
【PairNorm】Tackling Oversmoothing in GNNs [ICLR 2020]
Towards Deeper Graph Neural Networks with Differentiable Group Normalization [NeurIPS 2020]
Scattering GCN: Overcoming Oversmoothness in Graph Convolutional Networks [NeurIPS 2020]
Bayesian Graph Neural Networks with Adaptive Connection Sampling [ICML 2020]
Continuous Graph Neural Networks [ICML 2020]
Graph Neural Networks Exponentially Lose Expressive Power for Node Classification [ICLR 2020]
Measuring and Improving the Use of Graph Information in Graph Neural Networks [ICLR 2020]
Measuring and Relieving the Over-smoothing Problem for Graph Neural Networks from the Topological View [AAAI 2020]
【JK-Net】Representation Learning on Graphs with Jumping Knowledge Networks [ICML 2018]
▌4 Other Papers
Deep Graph Neural Networks with Shallow Subgraph Samplers [arXiv 2020]
Tackling Over-Smoothing for General Graph Convolutional Networks [arXiv 2020]
Effective Training Strategies for Deep Graph Neural Networks [arXiv 2020]
Revisiting Over-smoothing in Deep GCNs [arXiv 2020]
