Pre-Train and Learn: Preserving Global Information for Graph Neural Networks  


Authors: Dan-Hao Zhu, Xin-Yu Dai, Jia-Jun Chen

Affiliations: [1] Library, Jiangsu Police Institute, Nanjing 210031, China; [2] Department of Computer Science and Technology, Nanjing University, Nanjing 210093, China

Source: Journal of Computer Science & Technology, 2021, Issue 6, pp. 1420-1430 (11 pages)

Funding: partially supported by the Natural Science Foundation of the Jiangsu Higher Education Institutions of China under Grant No. 18kJB510010; the Social Science Foundation of Jiangsu Province of China under Grant No. 19TQD002; and the National Natural Science Foundation of China under Grant No. 61976114.

Abstract: Graph neural networks (GNNs) have shown great power in learning on graphs. However, it is still a challenge for GNNs to model information far away from the source node. The ability to preserve global information can enhance graph representation and hence improve classification precision. In this paper, we propose a new learning framework named G-GNN (Global information for GNN) to address this challenge. First, the global structure and global attribute features of each node are obtained via unsupervised pre-training; these global features preserve the global information associated with the node. Then, using the pre-trained global features and the raw attributes of the graph, a set of parallel kernel GNNs is used to learn different aspects of these heterogeneous features. Any general GNN can be used as a kernel and easily gains the ability to preserve global information, without having to alter its own algorithm. Extensive experiments show that state-of-the-art models, e.g., GCN, GAT, GraphSAGE, and APPNP, achieve improvement with G-GNN on three standard evaluation datasets. In particular, we establish new benchmark precision records on Cora (84.31%) and Pubmed (80.95%) when learning on attributed graphs.
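The abstract describes a framework in which several parallel kernel GNNs each process one feature view (raw attributes, pre-trained global structure features, pre-trained global attribute features) and their outputs are merged. The following is a minimal illustrative sketch of that parallel-kernel idea, not the authors' implementation: the "kernel" here is a single mean-aggregation layer standing in for any GNN (GCN, GAT, etc.), the feature shapes are assumptions, and concatenation is used as one simple merge choice.

```python
import numpy as np

def kernel_gnn(adj, feats, weight):
    """One stand-in propagation step: mean-aggregate neighbors, then transform."""
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0
    agg = adj @ feats / deg               # mean over neighbors
    return np.maximum(agg @ weight, 0.0)  # linear transform + ReLU

def g_gnn(adj, raw_x, global_struct, global_attr, out_dim, rng):
    """Run one parallel kernel per feature view and merge the outputs."""
    outs = []
    for feats in (raw_x, global_struct, global_attr):
        # Each kernel gets its own (randomly initialized) weights here;
        # in practice these would be learned.
        w = rng.standard_normal((feats.shape[1], out_dim)) * 0.1
        outs.append(kernel_gnn(adj, feats, w))
    # Merge the heterogeneous views; concatenation is one simple choice.
    return np.concatenate(outs, axis=1)

rng = np.random.default_rng(0)
n = 5
adj = (rng.random((n, n)) < 0.4).astype(float)   # toy adjacency matrix
raw_x = rng.standard_normal((n, 8))              # raw node attributes
g_struct = rng.standard_normal((n, 4))           # pre-trained global structure features (assumed shape)
g_attr = rng.standard_normal((n, 4))             # pre-trained global attribute features (assumed shape)

h = g_gnn(adj, raw_x, g_struct, g_attr, out_dim=3, rng=rng)
print(h.shape)  # (5, 9): three parallel kernels, 3 output dims each
```

Because the kernel is a black box taking (adjacency, features, weights), any existing GNN layer could be swapped in without changing the surrounding framework, which is the plug-in property the abstract emphasizes.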

Keywords: graph neural network; network embedding; representation learning; global information; pre-train

Classification code: TP183 [Automation and Computer Technology — Control Theory and Control Engineering]

 
