Optimization of GNN with Differentiable Sampling and Attention Mechanism


Authors: ZHANG Ailing; PANG Hui [1,2]; YAN Xinyu (Hebei University of Architecture, Zhangjiakou, Hebei 075000; Big Data Technology Innovation Center of Zhangjiakou, Zhangjiakou, Hebei 075000)

Affiliations: [1] Hebei University of Architecture, Zhangjiakou, Hebei 075000; [2] Big Data Technology Innovation Center of Zhangjiakou, Zhangjiakou, Hebei 075000

Source: Journal of Hebei Institute of Architecture and Civil Engineering, 2024, No. 4, pp. 255-260 (6 pages)

Abstract: With the increasingly wide application of Graph Neural Networks (GNNs) to complex graph data, effectively improving their performance and generalization ability has become a research focus. At present, the main problems faced by GNNs are high computational complexity and low sampling efficiency, which limit their application to large-scale graph data. In addition, existing GNN models often ignore the differences between nodes and cannot make full use of the importance information of graph nodes. This paper therefore proposes an improved method combining differentiable sampling with a node-level attention mechanism to boost the performance of GNN models. First, a differentiable sampling strategy dynamically adjusts the sampling probabilities during training, so that the model can selectively compute on the more informative nodes. Second, a node-level attention mechanism is introduced, allowing the model to adaptively focus on important nodes during feature extraction and information aggregation. Experimental results on the public Cora, Citeseer, and Pubmed datasets show that the proposed method significantly improves the classification accuracy of GNN models and their generalization across different graph structures.
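The abstract describes the two components only at a high level. A minimal sketch of how they might fit together, assuming a Gumbel-softmax relaxation for the differentiable sampling step (a common choice, not stated in the paper) and a GAT-style concatenation score for node-level attention; all function names, shapes, and parameters below are illustrative, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_softmax(logits, tau=0.5):
    """Differentiable relaxation of categorical sampling: the sampling
    probabilities stay inside the computation graph, so they can be
    adjusted by gradient descent during training."""
    g = -np.log(-np.log(rng.uniform(size=logits.shape) + 1e-9) + 1e-9)
    y = (logits + g) / tau
    e = np.exp(y - y.max())
    return e / e.sum()

def node_attention_aggregate(h_center, h_neighbors, a):
    """Node-level attention: score each neighbor against the center node
    with parameter vector `a`, then aggregate neighbor features using
    the softmax-normalized scores as weights."""
    scores = np.array([a @ np.concatenate([h_center, h_j]) for h_j in h_neighbors])
    e = np.exp(scores - scores.max())
    alpha = e / e.sum()                        # attention weights, sum to 1
    return (alpha[:, None] * h_neighbors).sum(axis=0)

# Toy example: one center node with 4 neighbors, 3-dim features.
h_c = rng.normal(size=3)
h_n = rng.normal(size=(4, 3))
a = rng.normal(size=6)                         # attention parameters (2 * feature dim)
sample_weights = gumbel_softmax(rng.normal(size=4))
# Soft-sampled neighbors are re-weighted before attention aggregation.
agg = node_attention_aggregate(h_c, sample_weights[:, None] * h_n, a)
print(agg.shape)  # (3,)
```

In a real model the sampling logits and the attention vector would be learned parameters, and the relaxation temperature `tau` would typically be annealed over training.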

Keywords: graph neural networks; differentiable sampling; node-level attention mechanism; large-scale graph data processing

CLC number: TP389.1 (Automation and Computer Technology: Computer System Architecture)

 
