Authors: WANG Jinghong (王静红) [1,2,3]; ZHENG Ruice (郑瑞策); MI Jusheng (米据生); LI Haokang (李昊康) [5] (College of Computer and Cyber Security, Hebei Normal University, Shijiazhuang 050024, China; Hebei Key Laboratory of Network and Information Security, Hebei Normal University, Shijiazhuang 050024, China; Hebei Provincial Engineering Research Center for Supply Chain Big Data Analytics & Data Security, Shijiazhuang 050024, China; School of Mathematical Science, Hebei Normal University, Shijiazhuang 050024, China; College of Artificial Intelligence and Big Data, Hebei University of Engineering and Technology, Shijiazhuang 050091, China)
Affiliations: [1] College of Computer and Cyber Security, Hebei Normal University, Shijiazhuang 050024, China; [2] Hebei Key Laboratory of Network and Information Security, Hebei Normal University, Shijiazhuang 050024, China; [3] Hebei Provincial Engineering Research Center for Supply Chain Big Data Analytics & Data Security, Shijiazhuang 050024, China; [4] School of Mathematical Science, Hebei Normal University, Shijiazhuang 050024, China; [5] College of Artificial Intelligence and Big Data, Hebei University of Engineering and Technology, Shijiazhuang 050091, China
Source: Journal of Shanxi University (Natural Science Edition), 2025, No. 1, pp. 29-42 (14 pages)
Funding: Hebei Provincial Science Foundation project (F20242050280); Science and Technology Research Project of Higher Education Institutions of Hebei Province (ZD2022139).
Abstract: Network representation learning is the foundation of network analysis tasks and is of great significance for mining and analyzing real-world network data. Recently, Graph Attention Networks (GAT) and their variants have shown excellent performance in network representation learning. However, attention-based methods have the following limitations: (1) only the first-order neighbor information of a node is considered, and higher-order neighbors are ignored; (2) the models lack interpretability; (3) the problem of noisy edges in the graph is not addressed. To tackle these issues, this paper proposes a network embedding model based on structure learning and self-supervised graph attention (Structural Learning-based Self-supervised Graph Attention Network, SL-SGAT), which integrates node features with structural information, reduces the interference of noisy edges, and improves model interpretability. SL-SGAT consists of three main parts: graph structure learning, a self-supervised attention mechanism, and feature aggregation. Graph structure learning constructs a global graph-structure network; the self-supervised attention mechanism sets up a self-supervised relation-prediction task and adds a noisy-edge loss; feature aggregation uses the attention coefficients to perform weighted aggregation and obtain the final node embeddings. The proposed model was evaluated on node classification on the Cora, Citeseer, and Pubmed datasets, achieving accuracies of 84.4%, 74.4%, and 81.5%, respectively; compared with the best-performing GAT and its variants, these are improvements of 1.4%, 2.9%, and 3.2%. In node clustering experiments, clustering accuracy improved by 3.3%, 3.4%, and 1.2%, respectively. These results show that the proposed algorithm yields better embedding results.
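To illustrate the attention-weighted aggregation step the abstract describes, the sketch below implements a minimal single-head GAT-style layer with NumPy. This is an assumption-laden illustration of the standard GAT mechanism, not the paper's SL-SGAT implementation: the structure-learning and self-supervised components are omitted, and all shapes and the masking scheme are hypothetical choices for the example.

```python
import numpy as np

def gat_attention_layer(X, A, W, a, leaky_slope=0.2):
    """One single-head graph-attention layer (standard GAT, illustrative only).

    X: (N, F) node features; A: (N, N) adjacency with self-loops;
    W: (F, Fp) projection matrix; a: (2*Fp,) attention vector.
    Returns (N, Fp) aggregated embeddings and the (N, N) attention matrix.
    """
    H = X @ W                                    # linear projection of features
    Fp = H.shape[1]
    # e_ij = LeakyReLU(a^T [h_i || h_j]), computed for all pairs by broadcasting
    e = (H @ a[:Fp])[:, None] + (H @ a[Fp:])[None, :]
    e = np.where(e > 0, e, leaky_slope * e)      # LeakyReLU
    e = np.where(A > 0, e, -1e9)                 # mask non-edges before softmax
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)    # row-wise softmax over neighbors
    return alpha @ H, alpha                      # weighted aggregation of neighbors

# Toy usage on a 4-node path graph with self-loops
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))
A = np.eye(4) + np.array([[0, 1, 0, 0],
                          [1, 0, 1, 0],
                          [0, 1, 0, 1],
                          [0, 0, 1, 0]])
W = rng.normal(size=(3, 2))
a = rng.normal(size=(4,))
Z, alpha = gat_attention_layer(X, A, W, a)
```

Each row of `alpha` sums to 1, so every node's embedding is a convex combination of its (projected) neighbor features; SL-SGAT additionally learns which edges to trust before this aggregation step.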
Keywords: network representation learning; graph attention network; self-supervised learning; graph structure learning; node classification
Classification: TP391 [Automation and Computer Technology - Computer Application Technology]