Authors: WANG Jinghong [1,2,3]; WANG Hui
Affiliations: [1] College of Computer and Cyber Security, Hebei Normal University, Shijiazhuang 050024, China; [2] Hebei Provincial Key Laboratory of Network and Information Security, Shijiazhuang 050024, China; [3] Hebei Provincial Engineering Research Center for Supply Chain Big Data Analytics and Data Security, Shijiazhuang 050024, China
Source: Computer Engineering and Applications, 2024, No. 16, pp. 133-142 (10 pages)
Funding: Central Government Guided Local Science and Technology Development Fund Project (226Z1808G); Natural Science Foundation of Hebei Province (F2021205014, F2019205303); Science and Technology Research Project of Higher Education Institutions of Hebei Province (ZD2022139); Key Project of Hebei Normal University (L2023J05); Graduate Innovation Funding Project of Hebei Normal University (XCXZZSS202315)
Abstract: Attributed graph clustering is an important problem in graph mining, as more and more complex real-world data are represented as graphs with attributed nodes. Graph neural networks perform well at encoding graph-structured data, but methods based on convolution operations or attention mechanisms suffer from node noise, feature over-smoothing, network heterogeneity, and high computational cost. Deep learning methods such as auto-encoders can effectively extract node attribute representations, but they cannot capture rich structural information. This paper therefore proposes a self-supervised contrastive graph joint representation clustering (SCRC) method. The method first pretrains an auto-encoder to learn the nodes' attribute representations; it then adds a contrastive loss over the graph structural information, using an influence contrastive loss to fuse richer structural information; finally, it combines the graph structural information with the attribute representations and iteratively optimizes them through the neural network's self-supervised training mechanism to complete the clustering task. The method is designed as a simple linear model that integrates structural information effectively while avoiding convolution and attention mechanisms, so it runs faster than graph neural networks that use them. Experiments on widely used benchmark citation networks, including a parameter-sensitivity analysis, verify the effectiveness of the influence contrastive loss and the self-supervised joint clustering. The results show significant performance gains and greater robustness to node noise, feature over-smoothing, and network heterogeneity.
Keywords: attributed graph clustering; self-supervised training; contrastive learning; auto-encoder; joint representation learning
Classification: TP391 [Automation and Computer Technology—Computer Application Technology]
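The abstract's self-supervised iterative optimization step can be illustrated with the soft-assignment scheme commonly used in deep clustering (a DEC-style Student's t kernel with a sharpened target distribution). This is a minimal sketch of that general mechanism, not the paper's exact SCRC loss; the kernel choice and the `target_distribution` sharpening rule are assumptions for illustration.

```python
import numpy as np

def soft_assign(z, centers, alpha=1.0):
    # Soft cluster assignment via a Student's t kernel (DEC-style assumption):
    # q[i, j] measures how strongly embedding z[i] belongs to center j.
    d2 = ((z[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    q = (1.0 + d2 / alpha) ** (-(alpha + 1.0) / 2.0)
    return q / q.sum(axis=1, keepdims=True)

def target_distribution(q):
    # Sharpened auxiliary target: squares q and normalizes by cluster
    # frequency, emphasizing confident assignments. Minimizing KL(p || q)
    # then drives the iterative self-supervised refinement.
    w = q ** 2 / q.sum(axis=0)
    return w / w.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
z = rng.normal(size=(6, 4))        # node embeddings (e.g. from the auto-encoder)
centers = rng.normal(size=(2, 4))  # current cluster centers
q = soft_assign(z, centers)
p = target_distribution(q)
```

In a full pipeline, `p` would be recomputed periodically while the embedding network and centers are updated to pull `q` toward `p`.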