Research on graph contrastive learning method based on ternary mutual information

Authors: LI Xu [1]; CAI Biao [1]; HU Nengbing (College of Computer Science and Cyber Security, Chengdu University of Technology, Chengdu 610059, China)

Affiliation: [1] College of Computer Science and Cyber Security, Chengdu University of Technology, Chengdu 610059, Sichuan, China

Source: CAAI Transactions on Intelligent Systems (《智能系统学报》), 2024, No. 5, pp. 1257-1267 (11 pages)

Funding: National Natural Science Foundation of China (61802034).

Abstract: Recently, graph contrastive learning has emerged as a successful method for unsupervised graph representation learning. Most existing methods follow the mutual information maximization principle: they obtain two views through data augmentation and maximize the mutual information between them. However, the mutual information between the two views may include information that is not beneficial for downstream tasks. To overcome this shortcoming, we propose a graph contrastive learning framework based on ternary mutual information. The framework first applies stochastic data augmentation to the input graph to generate two views, uses a weight-sharing encoder to obtain two node representation matrices, and then decodes the node representations of both views with a weight-sharing decoder. A contrastive loss function computes the losses between the two views and between each view and the original graph, thereby maximizing the mutual information between the views as well as between each view and the original graph. Experimental results show that the method outperforms baseline methods in node classification accuracy and even surpasses some supervised learning methods, verifying the effectiveness of the framework.
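The abstract describes the objective only at a high level. As an illustration, the following is a minimal PyTorch-style sketch of such a ternary contrastive objective, assuming an InfoNCE-style estimator as the mutual information lower bound. The names `info_nce`, `ternary_contrastive_loss`, the `temperature` parameter, and the usage comments are illustrative assumptions, not the authors' published implementation.

```python
import torch
import torch.nn.functional as F


def info_nce(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.5) -> torch.Tensor:
    """Symmetric InfoNCE loss between two sets of node embeddings.

    Node i in z1 and node i in z2 form a positive pair; all other nodes
    in the opposite set act as negative samples.
    """
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature               # (N, N) cosine similarities
    labels = torch.arange(z1.size(0), device=z1.device)
    return 0.5 * (F.cross_entropy(logits, labels) +
                  F.cross_entropy(logits.t(), labels))


def ternary_contrastive_loss(h_orig, h_view1, h_view2, temperature: float = 0.5) -> torch.Tensor:
    """Sum of the view-view term and the two view-original terms, so that
    mutual information is maximized between the two augmented views and
    between each view and the original graph."""
    return (info_nce(h_view1, h_view2, temperature)
            + info_nce(h_view1, h_orig, temperature)
            + info_nce(h_view2, h_orig, temperature))


# Hypothetical usage with a weight-sharing GNN encoder `encoder` and two
# stochastic augmentations `aug1`, `aug2` of the input graph:
#   h_orig  = encoder(x, edge_index)
#   h_view1 = encoder(*aug1(x, edge_index))
#   h_view2 = encoder(*aug2(x, edge_index))
#   loss = ternary_contrastive_loss(h_orig, h_view1, h_view2)
```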

Keywords: graph contrastive learning; mutual information; graph neural networks; unsupervised learning; contrastive learning; representation learning; node classification; deep learning

CLC number: TP183 (Automation and Computer Technology / Control Theory and Control Engineering)

 
