Authors: Qingyun SUN; Jiayi LUO; Beining YANG; Jianxin LI [1,2] (School of Computer Science and Engineering, Beihang University, Beijing 100191, China; Advanced Innovation Center for Big Data and Brain Computing, Beijing 100191, China)
Affiliations: [1] School of Computer Science and Engineering, Beihang University, Beijing 100191, China; [2] Beijing Advanced Innovation Center for Big Data and Brain Computing, Beijing 100191, China
Source: Scientia Sinica Informationis, 2024, No. 10, pp. 2409-2427 (19 pages)
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 62225202, 62302023)
Abstract: In recent years, graph neural networks (GNNs) have become extremely popular due to their powerful expressive capabilities, and they have achieved notable success in various real-world applications. GNNs learn node representations through message passing along the graph structure. Most GNN models take the raw graph as input and assume that the observed structure perfectly depicts the accurate and complete relations between nodes. In real-world scenarios, however, graphs are often noisy, incomplete, or manipulated by adversaries; this noise and interference propagate along the structure during information aggregation and degrade the quality of the learned graph representations. How to measure and identify noisy information in graph data is therefore a key open problem in the field. In this paper, we take an information-theoretic perspective and propose NIB-HGSL, a hierarchical graph structure learning method guided by the nonlinear information bottleneck principle, which provides a unified, general framework for removing structural noise and learning robust representations in graph-level classification tasks. Through a balanced optimization of relevant-information preservation and noisy-information compression, NIB-HGSL obtains the hierarchical minimum sufficient graph that is most critical for downstream tasks. Comprehensive experiments demonstrate that, compared with baseline methods, NIB-HGSL improves the accuracy and robustness of both graph classification and graph regression.
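The "balanced optimization of relevant-information preservation and noisy-information compression" in the abstract follows the general information bottleneck formulation. As a sketch, the standard IB objective over a graph-level representation can be written as below; this is the generic form, not necessarily the exact nonlinear, hierarchical loss used by NIB-HGSL:

```latex
% Standard information bottleneck objective for a graph-level task.
% G: input graph, Y: downstream label, Z: learned (sub)graph representation,
% I(.;.): mutual information, beta: trade-off multiplier.
% Maximizing I(Z;Y) preserves task-relevant information;
% minimizing I(Z;G) compresses away noisy structural information.
\min_{Z} \; -\, I(Z; Y) \;+\; \beta \, I(Z; G)
```

The minimizer of this objective is the minimum sufficient representation: it retains only the structure needed to predict Y, which is the quantity the abstract refers to as the "hierarchical minimum sufficient graph".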
Keywords: graph representation learning; information bottleneck; graph structure learning; graph neural networks; graph classification
Classification: TP18 [Automation & Computer Technology — Control Theory and Control Engineering]