Authors: Shuyi JI, Yuxuan WEI, Qionghai DAI [2,3,4,5], Yue GAO
Affiliations: [1] School of Software, Tsinghua University, Beijing 100084, China; [2] Department of Automation, Tsinghua University, Beijing 100084, China; [3] Beijing Laboratory of Brain and Cognitive Intelligence, Beijing 100084, China; [4] Beijing National Research Center for Information Science and Technology, Beijing 100084, China; [5] Institute for Brain and Cognitive Science, Tsinghua University, Beijing 100084, China
Source: Scientia Sinica (Informationis), 2024, No. 4, pp. 853-871 (19 pages)
Funding: National Natural Science Foundation of China (Grant Nos. 62021002, 62088102); Tsinghua University Initiative Scientific Research Program (Grant No. 20227020007); Beijing Natural Science Foundation (Grant No. 4222025); Open Research Project of Zhejiang Lab (Grant No. 2021KG0AB05).
Abstract: High-order correlations are ubiquitous in the real world, for example in social, biological, and transportation networks, and modeling and optimizing them is important both for studying network properties and for predicting evolution trends. The hypergraph is a flexible data structure that models high-order correlations in a natural manner. With the development of deep learning, hypergraph neural networks (HGNNs) built on hypergraph modeling have been widely applied to representation learning over high-order correlations. However, existing HGNNs all follow the transductive learning paradigm: although they achieve decent performance on small-scale hypergraph datasets, they cannot be applied to large-scale data, which limits their scope of application. This paper first analyzes the challenges existing HGNN methods face on large-scale data, and then proposes the efficient hypergraph neural network (EHGNN) for large-scale data. To address the excessive space and time complexity of existing methods, EHGNN introduces a hypergraph sampling module and a computational acceleration module based on single-stage hypergraph convolution, simultaneously reducing the space and time costs of HGNNs. This makes HGNNs applicable to large-scale hypergraph data and significantly improves their scalability. Experimental results on four real-world hypergraph datasets demonstrate the effectiveness and efficiency of the proposed EHGNN.
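The abstract describes two ingredients: hypergraph convolution (messages pass from vertices to hyperedges and back) and a sampling module that restricts computation to a subhypergraph around a mini-batch of seed vertices. The sketch below illustrates both ideas in plain NumPy under stated assumptions; it is an illustrative simplification, not EHGNN's actual single-stage convolution or sampler, and the function names `hypergraph_conv` and `sample_subhypergraph` are our own.

```python
import numpy as np

rng = np.random.default_rng(0)

def hypergraph_conv(H, X):
    """One round of vertex -> hyperedge -> vertex message passing.

    H: (n_vertices, n_edges) incidence matrix, H[v, e] = 1 if vertex v
       belongs to hyperedge e (every vertex assumed to be in >= 1 edge).
    X: (n_vertices, d) vertex feature matrix.
    Returns degree-normalized, smoothed vertex features, same shape as X.
    """
    De = H.sum(axis=0)                     # hyperedge degrees
    Dv = H.sum(axis=1)                     # vertex degrees
    edge_feat = (H.T @ X) / De[:, None]    # average member vertices per edge
    return (H @ edge_feat) / Dv[:, None]   # average incident edges per vertex

def sample_subhypergraph(H, seed_vertices, max_edges):
    """Toy stand-in for a hypergraph sampling module: keep only hyperedges
    incident to the seed vertices (subsampled to max_edges), then keep the
    vertices those hyperedges cover. Returns (sub_incidence, kept_vertices)."""
    incident = np.flatnonzero(H[seed_vertices].sum(axis=0) > 0)
    if len(incident) > max_edges:
        incident = rng.choice(incident, size=max_edges, replace=False)
    vertices = np.flatnonzero(H[:, incident].sum(axis=1) > 0)
    return H[np.ix_(vertices, incident)], vertices

# Tiny example: 3 vertices, 2 hyperedges {0,1} and {1,2}.
H = np.array([[1., 0.],
              [1., 1.],
              [0., 1.]])
X = np.array([[1.], [2.], [3.]])
smoothed = hypergraph_conv(H, X)                       # -> [[1.5], [2.0], [2.5]]
sub_H, kept = sample_subhypergraph(H, np.array([0]), max_edges=1)
```

Running the convolution only on `sub_H` rather than the full incidence matrix is what caps the per-batch space cost; the time saving in the paper additionally comes from collapsing the two-stage propagation into a single-stage convolution, which this sketch does not reproduce.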