Authors: ZHOU Chengchen; YU Qiancheng[1,2]; ZHANG Lisi; HU Zhiyong; ZHAO Mingzhi (School of Computing Science and Engineering, North Minzu University, Yinchuan 750021, China; The Key Laboratory of Images & Graphics Intelligent Processing of State Ethnic Affairs Commission, Yinchuan 750021, China)
Affiliations: [1] School of Computing Science and Engineering, North Minzu University, Yinchuan 750021, China; [2] The Key Laboratory of Images & Graphics Intelligent Processing of State Ethnic Affairs Commission, Yinchuan 750021, China
Source: Computer Engineering and Applications (《计算机工程与应用》), 2024, Issue 14, pp. 37-49 (13 pages)
Funding: Ningxia Key R&D Program (Talent Introduction Special Project) (2022YCZX0013); Ningxia Key R&D Program (Key Project) (2023BDE02001); North Minzu University 2022 University-Level Research Platform "Digital Agriculture Empowering Ningxia Rural Revitalization Innovation Team" (2022PT_S10); Yinchuan University-Enterprise Joint Innovation Project (2022XQZD009).
Abstract: With the widespread application of graph-structured data in various practical scenarios, the demand for effective modeling and processing of such data is growing. Graph Transformers (GTs), a class of models that use Transformers to process graph data, can effectively alleviate the over-smoothing and over-squashing problems of traditional graph neural networks (GNNs) and thus learn better feature representations. Based on a survey of recent GTs literature, existing model architectures are divided into two categories: the first category injects graph position and structure information into Transformers through absolute and relative encodings, enhancing the Transformers' ability to understand and process graph-structured data; the second category combines GNNs with Transformers in different ways (serial, alternating, parallel) to fully exploit the advantages of both. The applications of GTs in fields such as information security, drug discovery, and knowledge graphs are then introduced, and models for different purposes are compared and their advantages and disadvantages summarized. Finally, the challenges facing future research on GTs are analyzed in terms of scalability, complex graphs, and better ways of combining GNNs with Transformers (see the illustrative sketch after this record).
Keywords: Graph Transformers (GTs); graph neural network; graph representation learning; heterogeneous graph
Classification: TP391 [Automation and Computer Technology - Computer Application Technology]
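To make the two architectural categories described in the abstract concrete, below is a minimal, hypothetical sketch (not taken from the surveyed paper): it injects absolute positional information via Laplacian eigenvector encodings and then combines a simple message-passing step with a Transformer encoder in the "serial" style. The module names, dimensions, and use of plain PyTorch are illustrative assumptions.

```python
# Illustrative sketch only: Laplacian positional encodings (absolute encoding)
# plus a serial GNN -> Transformer combination. Not the method of any specific
# surveyed model; all names and sizes are assumptions.
import torch
import torch.nn as nn


def laplacian_positional_encoding(adj: torch.Tensor, k: int) -> torch.Tensor:
    """Return the k smallest non-trivial eigenvectors of the normalized Laplacian."""
    deg = adj.sum(dim=1)
    d_inv_sqrt = torch.diag(deg.clamp(min=1e-12).pow(-0.5))
    lap = torch.eye(adj.size(0)) - d_inv_sqrt @ adj @ d_inv_sqrt  # normalized Laplacian
    eigvals, eigvecs = torch.linalg.eigh(lap)                     # ascending eigenvalues
    return eigvecs[:, 1:k + 1]                                    # skip the trivial eigenvector


class SerialGNNTransformer(nn.Module):
    """Serial combination: one mean-aggregation message-passing step, then a Transformer."""

    def __init__(self, in_dim: int, hidden_dim: int, pe_dim: int, num_heads: int = 4):
        super().__init__()
        self.gnn_lin = nn.Linear(in_dim + pe_dim, hidden_dim)
        layer = nn.TransformerEncoderLayer(
            d_model=hidden_dim, nhead=num_heads, batch_first=True
        )
        self.transformer = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, x: torch.Tensor, adj: torch.Tensor, pe: torch.Tensor) -> torch.Tensor:
        # Concatenate absolute positional encodings with raw node features.
        h = torch.cat([x, pe], dim=-1)
        # One step of mean-aggregation message passing (a simple GCN-like stand-in).
        adj_hat = adj + torch.eye(adj.size(0))
        h = adj_hat @ h / adj_hat.sum(dim=1, keepdim=True)
        h = torch.relu(self.gnn_lin(h))
        # Global self-attention over all nodes, treated as one "sequence".
        return self.transformer(h.unsqueeze(0)).squeeze(0)


if __name__ == "__main__":
    n, f, k = 6, 8, 3
    adj = (torch.rand(n, n) > 0.6).float()
    adj = ((adj + adj.t()) > 0).float()
    adj.fill_diagonal_(0)
    x = torch.randn(n, f)
    pe = laplacian_positional_encoding(adj, k)
    model = SerialGNNTransformer(in_dim=f, hidden_dim=32, pe_dim=k)
    print(model(x, adj, pe).shape)  # torch.Size([6, 32])
```

In practice, models in the GTs literature use richer structural signals (e.g., shortest-path or random-walk based relative encodings and attention biases); the sketch only shows where positional information and a message-passing stage enter a serial architecture.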