Authors: WAN Cong; WANG Ying[1] (College of Computer Science and Technology, Jilin University, Changchun 130012, China)
Affiliation: [1] College of Computer Science and Technology, Jilin University, Changchun 130012, China
Source: Journal of Jilin University (Science Edition), 2023, Issue 2, pp. 331-337 (7 pages)
Funding: National Natural Science Foundation of China (Grant No. 61872161); Science and Technology Development Program of Jilin Province (Grant No. 2018101328JC); Development and Reform Project of Jilin Province (Grant No. 2019C053-8).
Abstract: Inspired by the attention mechanism and the transductive learning method, we proposed a node classification algorithm based on weighted meta-learning. First, the Euclidean distance was used to compute the difference in data distribution between meta-learning subtasks. Second, the adjacency matrices of the subgraphs were used to capture the structural difference of data points between subtasks. Finally, these two differences were converted into weights that scale the update of the meta-learner during the meta-training procedure, yielding an optimized meta-learning model and resolving the limitation of classical meta-learning algorithms, in which the losses of all meta-training subtasks update the meta-learner parameters with equal weight. The experimental results of the algorithm on the Citeseer and Cora datasets are superior to those of other classical algorithms, demonstrating its effectiveness on few-shot node classification tasks.
Classification Code: TP39 [Automation and Computer Technology: Computer Application Technology]
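As an illustration of the weighted meta-update described in the abstract, the following PyTorch sketch derives per-subtask weights from the Euclidean distance between subtask feature distributions and from differences between subgraph adjacency matrices, then scales each subtask's query loss before updating a meta-learned classifier. The function names `task_weights` and `weighted_meta_step`, the softmax combination of the two distances, the single linear classifier, and the equal-sized subgraphs are all simplifying assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of a weighted MAML-style meta-update:
# subtask weights come from feature-distribution and adjacency-structure
# differences, and scale each subtask's contribution to the meta-gradient.
import torch
import torch.nn.functional as F


def task_weights(feats, adjs):
    """feats: list of (n, d) node-feature tensors, one per subtask.
    adjs: list of (n, n) subgraph adjacency matrices (equal size assumed)."""
    means = torch.stack([f.mean(dim=0) for f in feats])        # (T, d) subtask centroids
    d_feat = torch.cdist(means, means).mean(dim=1)             # Euclidean distribution gap
    A = torch.stack(adjs)                                      # (T, n, n)
    diff = A.unsqueeze(1) - A.unsqueeze(0)                     # pairwise adjacency differences
    d_struct = diff.flatten(2).norm(dim=2).mean(dim=1)         # structural gap per subtask
    return torch.softmax(d_feat + d_struct, dim=0)             # normalized subtask weights


def weighted_meta_step(W, tasks, weights, inner_lr=0.1, meta_lr=0.01):
    """One MAML-style outer step for a linear node classifier W of shape (d, C):
    each subtask adapts W on its support set, and the query losses are combined
    using the subtask weights instead of an equal-weight average."""
    meta_grad = torch.zeros_like(W)
    for w, (xs, ys, xq, yq) in zip(weights, tasks):
        Wt = W.clone().requires_grad_(True)
        loss_s = F.cross_entropy(xs @ Wt, ys)                  # inner loss on support set
        (g_s,) = torch.autograd.grad(loss_s, Wt, create_graph=True)
        W_fast = Wt - inner_lr * g_s                           # one inner adaptation step
        loss_q = w * F.cross_entropy(xq @ W_fast, yq)          # weighted query loss
        (g_meta,) = torch.autograd.grad(loss_q, Wt)            # meta-gradient through W_fast
        meta_grad += g_meta
    return W - meta_lr * meta_grad                             # weighted meta-update


# Toy usage: 3 subtasks, 5-node subgraphs, 8-dim features, 3 classes.
T, n, d, C = 3, 5, 8, 3
feats = [torch.randn(n, d) for _ in range(T)]
adjs = [torch.randint(0, 2, (n, n)).float() for _ in range(T)]
tasks = [(f, torch.randint(0, C, (n,)),                        # support set
          torch.randn(n, d), torch.randint(0, C, (n,)))        # query set
         for f in feats]
W = torch.randn(d, C)
W = weighted_meta_step(W, tasks, task_weights(feats, adjs))
```

The toy example at the end only exercises the tensor shapes; it does not reproduce the paper's Citeseer/Cora experiments.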