Authors: Cheng Fengwei, Wang Wenjian, Shi Ying [4], Zhang Zhenzhen [1]
Affiliations: [1] Department of Computer Science and Technology, Taiyuan University, Taiyuan 030032, China; [2] Institute of Intelligent Information Processing, Shanxi University, Taiyuan 030006, China; [3] Department of Network Security, Shanxi Police College, Taiyuan 030401, China; [4] College of Computer Science and Technology, Taiyuan Normal University, Taiyuan 030619, China
Published in: Journal of Nanjing University (Natural Science), 2024, No. 5, pp. 785-792 (8 pages)
Funding: National Natural Science Foundation of China (U21A20513, 62076154); Shanxi Province Key R&D Program (202202020101003); Shanxi Higher Education Science and Technology Innovation Project (2024L382)
Abstract: Graph Neural Networks (GNNs) have achieved notable success in node classification tasks. However, current GNN models tend to focus on majority classes with abundant labeled data and pay little attention to minority classes with few labels. Traditional methods often address this imbalance through oversampling, which can lead to overfitting. Some recent studies synthesize additional minority-class nodes from labeled nodes, yet there is no clear guarantee that the generated nodes truly represent the corresponding minority classes; in fact, incorrect synthetic nodes may undermine the algorithm's generalization ability. To address this issue, this paper introduces GraphA2, a simple self-supervised data augmentation method based on adversarial training, which augments the data by applying perturbations at the farthest points along the gradient in the smooth space around minority classes, while using contrastive learning to ensure consistency after augmentation. This approach not only increases data diversity but also ensures smoothness and coherence across the entire space, thereby enhancing generalization. Experiments show that the proposed method outperforms current state-of-the-art baseline models on a variety of class-imbalanced datasets.
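The abstract does not give the exact formulation of GraphA2's perturbation step, but the general idea of gradient-direction adversarial augmentation can be illustrated with a minimal, hypothetical sketch. The example below is not the paper's method: it applies a generic FGSM-style perturbation (step along the sign of the loss gradient with respect to the input) to a single feature vector under a plain logistic-regression loss, standing in for how a minority-class node's features might be pushed toward a "harder" augmented sample. The function name and all parameters are illustrative assumptions.

```python
import numpy as np

def adversarial_perturb(x, w, y, eps=0.1):
    """FGSM-style augmentation sketch (NOT the GraphA2 algorithm).

    For a linear classifier with weights w and binary label y, the
    gradient of the logistic loss w.r.t. the input x is (p - y) * w,
    where p = sigmoid(w . x). Stepping along the sign of this gradient
    moves x in the direction that most increases the loss, producing a
    harder augmented sample near the original.
    """
    p = 1.0 / (1.0 + np.exp(-np.dot(w, x)))  # predicted probability
    grad = (p - y) * w                       # d(logistic loss)/dx
    return x + eps * np.sign(grad)           # perturbed feature vector

# Toy minority-class sample with 4 features.
rng = np.random.default_rng(0)
w = rng.normal(size=4)
x = rng.normal(size=4)
x_aug = adversarial_perturb(x, w, y=1, eps=0.1)
# Each coordinate shifts by exactly eps, so the L2 displacement
# is eps * sqrt(4) = 0.2 here.
print(np.linalg.norm(x_aug - x))
```

In the paper's setting a consistency term (via contrastive learning) would additionally tie the augmented sample's representation back to the original; that component is omitted here.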