Authors: Xu An (徐岸); Wu Yongming (吴永明)[1,2]; Zheng Yang (郑洋)
Affiliations: [1] State Key Laboratory of Public Big Data, Guizhou University, Guiyang 550025; [2] Key Laboratory of Advanced Manufacturing Technology of the Ministry of Education, Guizhou University, Guiyang 550025
Source: Journal of Computer-Aided Design & Computer Graphics (《计算机辅助设计与图形学学报》), 2024, No. 5, pp. 775-785 (11 pages)
Funding: National Natural Science Foundation of China (51505094); Guizhou Provincial Science and Technology Foundation Program (zk[2023] General 079); Guizhou University of Finance and Economics Research Start-up Project for Introduced Talents (2023YJ17).
Abstract: To address the catastrophic forgetting problem of neural network models in incremental learning, a regularized class-incremental learning method based on self-supervision and hidden-layer distillation constraints is proposed, comprising self-supervised pseudo-label prediction, hidden-layer distillation constraints, and parameter regularization. First, a regularization strategy for evaluating the importance of model parameters is derived from Bayesian and information-theoretic principles. Then, self-supervised pseudo-label prediction is used to enhance the representation ability of the model, and the hidden-layer features are preserved, with Gaussian noise added to improve their generalization ability. Finally, a distillation constraint together with the cross-entropy classification loss is used to train on the hidden-layer and output-layer features of historical tasks. Experimental results on the CIFAR-10 and CIFAR-100 datasets show that the proposed method performs well; on CIFAR-100 it reaches an average accuracy of 64.16% and a forgetting rate of 15.95%, effectively reducing the impact of catastrophic forgetting.
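The abstract outlines a composite training objective. Below is a minimal sketch, assuming a PyTorch-style setup, of how such a combined loss (classification, hidden-layer and output-layer distillation with Gaussian noise, self-supervised pseudo-label prediction, and an importance-weighted parameter penalty) could be wired together. The function and argument names, the rotation-style pretext task, and the EWC-style quadratic penalty are illustrative assumptions, not the authors' implementation.

```python
# Sketch of a combined objective for regularized class-incremental learning with
# self-supervision and hidden-layer distillation, loosely following the abstract above.
# All names and the specific penalty forms are illustrative assumptions.
import torch
import torch.nn.functional as F


def incremental_loss(new_logits, labels,
                     new_hidden, old_hidden,
                     new_old_logits, old_logits,
                     ssl_logits, ssl_labels,
                     params, old_params, importance,
                     noise_std=0.1, T=2.0,
                     w_hid=1.0, w_out=1.0, w_ssl=0.5, w_reg=100.0):
    """Cross-entropy on the current task plus four regularizers."""
    # 1) Standard classification loss on the current task.
    loss_ce = F.cross_entropy(new_logits, labels)

    # 2) Hidden-layer distillation: match the previous model's hidden features;
    #    Gaussian noise on the stored features serves as a generalization aid.
    noisy_old_hidden = old_hidden + noise_std * torch.randn_like(old_hidden)
    loss_hid = F.mse_loss(new_hidden, noisy_old_hidden)

    # 3) Output-layer distillation on the logits of previously seen classes.
    loss_out = F.kl_div(F.log_softmax(new_old_logits / T, dim=1),
                        F.softmax(old_logits / T, dim=1),
                        reduction="batchmean") * T * T

    # 4) Self-supervised pseudo-label prediction (e.g., image-rotation labels).
    loss_ssl = F.cross_entropy(ssl_logits, ssl_labels)

    # 5) Parameter regularization: quadratic penalty weighted by a per-parameter
    #    importance estimate (stand-in for the Bayesian/information-theoretic
    #    importance measure described in the abstract).
    loss_reg = sum((imp * (p - p_old).pow(2)).sum()
                   for p, p_old, imp in zip(params, old_params, importance))

    return (loss_ce + w_hid * loss_hid + w_out * loss_out
            + w_ssl * loss_ssl + w_reg * loss_reg)
```

In such a setup, old_hidden and old_logits would typically come from a frozen copy of the model saved after the previous task, and the importance weights would be re-estimated at the end of each task.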
CLC Number: TP391.41 [Automation and Computer Technology - Computer Application Technology]