Authors: Zhang Jia-yue (张家悦); Zhang Ling (张灵) [1] (School of Computer Science and Technology, Guangdong University of Technology, Guangzhou 510006, China)
Affiliation: [1] School of Computer Science and Technology, Guangdong University of Technology, Guangzhou 510006, Guangdong, China
Source: Journal of Guangdong University of Technology (《广东工业大学学报》), 2023, No. 4, pp. 24-30, 36 (8 pages)
Funding: Science and Technology Project of the Department of Transport of Guangdong Province (Grant No. 科技-2016-02-030).
Abstract: Because sample features lack class-discriminative power and device resources are often insufficient to support learning the category structure of samples, existing knowledge distillation methods tend to neglect the distillation of category knowledge. To address this problem, this paper proposes an Incremental Class Activation Knowledge Distillation (ICAKD) method. First, class activation gradient maps are used to extract class-discriminative sample features, and a class-activation-map constraint loss is proposed. Then, an incremental memory bank is built to store these class-discriminative features, preserving samples from multiple training batches and updating them iteratively. Finally, the class centroid of each category in the memory bank is computed to construct category structure relationships, and category knowledge distillation is performed according to the class-activation-map constraint and the category structure relationships. Comparative experiments on the Cifar10, Cifar100, Tiny-ImageNet, and ImageNet datasets show that the proposed method improves accuracy by 0.4% to 1.21% over Category Structure Knowledge Distillation (CSKD), demonstrating that class-discriminative features and the incremental approach benefit category knowledge distillation.
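The abstract walks through three mechanisms: a class-activation-map constraint loss, an incremental memory bank that preserves features across training batches, and a category-structure loss built from per-class centroids. The following is a minimal PyTorch-style sketch of how these pieces could fit together; the names (IncrementalMemoryBank, structure, icakd_loss), the deque-based eviction policy, the cosine-similarity relation matrix, and the loss weights alpha and beta are all illustrative assumptions, not the authors' implementation.

import torch
import torch.nn.functional as F
from collections import deque

class IncrementalMemoryBank:
    """Keeps class-discriminative features from recent batches, one deque per
    class, and exposes per-class mean features (the abstract's class centroids)."""

    def __init__(self, num_classes: int, feat_dim: int, max_per_class: int = 128):
        self.feat_dim = feat_dim
        self.banks = [deque(maxlen=max_per_class) for _ in range(num_classes)]

    def update(self, features: torch.Tensor, labels: torch.Tensor) -> None:
        # Iterative update: the newest batch enters, the deque evicts the oldest.
        for f, y in zip(features.detach(), labels):
            self.banks[int(y)].append(f)

    def centroids(self, features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # Per-class mean over stored history plus the current batch, so that
        # gradients can still flow through the current batch's features.
        out = []
        for c, hist in enumerate(self.banks):
            parts = [torch.stack(list(hist))] if hist else []
            cur = features[labels == c]
            if cur.shape[0] > 0:
                parts.append(cur)
            if parts:
                out.append(torch.cat(parts).mean(0))
            else:
                out.append(torch.zeros(self.feat_dim))  # class not seen yet
        return torch.stack(out)

def structure(centroids: torch.Tensor) -> torch.Tensor:
    # Pairwise cosine similarities between class centroids serve as the
    # category structure relationship (one plausible choice of relation).
    c = F.normalize(centroids, dim=1)
    return c @ c.t()

def icakd_loss(s_feat, t_feat, s_cam, t_cam, labels, s_bank, t_bank,
               alpha: float = 1.0, beta: float = 1.0) -> torch.Tensor:
    # 1) Class-activation-map constraint: student CAMs track teacher CAMs.
    cam_loss = F.mse_loss(s_cam, t_cam.detach())
    # 2) Category structure: match the student's centroid relations to the teacher's.
    s_rel = structure(s_bank.centroids(s_feat, labels))
    t_rel = structure(t_bank.centroids(t_feat, labels)).detach()
    struct_loss = F.mse_loss(s_rel, t_rel)
    # 3) Save the batch for later iterations before returning the combined loss.
    s_bank.update(s_feat, labels)
    t_bank.update(t_feat, labels)
    return alpha * cam_loss + beta * struct_loss

# Per-step usage with illustrative shapes: (B, D) pooled features, (B, H, W) CAMs.
s_bank = IncrementalMemoryBank(num_classes=10, feat_dim=64)
t_bank = IncrementalMemoryBank(num_classes=10, feat_dim=64)
s_feat = torch.randn(8, 64, requires_grad=True)
labels = torch.randint(0, 10, (8,))
loss = icakd_loss(s_feat, torch.randn(8, 64), torch.rand(8, 7, 7),
                  torch.rand(8, 7, 7), labels, s_bank, t_bank)
loss.backward()

In this sketch, gradients reach the student only through the current batch's features and activation maps; features entering the memory bank are detached, which keeps memory bounded as the bank is updated over many iterations.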
Classification Code: TP391 [Automation and Computer Technology: Computer Application Technology]