Authors: LIANG Xingzhu [1,2]; XU Hui; HU Gan
Affiliations: [1] School of Computer Science and Engineering, Anhui University of Science and Technology, Huainan, Anhui 232001; [2] Institute of Environment-friendly Materials and Occupational Health (Wuhu), Anhui University of Science and Technology, Wuhu, Anhui 241003
Source: Journal of Hubei Polytechnic University, 2023, No. 1, pp. 31-35 (5 pages)
Funding: Wuhu Science and Technology Plan Project (No. 2020yf48); Special R&D Fund of the Institute of Environment-friendly Materials and Occupational Health, Anhui University of Science and Technology (No. ALW2021YF04).
Abstract: To construct a powerful online knowledge distillation model in which an online ensemble teacher guides the learning of each sub-network, and to improve model accuracy, this paper proposes an online knowledge distillation method combining attention with feature fusion (KD-ATFF). A feature fusion module at the output of each sub-network fuses the knowledge learned by the last block of each branch, thereby constructing a strong teacher model that guides the training of each branch. The proposed CL module additionally transfers the attention of deep neurons to the shallow layers for mutual learning, increasing the diversity of the blocks and further improving the performance of each individual sub-network. Experiments on the CIFAR-10/100 datasets show that KD-ATFF reduces the error rate by about 30% compared with the baseline method, and by up to 1.76% compared with DML, verifying the effectiveness of the algorithm.
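The core idea described in the abstract (an online ensemble teacher, built from the branches' own outputs, distilling knowledge back into each branch) can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual KD-ATFF implementation: the function names are hypothetical, the "teacher" is formed by simple logit averaging rather than the paper's feature fusion module, and the CL attention-transfer module is omitted.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-softened softmax over a list of logits."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl_div(p, q):
    """KL(p || q) between two discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def distill_loss(branch_logits, T=3.0):
    """Form a simple ensemble 'teacher' by averaging the branches'
    logits, then sum the KL divergence from the teacher's softened
    output to each branch's softened output (scaled by T^2, as is
    conventional in distillation losses)."""
    n = len(branch_logits)
    teacher_logits = [sum(col) / n for col in zip(*branch_logits)]
    teacher = softmax(teacher_logits, T)
    return sum(T * T * kl_div(teacher, softmax(b, T)) for b in branch_logits)

# Two hypothetical branches producing 3-class logits:
branches = [[2.0, 0.5, -1.0], [1.5, 1.0, -0.5]]
loss = distill_loss(branches)
```

In an online-distillation setup this loss would be added to each branch's ordinary cross-entropy loss during training, so the branches and the ensemble teacher improve together rather than requiring a pre-trained teacher.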
Classification Number: TP399 (Automation and Computer Technology: Computer Application Technology)