Feature refining mutual learning method based on knowledge distillation (Cited by: 1)

Authors: WEN Qiang, GUO Tao [1], WANG Tao, LI Gui-yang [1], ZOU Jun-ying (School of Computer Science, Sichuan Normal University, Chengdu 610101, China)

Affiliation: [1] School of Computer Science, Sichuan Normal University, Chengdu 610101, Sichuan, China

Source: Computer Engineering and Design (《计算机工程与设计》), 2023, No. 9, pp. 2700-2706 (7 pages)

Funding: Young Scientists Fund of the National Natural Science Foundation of China (Grant No. 11905153).

Abstract: To address the problem that, in mutual learning, two small student networks lose local image information during training, so that a single student network receives incomplete refined feature maps, a feature refining mutual learning method based on knowledge distillation is proposed. Each student network is equipped with an auxiliary network that provides refined feature maps and soft labels, preserving the local information of features and transferring refined, distilled knowledge between the mutually learning student networks, which improves applicability to visual tasks. Experimental results show that, compared with deep mutual learning and self-knowledge distillation methods, the proposed method achieves clearly higher accuracy on public fine-grained visual categorization datasets and standard benchmark datasets.
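A minimal, illustrative sketch of the kind of training objective the abstract describes, assuming a PyTorch-style setup: each student network is paired with an auxiliary network that supplies a refined feature map and temperature-softened logits (soft labels), and the two students additionally mimic each other's predictions via KL divergence, as in deep mutual learning. All names and weightings here (distill_kl, frml_loss, alpha, beta, T) are assumptions for illustration, not the paper's actual implementation.

import torch
import torch.nn.functional as F

def distill_kl(student_logits, teacher_logits, T=4.0):
    # Temperature-scaled KL divergence: the standard soft-label distillation term.
    log_p = F.log_softmax(student_logits / T, dim=1)
    q = F.softmax(teacher_logits / T, dim=1)
    return F.kl_div(log_p, q, reduction="batchmean") * (T * T)

def frml_loss(logits_a, logits_b, feat_a, aux_logits_a, aux_feat_a,
              targets, alpha=1.0, beta=0.5, T=4.0):
    # Loss for one student (illustrative): hard-label cross-entropy,
    # plus soft labels from the peer student (mutual learning),
    # plus soft labels and a refined feature map from its own auxiliary network.
    ce = F.cross_entropy(logits_a, targets)
    mutual = distill_kl(logits_a, logits_b.detach(), T)        # peer's soft labels
    aux_soft = distill_kl(logits_a, aux_logits_a.detach(), T)  # auxiliary soft labels
    feat = F.mse_loss(feat_a, aux_feat_a.detach())             # refined feature map
    return ce + alpha * (mutual + aux_soft) + beta * feat

if __name__ == "__main__":
    # Toy shapes: batch of 8, 10 classes, 64-dimensional pooled features.
    logits_a, logits_b = torch.randn(8, 10), torch.randn(8, 10)
    aux_logits_a = torch.randn(8, 10)
    feat_a, aux_feat_a = torch.randn(8, 64), torch.randn(8, 64)
    targets = torch.randint(0, 10, (8,))
    print(frml_loss(logits_a, logits_b, feat_a, aux_logits_a, aux_feat_a, targets).item())

The symmetric loss for the second student would be computed with the roles of the two students swapped, and both students, together with their auxiliary networks, are updated jointly at each training step.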

Keywords: knowledge distillation; mutual learning; feature refining; student network; auxiliary network; self-distillation; visual tasks

CLC Number: TP398.1 [Automation and Computer Technology: Computer Application Technology]

 
