Graph-Based Representation Knowledge Distillation for Image Classification

Authors: YANG Chuan-guang, CHEN Lu-ming, ZHAO Er-hu, AN Zhu-lin, XU Yong-jun (Institute of Computing Technology, Chinese Academy of Sciences, Beijing 100190, China; Unit 93114 of PLA, Beijing 100080, China)

Affiliations: [1] Institute of Computing Technology, Chinese Academy of Sciences, Beijing 100190, China; [2] Unit 93114 of PLA, Beijing 100080, China

Source: Acta Electronica Sinica (《电子学报》), 2024, Issue 10, pp. 3435-3447 (13 pages)

Funding: National Natural Science Foundation of China (No. 62072434); Beijing Natural Science Foundation (No. 4212027).

Abstract: The core idea of knowledge distillation is to use a large model, acting as the teacher network, to guide a small model, acting as the student network, so as to improve the student network's performance on image classification tasks. Existing knowledge distillation methods typically extract category probabilities or feature information as knowledge from a single input sample and do not model the relationships between samples, which weakens the network's representation learning ability. To address this problem, this paper introduces a graph convolutional neural network that treats the input sample set as graph nodes and constructs a relation graph; each sample in the graph can aggregate information from the other samples, improving its own representation. The paper builds a graph-representation distillation loss from two perspectives, graph nodes and graph relations, and uses meta-learning to guide the student network to adaptively learn better graph representations from the teacher network, thereby improving the student network's graph modeling ability. Compared with the baseline method, the proposed graph-based representation knowledge distillation method improves classification accuracy by 3.70% on the 100-class dataset (CIFAR-100) released by the Canadian Institute For Advanced Research (CIFAR), indicating that the method guides the student network to learn a more discriminative feature space and improves its image classification ability.
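To make the idea in the abstract concrete, the following is a minimal sketch of a graph-based representation distillation loss: batch features from teacher and student are treated as graph nodes, a relation graph is built from pairwise similarity, and the loss combines a node-level term (aggregated features) and a relation-level term (edge weights). This is an illustrative assumption of how such a loss could look, not the authors' released code; all names (build_relation_graph, alpha, beta, the projection layer) are hypothetical, and the meta-learning component that adaptively weights the terms is omitted.

```python
# Sketch of graph-based representation distillation (PyTorch).
# Assumed, illustrative implementation; not the paper's official code.
import torch
import torch.nn.functional as F


def build_relation_graph(features, temperature=1.0):
    """Treat each sample in the batch as a graph node and build a soft
    adjacency matrix from pairwise similarity of the node features."""
    z = F.normalize(features, dim=1)          # (B, D) unit-norm node features
    sim = z @ z.t() / temperature             # (B, B) pairwise similarity
    return F.softmax(sim, dim=1)              # row-normalized edge weights


def aggregate_nodes(features, adj):
    """One graph-convolution-style step: each node aggregates the other
    samples' features through the relation graph."""
    return adj @ features                     # (B, D) aggregated node features


def graph_distillation_loss(feat_s, feat_t, alpha=1.0, beta=1.0):
    """Distillation loss from two views of the graph representation:
    graph nodes (aggregated features) and graph relations (adjacency)."""
    adj_s, adj_t = build_relation_graph(feat_s), build_relation_graph(feat_t)
    node_s, node_t = aggregate_nodes(feat_s, adj_s), aggregate_nodes(feat_t, adj_t)
    node_loss = F.mse_loss(F.normalize(node_s, dim=1),
                           F.normalize(node_t, dim=1))              # node term
    rel_loss = F.kl_div(adj_s.log(), adj_t, reduction="batchmean")  # relation term
    return alpha * node_loss + beta * rel_loss


# Example usage with a hypothetical teacher/student feature pair on one batch.
if __name__ == "__main__":
    feat_t = torch.randn(32, 256)   # teacher penultimate features (frozen)
    feat_s = torch.randn(32, 128)   # student penultimate features
    proj = torch.nn.Linear(128, 256)  # project student features to teacher dim
    loss = graph_distillation_loss(proj(feat_s), feat_t.detach())
    loss.backward()
```

In the paper's formulation the relative weighting of the two terms is learned via meta-learning rather than fixed as alpha and beta above; the sketch only illustrates the structure of the node and relation losses.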

Keywords: knowledge distillation; graph convolutional neural network; image classification; meta-learning; representation learning

CLC Number: TP391.42 [Automation and Computer Technology / Computer Application Technology]

 
