A Self-Distillation Lightweight Image Classification Network Scheme


Authors: NI Shuiping [1], MA Xinliang (School of Computer Science and Technology, Henan Polytechnic University, Jiaozuo 454000, China)

Affiliation: [1] School of Computer Science and Technology, Henan Polytechnic University, Jiaozuo 454000, China

Source: Journal of Beijing University of Posts and Telecommunications, 2023, No. 6, pp. 66-71 (6 pages)

Funding: National Natural Science Foundation of China (Grant No. 61872126).

Abstract: Image classification models are often compressed to reduce their parameter count, which typically causes a drop in classification accuracy. To address this problem, a self-distillation lightweight image classification network scheme is proposed. First, a lightweight attention module with negligible computational and parameter overhead is introduced into the self-distillation framework, reducing the framework's parameter count and computational cost and thereby making the framework itself lightweight. Then, group convolution and depthwise separable convolution are used to compress the residual network and the VGG11 network, and the two compressed networks serve as teacher models. According to the depth of each teacher model, multiple shallow classifiers are constructed as student models, completing the lightweight self-distillation framework. Experimental results show that the proposed scheme preserves the effect of the original self-distillation: the compressed image classification networks achieve a large reduction in parameters without falling below the original classification accuracy, which also eases model deployment.
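To make the two ingredients named in the abstract concrete, below is a minimal sketch, assuming PyTorch. The module name, the temperature T, and the weight alpha are illustrative assumptions, not the authors' implementation. Group convolution, the other compression tool mentioned, is the same nn.Conv2d pattern with a groups argument between 1 and the channel count; depthwise separable convolution is its extreme case (groups equal to the input channels) followed by a 1x1 pointwise convolution.

```python
# Minimal sketch of the abstract's two ingredients (assumed names, not the paper's code):
# (1) depthwise separable convolution for compression, and
# (2) a self-distillation loss training shallow classifiers against the deepest one.
import torch.nn as nn
import torch.nn.functional as F

class DepthwiseSeparableConv(nn.Module):
    """Replaces a standard 3x3 convolution with a depthwise 3x3 convolution
    (groups=in_ch) plus a pointwise 1x1 convolution, cutting parameters from
    9*C_in*C_out to 9*C_in + C_in*C_out."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size=3, stride=stride,
                                   padding=1, groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)

    def forward(self, x):
        return F.relu(self.bn(self.pointwise(self.depthwise(x))))

def self_distillation_loss(shallow_logits, deep_logits, targets,
                           T=3.0, alpha=0.5):
    """Standard self-distillation recipe: each shallow classifier is
    supervised by the labels (cross-entropy) and by the softened output of
    the deepest classifier (KL divergence); the deepest head sees labels only."""
    loss = F.cross_entropy(deep_logits, targets)
    for logits in shallow_logits:
        ce = F.cross_entropy(logits, targets)
        kd = F.kl_div(F.log_softmax(logits / T, dim=1),
                      F.softmax(deep_logits.detach() / T, dim=1),
                      reduction="batchmean") * (T * T)
        loss = loss + alpha * ce + (1 - alpha) * kd
    return loss
```

In the scheme described by the abstract, deep_logits would come from the final classifier of a compressed teacher network (ResNet or VGG11) and shallow_logits from the intermediate exit branches placed at depths chosen per teacher model.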

Keywords: image classification; neural network; model compression; self-distillation

CLC Number: TP183 [Automation and Computer Technology: Control Theory and Control Engineering]

 
