T-SNet: Lightweight Facial Expression Recognition Based on Knowledge Distillation


Authors: ZHAO Jiahui, FENG Xiaoxiang, CAO Wenhui, PEI Tao, ZHU Jiawei, WANG Rui (College of Communication and Information Engineering, Shanghai University, Shanghai 200444, China)

Affiliation: College of Communication and Information Engineering, Shanghai University, Shanghai 200444, China

Source: Microelectronics & Computer, 2025, No. 4, pp. 38-47 (10 pages)

Funding: National Natural Science Foundation of China (Grant No. 61771299).

Abstract: To address the problem that complex neural networks cannot be deployed on intelligent terminals with limited storage space and computational resources, a lightweight facial expression recognition model based on knowledge distillation, T-SNet (Teacher-Student Net), is proposed. The teacher model is a ResNet18 improved with a fine-grained feature extraction module and pre-trained on the MS-Celeb-1M face dataset. The student model is the lightweight convolutional neural network ShuffleNetV2, whose accuracy is improved by optimizing the distillation loss function. First, the teacher model is distilled on a facial expression dataset, and the rich feature information it extracts is fed back to the student model; then the feature information of both the teacher and the student models is used jointly to train the student model; finally, the trained student model serves as the lightweight facial expression recognition model. T-SNet achieves accuracies of 88.80% and 89.11% on the FER2013Plus and RAF-DB facial expression datasets, respectively, with a parameter size of only 1.20 MB, outperforming other mainstream models in both accuracy and model complexity.
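The abstract does not spell out the optimized distillation loss used in T-SNet; as context, a minimal NumPy sketch of the classic soft-target distillation formulation is shown below, where the temperature `T` and mixing weight `alpha` are illustrative assumptions, not values from the paper. The student is trained on a weighted sum of a KL-divergence term against the teacher's temperature-softened outputs and a standard cross-entropy term against the ground-truth labels.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled, numerically stable softmax over the class axis
    z = logits / T
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Soft-target distillation loss (illustrative sketch, not the paper's exact loss).

    T and alpha are hypothetical hyperparameters: T softens the teacher's
    distribution, alpha weights the soft (teacher) term vs. the hard (label) term.
    """
    # Soft term: KL(teacher || student) on temperature-softened distributions,
    # rescaled by T^2 so gradient magnitudes stay comparable across temperatures
    p_t = softmax(teacher_logits, T)
    log_p_s = np.log(softmax(student_logits, T))
    soft = np.mean(np.sum(p_t * (np.log(p_t) - log_p_s), axis=1)) * T * T
    # Hard term: cross-entropy with the ground-truth class indices
    log_p = np.log(softmax(student_logits))
    hard = -np.mean(log_p[np.arange(len(labels)), labels])
    return alpha * soft + (1.0 - alpha) * hard
```

When the student's logits match the teacher's exactly, the soft term vanishes and only the label cross-entropy remains, which is one quick sanity check on an implementation like this.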

Keywords: computer vision; facial expression recognition; knowledge distillation; lightweight model

CLC number: TP391 [Automation and Computer Technology: Computer Application Technology]

 
