Authors: WANG Jun-feng (王俊峰); Mutallip Mamut (木特力甫·马木提); Alimjan Aysa (阿力木江·艾沙) [1,3]; Nurbiya Yadikar (努尔毕亚·亚地卡尔) [1,3]; Kurban Ubul (库尔班·吾布力)
Affiliations: [1] College of Information Science and Engineering, Xinjiang University, Urumqi 830046, China; [2] The Library of Xinjiang University, Urumqi 830046, China; [3] Key Laboratory of Xinjiang Multilingual Information Technology, Xinjiang University, Urumqi 830046, China
Source: Computer Engineering and Design (《计算机工程与设计》), 2023, No. 2, pp. 473-479 (7 pages)
Funding: National Natural Science Foundation of China (61862061, 61563052, 61363064)
Abstract: To effectively extract and fuse multi-granularity facial-expression features, and to address the problem that uncertainty and mislabeled data in natural-scene facial expression datasets keep recognition accuracy below practical requirements, a facial expression recognition model fusing multi-granularity learning and self-repair is proposed on the basis of a deep convolutional neural network. A jigsaw puzzle generator produces images of different granularities, and a progressive training process learns the complementary feature information across these granularities; a self-repair method keeps the network from overfitting mislabeled sample images and relabels those samples (a minimal sketch of the jigsaw-generation step is given after this record). The model reaches accuracies of 63.94% on the AffectNet dataset and 87.10% on the RAF-DB dataset; the experimental results show that it has high accuracy and good robustness.
Keywords: multi-granularity; progressive training; self-repair; jigsaw puzzle generator; facial expression recognition
Classification: TP391.4 [Automation and Computer Technology - Computer Application Technology]
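
The record gives no implementation details for the jigsaw puzzle generator, so the following is only a minimal sketch, assuming the common approach of cutting each image into an n x n grid of patches and randomly shuffling the patch positions to obtain a coarser-granularity view. The function name jigsaw_generator, the PyTorch framework, and the batch-wide shuffle are illustrative assumptions, not taken from the paper.

import torch

def jigsaw_generator(images, n):
    # Cut each image in the batch into an n x n grid of patches and randomly
    # shuffle the patch positions, producing a coarser-granularity view of the
    # same image. H and W must be divisible by n. (Illustrative assumption,
    # not necessarily the paper's exact generator.)
    b, c, h, w = images.shape
    ph, pw = h // n, w // n
    # (B, C, H, W) -> (B, C, n, n, ph, pw): carve out the patch grid
    patches = images.unfold(2, ph, ph).unfold(3, pw, pw)
    patches = patches.contiguous().view(b, c, n * n, ph, pw)
    # One random permutation of the n*n patch positions, shared by the batch
    patches = patches[:, :, torch.randperm(n * n)]
    # Reassemble the shuffled patches into a full-size image
    patches = patches.view(b, c, n, n, ph, pw).permute(0, 1, 2, 4, 3, 5)
    return patches.contiguous().view(b, c, h, w)

# Example: views at several granularities for one progressive-training pass
x = torch.randn(8, 3, 224, 224)                        # dummy batch of face crops
views = [jigsaw_generator(x, n) for n in (8, 4, 2)] + [x]

In progressive multi-granularity training schemes of this kind, the heavily shuffled views are typically fed to the shallower stages of the network and the intact image to the final stage; whether this paper follows that exact schedule is not stated in the abstract.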