Authors: JIAO Yangyang; HUANG Runcai[1] (School of Electronic and Electrical Engineering, Shanghai University of Engineering Science, Shanghai 201620, China)
Affiliation: [1] School of Electronic and Electrical Engineering, Shanghai University of Engineering Science, Shanghai 201620, China
Source: Intelligent Computer and Applications, 2023, No. 5, pp. 181-186 (6 pages)
Abstract: Aiming at the problems that small-sample expression recognition cannot effectively extract expression features and that a single feature-extraction method yields insufficiently rich information, a facial expression recognition algorithm is proposed that fuses visual attention (ViT) with improved local graph structure features. First, the local graph structure is improved: more neighborhood pixels are sampled when computing features, the weight-allocation mechanism is re-optimized, and local features are extracted from the expression images with the texture descriptor. At the same time, the expression images are fed into the visual attention model, and global features are obtained through transfer learning. Finally, the local texture features and the global features are fused, and Softmax is applied to the fused features to classify expressions. In experiments on the CK+ and Oulu-CASIA datasets, recognition accuracies of 97.4% and 87.6% were obtained, respectively. The results show that the proposed method accurately recognizes the basic facial expressions and achieves higher recognition accuracy than other methods.
Classification Number: TP391 [Automation and Computer Technology - Computer Application Technology]
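
The following is a minimal, illustrative sketch (PyTorch/torchvision assumed) of the pipeline described in the abstract: a plain 8-neighbour binary-pattern histogram stands in for the paper's improved local-graph-structure descriptor (the extra neighborhood sampling and re-weighted code assignment are not reproduced), a pretrained vit_b_16 backbone with its head removed supplies the global feature obtained via transfer learning, and the two features are concatenated before a Softmax classifier. All names (extract_local_texture, FusionClassifier) are illustrative and are not the authors' code.

import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import vit_b_16, ViT_B_16_Weights


def extract_local_texture(gray: torch.Tensor, bins: int = 256) -> torch.Tensor:
    """Crude local-graph/LBP-style descriptor: 8-neighbour comparisons -> histogram.

    gray: (B, 1, H, W) grayscale expression images in [0, 1].
    Returns an L1-normalised (B, bins) histogram feature per image.
    """
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    padded = F.pad(gray, (1, 1, 1, 1), mode="replicate")
    codes = torch.zeros_like(gray)
    for bit, (dy, dx) in enumerate(offsets):
        # Shift the image toward each neighbour and threshold against the centre pixel.
        neigh = padded[:, :, 1 + dy:1 + dy + gray.shape[2], 1 + dx:1 + dx + gray.shape[3]]
        codes = codes + (neigh >= gray).float() * (2 ** bit)
    hists = [torch.histc(c, bins=bins, min=0, max=255) for c in codes]
    hist = torch.stack(hists)
    return hist / hist.sum(dim=1, keepdim=True)


class FusionClassifier(nn.Module):
    """Concatenate ViT global features with local texture histograms, then Softmax."""

    def __init__(self, num_classes: int = 7, local_dim: int = 256):
        super().__init__()
        # Pretrained ViT backbone (transfer learning); stripping the classification
        # head exposes the 768-d [CLS] embedding as the global feature.
        self.vit = vit_b_16(weights=ViT_B_16_Weights.DEFAULT)
        self.vit.heads = nn.Identity()
        self.classifier = nn.Linear(768 + local_dim, num_classes)

    def forward(self, rgb: torch.Tensor, gray: torch.Tensor) -> torch.Tensor:
        global_feat = self.vit(rgb)                       # (B, 768) global feature
        local_feat = extract_local_texture(gray)          # (B, 256) local texture feature
        fused = torch.cat([global_feat, local_feat], 1)   # feature-level fusion
        return F.softmax(self.classifier(fused), dim=1)   # expression probabilities


# Usage: a batch of four 224x224 face crops (CK+/Oulu-CASIA style preprocessing and
# ImageNet normalisation for the ViT branch are assumed, not shown).
model = FusionClassifier(num_classes=7)
rgb = torch.rand(4, 3, 224, 224)
probs = model(rgb, rgb.mean(dim=1, keepdim=True))
print(probs.shape)  # torch.Size([4, 7])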