Authors: YUAN Jian (袁健)[1]; LI Jia-hui (李佳慧) (School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China)
Affiliation: [1] School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China
Source: Journal of Chinese Computer Systems (《小型微型计算机系统》), 2023, No. 5, pp. 1035-1042 (8 pages)
Funding: Supported by the National Natural Science Foundation of China (Grant No. 61775139).
Abstract: To address the low recognition accuracy of existing face recognition systems on low-resolution face images from public-space surveillance footage, a residual spatial attention face super-resolution reconstruction model that fuses facial prior information is proposed. Preprocessing low-resolution face images with this model before recognition greatly improves recognition accuracy. The model embeds facial prior structure information into a generative adversarial network, then applies a residual spatial attention activation algorithm to highlight features at spatial locations carrying high-frequency information, and finally uses a multi-stage feature fusion algorithm to make full use of features at different scales, preventing face features that carry high-frequency information from being lost as they propagate through the network. Experimental results show that the reconstructed super-resolution face images contain more facial detail and greatly improve recognition accuracy on low-resolution face images; compared with five other models, the new model also runs faster and has fewer parameters.
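The abstract names three components: a GAN generator conditioned on facial prior structure, residual spatial attention blocks that reweight positions carrying high-frequency detail, and multi-stage feature fusion that keeps early-stage features from being lost in deeper layers. Below is a minimal PyTorch sketch of the latter two ideas; the class names, layer widths, kernel sizes, and the concatenate-then-compress fusion step are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of residual spatial attention + multi-stage fusion.
# All layer sizes and design details are assumptions for illustration.
import torch
import torch.nn as nn

class ResidualSpatialAttention(nn.Module):
    """Residual block whose output is reweighted by a spatial attention map,
    emphasizing positions that carry high-frequency facial detail."""
    def __init__(self, channels: int = 64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )
        # Spatial attention: a single-channel map in [0, 1] over H x W positions.
        self.attn = nn.Sequential(
            nn.Conv2d(channels, 1, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feat = self.body(x)
        mask = self.attn(feat)   # (N, 1, H, W), broadcast over channels
        return x + feat * mask   # residual path preserves low-frequency content

class MultiStageFusion(nn.Module):
    """Multi-stage feature fusion: outputs of several attention stages are
    concatenated and compressed by a 1x1 conv, so high-frequency features
    from earlier stages are not lost during deeper propagation."""
    def __init__(self, channels: int = 64, stages: int = 3):
        super().__init__()
        self.stages = nn.ModuleList(
            ResidualSpatialAttention(channels) for _ in range(stages)
        )
        self.fuse = nn.Conv2d(channels * stages, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        outs = []
        for stage in self.stages:
            x = stage(x)
            outs.append(x)
        return self.fuse(torch.cat(outs, dim=1))

if __name__ == "__main__":
    # Smoke test on a dummy 64-channel feature map.
    net = MultiStageFusion()
    y = net(torch.randn(1, 64, 32, 32))
    print(y.shape)  # torch.Size([1, 64, 32, 32])
```

In this reading, the sigmoid mask plays the role of the "residual spatial attention activation," and the concatenation across stages stands in for the multi-stage fusion that the abstract credits with retaining high-frequency face features.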
Keywords: facial prior information; residual spatial attention; feature fusion; super-resolution reconstruction
Classification: TP391 [Automation and Computer Technology / Computer Application Technology]