Authors: 任好盼, 王文明, 危德健, 高彦彦, 康智慧, 王全玉 (REN Hao-pan; WANG Wen-ming; WEI De-jian; GAO Yan-yan; KANG Zhi-hui; WANG Quan-yu), School of Computer Science and Technology, Beijing Institute of Technology, Beijing 100081, China
Source: Journal of Graphics (《图学学报》), 2021, No. 3, pp. 432-438 (7 pages)
Abstract: Human pose estimation plays a vital role in human-computer interaction and behavior recognition applications, but the changing scale of feature maps poses a challenge to existing methods in predicting correct human poses. To improve the accuracy of pose estimation, the proposed method (RefinedHRNet) combines parallel-network multi-scale fusion with a strategy for generating high-quality feature maps. Building on human detection, RefinedHRNet adopts parallel-network multi-scale fusion and uses a dilated convolution module within each stage to expand the receptive field and preserve context information. Between stages, it employs a deconvolution module and an up-sampling module to generate high-quality feature maps. The highest-resolution feature maps of the parallel subnetworks (1/4 of the input image size) are then used for pose estimation. Finally, Object Keypoint Similarity (OKS) is used to evaluate the accuracy of keypoint recognition. In experiments on the COCO2017 test set, the pose estimation accuracy of the proposed RefinedHRNet is 0.4% higher than that of the HRNet network model.
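The abstract evaluates keypoint recognition with Object Keypoint Similarity (OKS). As a minimal sketch, the standard COCO OKS formula can be written in plain Python; the function and variable names below are illustrative, not taken from the paper's code:

```python
import math

def oks(pred, gt, visible, area, kappas):
    """Object Keypoint Similarity as defined for COCO keypoint evaluation:
    a Gaussian of the per-keypoint prediction error, scaled by the object
    area and a per-keypoint falloff constant, averaged over the keypoints
    that are labeled in the ground truth.

    pred, gt : lists of (x, y) keypoint coordinates
    visible  : list of 0/1 flags (1 = keypoint labeled in ground truth)
    area     : object segment area in pixels^2 (s^2 in the COCO formula)
    kappas   : per-keypoint constants controlling the falloff
    """
    num = 0.0
    count = 0
    for (px, py), (gx, gy), v, k in zip(pred, gt, visible, kappas):
        if v == 0:
            continue  # unlabeled keypoints do not contribute
        d2 = (px - gx) ** 2 + (py - gy) ** 2
        num += math.exp(-d2 / (2.0 * area * k ** 2))
        count += 1
    return num / count if count else 0.0
```

A perfect prediction on all labeled keypoints yields an OKS of 1.0, and the score decays toward 0 as the prediction error grows relative to the object scale.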
Keywords: pose estimation; multi-scale fusion; high-quality feature map; human detection; keypoint similarity
Classification: TP391 [Automation and Computer Technology - Computer Application Technology]