Authors: CHEN Zheng (陈征); LI Jin-jiang (李晋江) [1]
Affiliation: [1] School of Computer Science and Technology, Shandong Technology and Business University, Yantai 264000, Shandong, China
Source: Computer Engineering and Design (《计算机工程与设计》), 2024, No. 10, pp. 3059-3065 (7 pages)
Funding: National Natural Science Foundation of China (62002200, 62202268, 61972235).
Abstract: Due to the depth ambiguity of RGB images, the depth coordinates of hand joints are more difficult to predict than their 2D image coordinates. A dual-branch hand pose estimation algorithm based on multi-scale hand feature fusion is proposed, in which the 2D image coordinates and the depth coordinates of the hand joints are predicted in separate groups. An FPN is used to extract multi-scale hand features, and a feature fusion module is proposed to fuse and enhance them, yielding high-level and low-level hand features. A dual-branch network structure is then proposed: the fused high-level features are used to predict the depth coordinates of the hand joints, while the fused low-level features are used to predict their 2D image coordinates. Extensive experiments on two public hand pose datasets show that, compared with state-of-the-art methods, the proposed method achieves the best results on the mean joint error metric.
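To make the described architecture concrete, the following is a minimal PyTorch-style sketch of the dual-branch idea: an FPN over a backbone, a fusion step over the pyramid levels, a heatmap branch for 2D joint coordinates, and a regression branch for joint depths. All module names, channel sizes, and design details (e.g. FeatureFusion, DualBranchHandPose, ResNet-18 backbone, heatmap/regression heads) are illustrative assumptions, not the authors' actual implementation.

```python
# Minimal sketch of a dual-branch hand pose estimator with FPN feature fusion.
# Assumptions only; not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet18
from torchvision.ops import FeaturePyramidNetwork


class FeatureFusion(nn.Module):
    """Fuse FPN levels into one map at a target resolution (assumed design)."""
    def __init__(self, channels=256, levels=4):
        super().__init__()
        self.conv = nn.Conv2d(channels * levels, channels, kernel_size=1)

    def forward(self, feats, size):
        # Resize every pyramid level to `size`, concatenate, mix with a 1x1 conv.
        resized = [F.interpolate(f, size=size, mode="bilinear", align_corners=False)
                   for f in feats]
        return self.conv(torch.cat(resized, dim=1))


class DualBranchHandPose(nn.Module):
    def __init__(self, num_joints=21, channels=256):
        super().__init__()
        backbone = resnet18(weights=None)
        # Expose the four ResNet stages so the FPN can build a pyramid from them.
        self.stem = nn.Sequential(backbone.conv1, backbone.bn1,
                                  backbone.relu, backbone.maxpool)
        self.stages = nn.ModuleList([backbone.layer1, backbone.layer2,
                                     backbone.layer3, backbone.layer4])
        self.fpn = FeaturePyramidNetwork([64, 128, 256, 512], channels)
        self.fuse_low = FeatureFusion(channels)    # fused low-level features -> 2D branch
        self.fuse_high = FeatureFusion(channels)   # fused high-level features -> depth branch
        # 2D branch: one heatmap per joint over the image plane.
        self.uv_head = nn.Conv2d(channels, num_joints, kernel_size=1)
        # Depth branch: one depth value per joint from pooled high-level features.
        self.depth_head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                        nn.Linear(channels, num_joints))

    def forward(self, x):
        feats = []
        x = self.stem(x)
        for stage in self.stages:
            x = stage(x)
            feats.append(x)
        pyramid = list(self.fpn({f"p{i}": f for i, f in enumerate(feats)}).values())
        low = self.fuse_low(pyramid, size=pyramid[0].shape[-2:])     # high resolution
        high = self.fuse_high(pyramid, size=pyramid[-1].shape[-2:])  # low resolution
        heatmaps = self.uv_head(low)    # (B, J, H, W): 2D image coordinates
        depth = self.depth_head(high)   # (B, J): per-joint depth coordinates
        return heatmaps, depth


if __name__ == "__main__":
    model = DualBranchHandPose()
    hm, z = model(torch.randn(1, 3, 256, 256))
    print(hm.shape, z.shape)  # torch.Size([1, 21, 64, 64]) torch.Size([1, 21])
```

The split mirrors the grouped prediction in the abstract: spatially detailed low-level features feed the 2D (u, v) heatmaps, while semantically richer high-level features feed the harder depth regression.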
Keywords: hand pose estimation; multi-scale feature fusion; feature extraction; mean joint error; human-computer interaction; grouped prediction; dual-branch network
CLC Number: TP183 [Automation and Computer Technology / Control Theory and Control Engineering]