Authors: HUAN Xiaohui (宦晓辉); XING Kai (邢凯)[1,2]; MA Luheng (马鲁恒)
Affiliations: [1] School of Computer Science and Technology, University of Science and Technology of China, Anhui 230027, China; [2] Suzhou Research Institute, University of Science and Technology of China, Jiangsu 215123, China
Source: 《电子技术(上海)》 (Electronic Technology), 2024, No. 1, pp. 46-52 (7 pages)
Abstract: This paper describes a markerless motion capture technique based on multi-view vision that substantially improves the accuracy of 3D human motion reconstruction and of continuous pose measurement and quantitative analysis, and proposes a motion quality assessment method built on a spatio-temporal pyramid network model. Specifically, the paper introduces, for the first time, joint constraints over the action space and the temporal order of action sequences under multi-view vision, and applies the Levenberg-Marquardt method to multi-coordinate-system fusion, raising the pose recognition accuracy of markerless motion capture under visible-light vision from the centimeter level to the millimeter level. For continuous pose sequences, it further proposes a spatio-temporal pyramid network modeling approach based on a spatial hierarchy and multi-time-scale features, together with a deep learning method for action-set quality assessment applicable to multiple classes of action sets, achieving results that comprehensively outperform existing rehabilitation assessment methods on the KIMORE and UI-PRMD datasets.
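As a rough illustration of the multi-coordinate-system fusion step mentioned in the abstract, the sketch below is not taken from the paper: the function names, the six-parameter rigid-transform parameterization, and the synthetic data are assumptions made for this example. It uses SciPy's Levenberg-Marquardt solver to estimate the rigid transform mapping 3D joint observations from one camera's coordinate system into a common reference frame.

# Minimal sketch (not the authors' implementation): fuse 3D joint observations
# from an extra camera coordinate system into a reference frame by estimating
# a rigid transform with Levenberg-Marquardt (scipy's method="lm").
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation


def residuals(params, pts_cam, pts_ref):
    # params: hypothetical 6-vector, axis-angle rotation (3) + translation (3).
    rot = Rotation.from_rotvec(params[:3])
    t = params[3:]
    # Difference between transformed camera points and reference points.
    return (rot.apply(pts_cam) + t - pts_ref).ravel()


def fuse_coordinate_system(pts_cam, pts_ref):
    # Start from identity rotation and zero translation.
    x0 = np.zeros(6)
    result = least_squares(residuals, x0, args=(pts_cam, pts_ref), method="lm")
    return Rotation.from_rotvec(result.x[:3]), result.x[3:]


if __name__ == "__main__":
    # Synthetic check: generate reference points and their observations in a
    # second camera frame related by a known rotation and translation.
    rng = np.random.default_rng(0)
    pts_ref = rng.normal(size=(20, 3))
    true_rot = Rotation.from_euler("xyz", [5, -3, 10], degrees=True)
    true_t = np.array([0.1, -0.2, 0.05])
    pts_cam = true_rot.inv().apply(pts_ref - true_t)

    rot, t = fuse_coordinate_system(pts_cam, pts_ref)
    print("recovered translation:", np.round(t, 3))

In practice each additional camera would contribute one such transform, and the joint space-time constraints described in the abstract would enter as extra residual terms; the example only shows the basic least-squares fusion mechanism.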
Keywords: computer engineering; motion quality assessment; human motion capture; 3D reconstruction; multi-view vision; rehabilitation assessment
Classification codes: TP183 [Automation and Computer Technology - Control Theory and Control Engineering]; TP311.13 [Automation and Computer Technology - Control Science and Engineering]; TP391.1