Authors: Su Hong; Ma Chao [1,2]; Su Peng; Gao Jingwei (Key Laboratory of Modern Measurement and Control Technology of the Ministry of Education, Beijing Information Science and Technology University, Beijing 100192, China; Beijing Key Laboratory of Electromechanical System Measurement and Control, Beijing Information Science and Technology University, Beijing 100192, China)
Affiliations: [1] Key Laboratory of Modern Measurement and Control Technology of the Ministry of Education, Beijing Information Science and Technology University, Beijing 100192, China; [2] Beijing Key Laboratory of Electromechanical System Measurement and Control, Beijing Information Science and Technology University, Beijing 100192, China
Source: Journal of Electronic Measurement and Instrumentation (《电子测量与仪器学报》), 2023, No. 3, pp. 95-101 (7 pages)
Funding: Supported by the National Natural Science Foundation of China (Grant No. 52005045).
Abstract: To address a key difficulty in lower-limb exoskeleton applications, this study investigates gait-phase recognition based on the XGBoost algorithm, using only the motion attitude data collected by a single IMU. First, foot motion data were collected for six different gaits, and each gait was divided into four phases. On this basis, with the foot motion data as the training set, the XGBoost algorithm was applied to gait-phase recognition. During model construction, the hyperparameters involved in the model were further tuned with a Bayesian optimization algorithm (BOA). On the test set, the model achieved an average accuracy of 89.26%, a precision of 89.64%, a recall of 89.26%, and an F1 score of 89.10%, indicating that the model can achieve good gait-phase recognition.