Authors: DING Junhan; CUI Yuxia[1]; WANG Tianyu; WANG Xianlun[1,2]
Affiliations: [1] College of Mechanical and Electronic Engineering, Qingdao University of Science and Technology, Qingdao 266061, China; [2] Qingdao Anjie Medical Technology Co., Ltd., Qingdao 266100, China
Source: Transducer and Microsystem Technologies, 2024, No. 6, pp. 42-45 (4 pages)
Funding: National Natural Science Foundation of China (51105213)
Abstract: Aiming at the problems of misidentification and collision detection in human-robot collaboration (HRC) systems, methods for extracting and modeling human joint information are studied. A gross-error elimination strategy based on the distance between joints is designed. After analyzing the pattern of hand motion trajectories, a joint position tracking and prediction method based on historical trajectories is proposed: velocity is derived by a difference method to predict the position at the next time step, which corrects and supplements the data when the camera output is abnormal. Based on the oriented bounding box (OBB) concept, a human-robot collision detection model is established, together with a method for computing the actual minimum distance between the manipulator's envelope and an obstacle, which avoids false judgments by the system and redundant collision-avoidance actions. The method is verified by collision detection simulation. The results show that the proposed method is stable and reliable and can effectively avoid collisions in HRC.
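The prediction step described in the abstract (deriving velocity by a difference method and extrapolating the position at the next time step) can be sketched as follows. This is an illustrative minimal version, not the authors' implementation: the function name, the constant-velocity assumption, and the two-sample backward difference are assumptions made here for clarity.

```python
import numpy as np

def predict_next_position(history, dt):
    """Predict the next joint position from the two most recent samples.

    Velocity is estimated with a backward finite difference, then the
    position is extrapolated one time step ahead under a constant-velocity
    assumption. The prediction can stand in for a camera sample that is
    missing or rejected by the inter-joint distance check.
    """
    p_prev = np.asarray(history[-2], dtype=float)
    p_curr = np.asarray(history[-1], dtype=float)
    velocity = (p_curr - p_prev) / dt   # finite-difference velocity estimate
    return p_curr + velocity * dt       # extrapolated next-step position

# Hand joint moving 1 cm per frame along x at 30 Hz (hypothetical data)
history = [(0.00, 0.0, 0.5), (0.01, 0.0, 0.5)]
predicted = predict_next_position(history, dt=1 / 30)
```

In practice a longer history window or a higher-order difference could smooth sensor noise; the two-sample form above is the simplest case of the idea.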
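For the OBB-based collision model, a core primitive is the minimum distance from an obstacle point to an oriented bounding box enclosing a manipulator link. The sketch below shows the standard clamp-in-box-frame computation; it is a generic illustration consistent with the OBB idea, not the paper's specific envelope-to-obstacle formula, and all names and the example geometry are assumptions.

```python
import numpy as np

def point_obb_distance(point, center, axes, half_extents):
    """Minimum distance from a point to an oriented bounding box.

    axes: 3x3 matrix whose rows are the box's orthonormal axis directions.
    The point is expressed in the box frame, clamped to the box half-extents
    to find the closest point on the box, and the distance is the norm of
    the remaining offset (zero if the point lies inside the box).
    """
    offset = np.asarray(point, dtype=float) - np.asarray(center, dtype=float)
    local = axes @ offset                                # box-frame coordinates
    clamped = np.clip(local, -half_extents, half_extents)  # closest point on box
    return float(np.linalg.norm(local - clamped))

# Axis-aligned unit-half-extent box at the origin; obstacle point on the x-axis
d = point_obb_distance([3.0, 0.0, 0.0], [0.0, 0.0, 0.0],
                       np.eye(3), np.array([1.0, 1.0, 1.0]))  # distance 2.0
```

A collision check can then compare this distance against a safety threshold, triggering avoidance only when the actual minimum distance is small, which is what lets the model suppress false alarms and redundant avoidance motions.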