Authors: Bai Lu; Du Chenglie (School of Computer Science and Engineering, Northwestern Polytechnical University, Xi'an 710072, China)
Affiliation: [1] School of Computer Science, Northwestern Polytechnical University
Source: Computer Measurement & Control, 2019, Issue 10, pp. 259-263 (5 pages)
Abstract: In indoor environments affected by occluding objects, the data received by a flying robot often contain uncertainty. To solve the problem of high-precision positioning in complex indoor environments, a multi-target visual positioning method for flying robots based on Dempster-Shafer evidence reasoning is proposed. Based on the control principle of the flying robot, the deviation between the actual flight position and the expected position is analyzed, and a multi-target model is established by extracting color features and edge features. Ground markers are designed, and an iterative algorithm computes a locally maximized probability for each marked ground target, allowing the method to adapt to multi-target deformation; the precise target position is then obtained through Dempster-Shafer evidence reasoning, completing the multi-target visual positioning. On an experimental test site, the traditional method was compared with the Dempster-Shafer evidence reasoning method. The results show that the Dempster-Shafer method achieves a positioning accuracy of up to 96%, which is of practical value for improving indoor positioning accuracy.
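The fusion step described in the abstract rests on Dempster's rule of combination, which merges two independent bodies of evidence (here, color-based and edge-based cues) into one belief assignment while renormalizing away conflicting mass. A minimal sketch of the rule follows; the hypothesis names and toy mass values are illustrative assumptions, not taken from the paper:

```python
from itertools import product

def combine_ds(m1, m2):
    """Dempster's rule of combination for two mass functions.

    Each mass function maps a frozenset of hypotheses to a belief
    mass; the masses of each function are assumed to sum to 1.
    """
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("evidence sources are in total conflict")
    # Redistribute the conflicting mass by normalizing with (1 - K)
    return {h: m / (1.0 - conflict) for h, m in combined.items()}

# Toy example: fuse color- and edge-based evidence about which
# ground-marker cell contains the target (hypothetical labels).
color = {frozenset({"cell1"}): 0.6,
         frozenset({"cell1", "cell2"}): 0.4}
edge = {frozenset({"cell1"}): 0.5,
        frozenset({"cell2"}): 0.3,
        frozenset({"cell1", "cell2"}): 0.2}
fused = combine_ds(color, edge)
best = max(fused, key=fused.get)  # → frozenset({"cell1"})
```

Because conflicting products (here, {cell1} against {cell2}) are discarded and the remainder renormalized, the fused assignment concentrates belief on hypotheses both feature channels support, which is what lets the method pin down a precise target position from partially uncertain cues.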
Keywords: Dempster-Shafer; flying robot; multi-target; visual positioning
Classification: TP242 [Automation and Computer Technology — Detection Technology and Automation Devices]