Authors: YU Huijin; FANG Yongchun; WEI Zhixin
Affiliations: [1] Institute of Robotics and Automatic Information System, College of Artificial Intelligence, Nankai University, Tianjin 300071, China; [2] Tianjin Key Laboratory of Intelligent Robotics, Nankai University, Tianjin 300071, China
Source: Robot (《机器人》), 2021, No. 6, pp. 706-714 (9 pages)
Funding: National Key R&D Program of China (2018YFB1309000).
Abstract: Considering the low accuracy and poor adaptability of existing scene recognition methods, the autonomous developmental neural network is applied to the robot scene recognition task, and two scene recognition methods combining the autonomous developmental network with multi-sensor fusion are proposed: a robot scene recognition method based on weighted Bayesian fusion, and a scene recognition method based on data fusion within the same autonomous developmental network architecture. The multi-sensor information is fused at the decision layer and the data layer, respectively, which improves the accuracy of scene recognition, while the autonomous developmental network improves the adaptability of the recognition methods to various complex scenes. Experimental tests and analysis of the proposed scene recognition methods confirm their effectiveness and practicability. In addition, the method based on data fusion within the same network architecture achieves better scene recognition accuracy, since it makes more efficient use of the collected data.
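The decision-layer method described in the abstract combines per-sensor classification results through weighted Bayesian fusion. Below is a minimal sketch of that idea, assuming each sensor (e.g. camera and lidar) already outputs class posteriors and that per-sensor reliability weights are available; the function name, the log-linear weighting rule, and the example numbers are illustrative assumptions, not details taken from the paper.

import numpy as np

def weighted_bayesian_fusion(posteriors, weights):
    """Fuse per-sensor class posteriors at the decision layer.

    posteriors: (n_sensors, n_classes) array; each row is one sensor's
                estimated P(scene class | observation).
    weights:    (n_sensors,) reliability weights for the sensors.
    The log-linear weighting used here is an illustrative choice,
    not the exact fusion rule from the paper.
    """
    posteriors = np.asarray(posteriors, dtype=float)
    weights = np.asarray(weights, dtype=float)
    # Weight each sensor's log-posterior by its reliability, then renormalize
    # so the fused result is again a probability distribution over classes.
    log_fused = (weights[:, None] * np.log(posteriors + 1e-12)).sum(axis=0)
    fused = np.exp(log_fused - log_fused.max())  # subtract max for numerical stability
    return fused / fused.sum()

# Hypothetical example: camera and lidar posteriors over three candidate scenes.
camera = [0.70, 0.20, 0.10]
lidar = [0.50, 0.40, 0.10]
print(weighted_bayesian_fusion([camera, lidar], weights=[0.6, 0.4]))

In this sketch the fused scene decision leans toward the sensor with the larger reliability weight; the data-layer alternative in the paper instead feeds the combined sensor data into a single developmental network before any per-sensor decision is made.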
Classification: TP24 [Automation and Computer Technology - Detection Technology and Automatic Devices]