Authors: Chen Nan [1,2], Hu Ying [1,2], Zhang Jun [1,2], Xia Zeyang [1,2]
Affiliations: [1] Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen 518055, China; [2] The Chinese University of Hong Kong, Hong Kong 999077
Source: Journal of Integration Technology, 2013, No. 2, pp. 1-7
Funding: Shenzhen Basic Research Key Project (JC201005270375A)
Abstract: This paper presents the service robot manipulation platform, equipped with a Kinect sensor, developed at the Cognitive Technology Research Center of the Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences. From the Kinect's color images, depth images, and 3D point clouds, image-based 2D features and point-cloud-based 3D features are extracted and fused; the fused features serve as the basis for classifying the geometric model of the object to be recognized and as the criterion for selecting a suitable grasping pose for the gripper. Combined with a Learning from Demonstration (LfD) framework, an approach is developed that improves the robot's cognitive learning ability for indoor object manipulation tasks in everyday household environments, such as locating a door handle and opening the door, recognizing target objects in a cupboard, and grasping an object and delivering it to a designated location. Finally, experiments verify that the method enables the service robot to successfully grasp objects of cylindrical, cuboid, and similar geometric shapes, and, after grasping, to complete the complex task of trajectory planning while interacting with the surrounding environment.
Keywords: Kinect sensor; object manipulation; service robot; cognitive learning
Classification code: TP242 [Automation and Computer Technology - Detection Technology and Automation Devices]
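The abstract describes extracting image-based 2D features and point-cloud-based 3D features from the Kinect data, fusing them to classify an object's coarse geometric model, and using that class to choose a grasp type. The Python sketch below illustrates only the shape of that pipeline; it is not the authors' implementation, and every function name, feature choice, and threshold is an assumption made for illustration.

```python
import numpy as np

def features_2d(mask: np.ndarray) -> np.ndarray:
    """Simple 2D cues from a binary object mask segmented out of the color image."""
    ys, xs = np.nonzero(mask)
    h = np.ptp(ys) + 1
    w = np.ptp(xs) + 1
    aspect = max(h, w) / max(min(h, w), 1)       # elongation of the silhouette
    fill = mask.sum() / float(h * w)             # bounding-box fill ratio
    return np.array([aspect, fill])

def features_3d(points: np.ndarray) -> np.ndarray:
    """Simple 3D cues: normalized extents of the point cloud along its principal axes."""
    centered = points - points.mean(axis=0)
    _, axes = np.linalg.eigh(np.cov(centered.T))            # PCA of the (N, 3) cloud
    extents = np.sort(np.ptp(centered @ axes, axis=0))[::-1]
    return extents / extents[0]                  # [1, mid/long, short/long]

def classify_and_pick_grasp(mask: np.ndarray, points: np.ndarray):
    """Fuse 2D and 3D features, classify the coarse geometric model, map it to a grasp type."""
    fused = np.concatenate([features_2d(mask), features_3d(points)])
    aspect, _fill, _, mid, short = fused         # _fill stays in the fused vector for richer classifiers
    # Toy decision rule over the fused vector (thresholds are illustrative, not from the paper):
    # an elongated silhouette plus a near-circular 3D cross-section suggests a cylinder.
    if aspect > 1.2 and abs(mid - short) < 0.15:
        return "cylinder", "wrap grasp around the curved surface"
    return "cuboid", "parallel-jaw grasp on the two largest opposing faces"

if __name__ == "__main__":
    # Synthetic example: a cylinder-like point cloud (4 cm radius, 20 cm tall) and its image mask.
    rng = np.random.default_rng(0)
    theta = rng.uniform(0, 2 * np.pi, 2000)
    z = rng.uniform(0, 0.20, 2000)
    cloud = np.stack([0.04 * np.cos(theta), 0.04 * np.sin(theta), z], axis=1)
    mask = np.zeros((60, 30), dtype=np.uint8)
    mask[5:55, 8:22] = 1
    print(classify_and_pick_grasp(mask, cloud))  # expected: ('cylinder', ...)
```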