Authors: Zou Xuyan; He Hanwu [1,2]; Wu Yueming [1]; Deng Jingwei
Affiliations: [1] School of Electromechanical Engineering, Guangdong University of Technology, Guangzhou 510006, China; [2] Guangdong Polytechnic of Industry and Commerce, Guangzhou 510510, China; [3] Department of Artificial Intelligence and Big Data, Yibin University, Yibin 644007, China
Source: Journal of System Simulation, 2021, No. 10, pp. 2488-2498 (11 pages)
Funding: National Key R&D Program of China (2018YFB1004902); Guangdong Provincial Key R&D Project (2017B010110008)
Abstract: In experimental teaching, some experiments cannot be carried out because of limitations in the experimental environment, equipment, and faculty. To address this problem, a technique for constructing virtual-real fusion simulation experiments with a depth camera is explored. The ArUco marker algorithm is used to achieve one-to-one registration between the virtual scene and the real scene, and a virtual-real fused experimental environment is constructed by combining the color and depth images captured by the depth camera in real time. To allow virtual equipment to be operated, a gesture interaction method based on threshold segmentation and the imaging principle is proposed. Numerical experiments show that the pose registration and gesture interaction methods largely meet the application requirements of virtual experiments, providing a feasible set of techniques for constructing such experiments.
Keywords: virtual experiment; virtual-real registration; real-time fusion; gesture interaction; depth camera
Classification: TP391.9 [Automation and Computer Technology / Computer Application Technology]
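The gesture-interaction idea summarized in the abstract (threshold segmentation of the depth image, then the pinhole imaging principle to recover a 3-D hand position) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the intrinsics `FX, FY, CX, CY`, the depth band `[near, far]`, and the synthetic depth frame are all assumed values chosen for the example.

```python
import numpy as np

# Hypothetical pinhole intrinsics for the depth camera (not from the paper).
FX, FY, CX, CY = 525.0, 525.0, 320.0, 240.0

def segment_hand(depth_mm, near=300.0, far=800.0):
    """Threshold segmentation: keep pixels whose depth (mm) lies in the
    assumed hand-interaction band [near, far]."""
    return (depth_mm >= near) & (depth_mm <= far)

def back_project(u, v, z_mm):
    """Pinhole imaging principle: map a pixel (u, v) with depth z (mm)
    to a 3-D point (x, y, z) in camera coordinates (mm)."""
    x = (u - CX) * z_mm / FX
    y = (v - CY) * z_mm / FY
    return np.array([x, y, z_mm])

# Synthetic 480x640 depth frame: background at 1500 mm, a 60x60 "hand"
# patch at 500 mm standing in for a real capture.
depth = np.full((480, 640), 1500.0)
depth[200:260, 300:360] = 500.0

mask = segment_hand(depth)
vs, us = np.nonzero(mask)
u, v = int(us.mean()), int(vs.mean())   # centroid pixel of the hand region
p = back_project(u, v, depth[v, u])     # 3-D hand position used for interaction
print(mask.sum(), (u, v), p)
```

The segmented centroid gives a pixel inside the hand region, and back-projection turns it into a camera-space point that virtual equipment can react to; a real system would also register this point to the virtual scene via the marker pose.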