Authors: 郭书基, 史立芳[2], 曹阿秀[2], 吴向东[1], 邓启凌[2]
Affiliations: [1] Department of Mechatronic Measurement and Control, School of Mechanical Engineering, Southwest Jiaotong University, Chengdu 610000, China; [2] Institute of Optics and Electronics, Chinese Academy of Sciences, Chengdu 610209, China
Source: Acta Photonica Sinica (《光子学报》), 2016, No. 5, pp. 87-92 (6 pages)
Funding: National Natural Science Foundation of China (No. 61505214); Chinese Academy of Sciences support project (No. A11K030); the Western Light program and the Youth Innovation Promotion Association of the Chinese Academy of Sciences
Abstract: Aiming at target detection over a large field of view, a localization method based on an artificial compound eye is presented. By analyzing the relationship between the sub-eye field-of-view angle and the total field-of-view angle, combined with the sub-eye arrangement requirements of multi-view positioning, a design method for an artificial compound-eye structure containing multiple sub-eyes was studied. By analyzing the mapping between the sub-eye images and three-dimensional space, the two-dimensional sub-eye images were cropped and mapped onto 3D space, realizing large-field-of-view stitching of the 2D sub-eye images in 3D space. Using the relationships among the sub-eye image coordinates, the spatial 3D coordinates, and the system parameters, a mathematical model for multi-view localization of spatial points was established and a target-localization algorithm was implemented. A prototype containing 19 sub-eyes with a 120° total field of view was fabricated; the system parameters were obtained with Zhang Zheng-you's calibration method, and target-localization experiments were carried out. The experimental results show that when the designed artificial compound-eye large-field-of-view imaging system detects a target at 5.35 m, the localization error is 0.19%.
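The abstract does not spell out the multi-view localization model. As a rough illustration only, the sketch below shows a standard linear (DLT) triangulation of one target point from several calibrated sub-eye views, assuming each sub-eye has a known 3x4 projection matrix (e.g. from Zhang's calibration); it is not necessarily the authors' exact formulation, and the function and variable names are hypothetical.

```python
import numpy as np

def triangulate(proj_mats, image_pts):
    """Linear (DLT) triangulation of one 3D point from N calibrated views.

    proj_mats : list of (3, 4) projection matrices P_i = K_i [R_i | t_i],
                one per sub-eye that observes the target.
    image_pts : list of (u_i, v_i) pixel coordinates of the target in the
                corresponding sub-eye images.
    Returns the 3D point in the common world frame.
    """
    rows = []
    for P, (u, v) in zip(proj_mats, image_pts):
        # Each view contributes two linear constraints on the homogeneous point X:
        #   u * (p3 . X) - (p1 . X) = 0
        #   v * (p3 . X) - (p2 . X) = 0
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.asarray(rows)
    # Least-squares solution: right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # de-homogenize
```

In a system like the 19-sub-eye prototype, every sub-eye that sees the target would contribute two rows to the linear system, and the over-determined stack is solved in a least-squares sense, which is what makes the multi-view arrangement useful for localization.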