Authors: 汪剑鸣 [1], 闫志杰 [1], 段晓杰 [1], 窦汝振 [2], 冷宇 [3]
Affiliations: [1] School of Information and Communication Engineering, Tianjin Polytechnic University, Tianjin 300160; [2] China Automotive Technology and Research Center, Tianjin 300162; [3] Suzhou Entry-Exit Inspection and Quarantine Bureau, Suzhou, Jiangsu 215021
Source: Infrared and Laser Engineering (《红外与激光工程》), 2010, No. 6, pp. 1168-1172 (5 pages)
Funding: National Natural Science Foundation of China (60602036); Tianjin Applied Basic and Frontier Technology Research Program (10JCYBJC26300)
Abstract: Ego-motion estimation, which recovers the motion of a camera by analyzing images taken at different positions, is a key technique in vision navigation. Mathematically, ego-motion estimation rests on a well-developed theoretical foundation; in practice, however, image noise can severely degrade the performance of ego-motion algorithms, so the main open problem is how to make the estimation robust to noise. This paper studies the robustness of ego-motion estimation based on matched point pairs. The core idea is to run multiple estimation methods in parallel and select the best result among them. First, SIFT features are used to extract matched point pairs between two images, and a point-pair selection strategy is applied to reduce the mismatch rate. Multiple methods are then used to estimate the fundamental matrix, and the optimal estimate is chosen according to a rule derived from the imaging constraints. Finally, the algorithm is validated on both simulated data and real images, and the experimental results demonstrate its effectiveness in improving robustness against noise.
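The pipeline outlined in the abstract (SIFT matching, mismatch reduction, multi-method fundamental-matrix estimation, and selection of the best candidate) can be sketched roughly as below. This is a minimal illustration assuming OpenCV and NumPy; the paper's specific point-pair selection strategy and imaging-constraint selection rule are not reproduced here, so Lowe's ratio test and a mean Sampson-distance criterion are used as hypothetical stand-ins.

# Hypothetical sketch of the pipeline described in the abstract, using OpenCV.
# The paper's exact point-pair selection strategy and imaging-constraint rule
# are not given here; Lowe's ratio test and a Sampson-distance criterion are
# illustrative stand-ins.
import cv2
import numpy as np

def estimate_fundamental(img1, img2):
    # 1. Detect SIFT features in both views and match descriptors.
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)
    matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)

    # 2. Reduce the mismatch rate (ratio test as an assumed stand-in for
    #    the paper's point-pair selection scheme).
    good = [m for m, n in matches if m.distance < 0.7 * n.distance]
    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

    # 3. Estimate the fundamental matrix with several methods.
    candidates = []
    for method in (cv2.FM_8POINT, cv2.FM_LMEDS, cv2.FM_RANSAC):
        F, _ = cv2.findFundamentalMat(pts1, pts2, method)
        if F is not None:
            candidates.append(F[:3, :3])  # keep a single 3x3 solution

    # 4. Pick the candidate with the smallest mean Sampson distance
    #    (an assumed surrogate for the paper's imaging-constraint rule).
    x1 = np.column_stack([pts1, np.ones(len(pts1))])
    x2 = np.column_stack([pts2, np.ones(len(pts2))])

    def mean_sampson(F):
        Fx1 = x1 @ F.T          # epipolar lines F x1 in image 2
        Ftx2 = x2 @ F           # epipolar lines F^T x2 in image 1
        num = np.sum(x2 * Fx1, axis=1) ** 2
        den = Fx1[:, 0]**2 + Fx1[:, 1]**2 + Ftx2[:, 0]**2 + Ftx2[:, 1]**2
        return np.mean(num / den)

    return min(candidates, key=mean_sampson)

In this sketch the selection step simply favors the fundamental matrix with the lowest average epipolar (Sampson) residual over the retained correspondences, which is one common way to compare candidate estimates; the paper derives its own selection rule from the imaging process.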
Classification: TP391.4 [Automation and Computer Technology — Computer Application Technology]