Affiliation: [1] School of Materials Science and Engineering, Nanjing University of Posts and Telecommunications, Nanjing 210023, China
Source: Journal of Image and Graphics (《中国图象图形学报》), 2016, No. 8, pp. 1048-1056 (9 pages)
Funding: National Natural Science Foundation of China (61474064); Nanjing University of Posts and Telecommunications Research Foundation (NY212076, NY212050)
Abstract: Objective Feature matching is one of the most important research topics in the field of image processing. However, most available methods fail to achieve satisfying quantitative and qualitative matches simultaneously. In this study, we introduced the epipolar constraint into speeded-up robust features (SURF) feature matching, thereby achieving significant improvement. Method In this method, the SURF algorithm was adopted to detect and describe the feature points of each studied image. Then, the fundamental matrix between the matched images was estimated using random sample consensus (RANSAC) and was used to compute the epipolar lines of all the feature points. Finally, the epipolar constraint was applied to filter out erroneous matches. Consequently, a match set with significantly improved quantity and quality was achieved. Result The experimental results indicate that, compared with the original constraint, our method can not only obtain matches with high accuracy but can also achieve an increase of twofold to eightfold in match quantity. Conclusion The proposed method is simple to implement and highly accurate. Moreover, it greatly increases the number of correct matches and can handle different types of image data.
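The filtering step described in the abstract can be sketched as follows. This is a minimal numpy-only illustration, not the paper's implementation: it assumes the fundamental matrix F has already been estimated (e.g., via RANSAC, as in the Method section), and the function name `epipolar_filter` and the distance threshold are hypothetical. A match (x, x') is kept only if x' lies close to the epipolar line l' = F·x.

```python
import numpy as np

def epipolar_filter(pts1, pts2, F, thresh=1.0):
    """Keep matches whose second point lies near its epipolar line.

    pts1, pts2 : (N, 2) arrays of matched pixel coordinates.
    F          : (3, 3) fundamental matrix mapping points in image 1
                 to epipolar lines in image 2 (l' = F x).
    thresh     : maximum point-to-line distance in pixels (hypothetical value).
    Returns a boolean mask of matches that satisfy the constraint.
    """
    n = len(pts1)
    # Convert to homogeneous coordinates.
    h1 = np.hstack([pts1, np.ones((n, 1))])
    h2 = np.hstack([pts2, np.ones((n, 1))])
    # Epipolar lines in image 2, one (a, b, c) row per match.
    lines = (F @ h1.T).T
    # Perpendicular distance from each second point to its line.
    dist = np.abs(np.sum(lines * h2, axis=1)) / np.hypot(lines[:, 0], lines[:, 1])
    return dist < thresh

# Toy example: pure horizontal translation, so epipolar lines are
# horizontal and corresponding points must share the same y coordinate.
F = np.array([[0.0, 0.0, 0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0, 0.0]])
pts1 = np.array([[10.0, 20.0], [30.0, 40.0], [50.0, 60.0]])
pts2 = np.array([[12.0, 20.0], [33.0, 40.0], [50.0, 90.0]])
mask = epipolar_filter(pts1, pts2, F)  # third match violates the constraint
```

In a full pipeline this mask would be applied to the SURF match list after RANSAC, which is how the paper obtains a larger yet still accurate match set.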
Keywords: image processing; feature matching; epipolar constraint; fundamental matrix; SURF (speeded-up robust features)
Classification: TP391.4 [Automation and Computer Technology — Computer Application Technology]