Authors: PENG Yan, GUO Junbin, YU Chuanqiang, LI Jingbo
Affiliations: [1] Missile Engineering Institute, Rocket Force University of Engineering, Xi'an 710025, Shaanxi, China; [2] Unit 96873 of the PLA, Baoji 721000, Shaanxi, China
Published in: Systems Engineering and Electronics, 2022, No. 2, pp. 394-400 (7 pages)
Funding: Supported by the Youth Program of the National Natural Science Foundation of China (61501470).
Abstract: Traditional semi-global matching usually sets a disparity range artificially for scenes whose disparity range is unknown, which wastes computing resources, and its use of the traditional Census transform for cost computation limits disparity accuracy. To address these shortcomings, a semi-global matching algorithm based on disparity range estimation and an improved matching cost is proposed. First, several feature operators are used simultaneously to extract feature points from the image pair; the feature points are matched by fast nearest-neighbor search, the matches are filtered using the constraints of stereo matching, and the disparities of the matched point pairs are computed to estimate the disparity range. Then, on this basis, Census transforms are applied separately to the intensity, gradient, and edge information of the images to construct a new cost computation function. Experimental results show that, compared with the traditional algorithm, the improved algorithm reduces the average mismatching rate by 6.37% and shortens the computation time by more than 95%.
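The two steps summarized in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the feature matching and the improved multi-channel cost are omitted, and the function names, the 5x5 window, the epipolar row tolerance, and the safety margin are illustrative assumptions. The sketch only shows (1) how a disparity range can be estimated from already-matched sparse feature points instead of being set by hand, and (2) the classic Census transform with a Hamming-distance matching cost that the paper improves upon.

```python
import numpy as np

def estimate_disparity_range(pts_left, pts_right, margin=2):
    """Step 1 (sketch): estimate [d_min, d_max] from sparse feature matches.
    pts_* are (N, 2) arrays of (x, y) pixel coordinates of matched keypoints.
    For a rectified pair, valid matches lie on (nearly) the same scanline and
    the disparity d = x_left - x_right is non-negative.  'margin' is an
    assumed safety padding around the observed range."""
    d = pts_left[:, 0] - pts_right[:, 0]
    same_row = np.abs(pts_left[:, 1] - pts_right[:, 1]) <= 1  # epipolar filter
    valid = d[same_row & (d >= 0)]
    return max(0, int(valid.min()) - margin), int(valid.max()) + margin

def census_transform(img, window=5):
    """Step 2 (sketch): classic Census transform -- each pixel becomes a bit
    string recording which neighbors in the window are darker than the
    center (24 bits for a 5x5 window, so a uint32 code suffices)."""
    h, w = img.shape
    r = window // 2
    padded = np.pad(img, r, mode='edge')
    code = np.zeros((h, w), dtype=np.uint32)
    for dy in range(window):
        for dx in range(window):
            if dy == r and dx == r:
                continue  # skip the center pixel itself
            bit = (padded[dy:dy + h, dx:dx + w] < img).astype(np.uint32)
            code = (code << 1) | bit
    return code

def hamming_cost(c1, c2):
    """Matching cost between two Census code maps: per-pixel Hamming
    distance (number of differing bits)."""
    x = (c1 ^ c2).astype(np.uint32)
    cost = np.zeros_like(x)
    while x.any():
        cost += x & 1
        x >>= 1
    return cost
```

The paper's improvement replaces the single intensity-based Census code with three codes computed on intensity, gradient, and edge maps and fuses their costs; the fused cost would then feed the semi-global aggregation restricted to the estimated disparity range.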
Keywords: stereo matching; semi-global matching; disparity range estimation; cost computation; Census transform
Classification code: TP391 (Automation and Computer Technology: Computer Application Technology)