Authors: 欧阳帅 (OUYANG Shuai), 安博文 (AN Bo-wen), 周凡 (ZHOU Fan), 曹芳 (CAO Fang) (College of Information Engineering, Shanghai Maritime University, Shanghai 201306, China)
Source: Transducer and Microsystem Technologies (《传感器与微系统》), 2017, No. 9, pp. 113-116 (4 pages)
Funding: National Natural Science Foundation of China (61171126); Shanghai Key Support Program (12250501500); Scientific Research Project of the Guangxi Education Department (YB2014207)
Abstract: A fast and efficient mosaic algorithm for unmanned aerial vehicle (UAV) aerial images is proposed to address the low speed and limited registration accuracy of generating large field-of-view images from aerial photographs in maritime supervision. First, the corner search range is narrowed according to the characteristics of aerial images of the supervised maritime area; adaptive gradient and corner-response-function thresholds are used to screen Harris corners, and keeping only local maxima of the corner response function makes the corner distribution uniform. Then, coarse template matching based on phase correlation and fine RANSAC matching with feature constraints are applied to obtain the optimal transformation matrix. Finally, the weighting factor of the traditional weighted-average fusion algorithm is improved according to human visual characteristics so that the stitched images blend together naturally. Experimental results show that the algorithm adapts well and greatly improves stitching efficiency and accuracy over traditional methods.
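The abstract outlines a three-stage pipeline: adaptive Harris corner detection with a uniform corner distribution, phase-correlation coarse matching followed by feature-constrained RANSAC refinement, and improved weighted-average fusion. The Python/OpenCV sketch below illustrates one way such a pipeline can be assembled; it is not the authors' implementation, and the adaptive-threshold formula, the use of ORB descriptors, the 30-pixel consistency tolerance, and the linear blending ramp are all illustrative assumptions.

import cv2
import numpy as np


def detect_corners(gray, max_corners=500, block=7):
    """Harris corners kept only at local maxima of the response map,
    with a threshold derived adaptively from the response statistics."""
    resp = cv2.cornerHarris(np.float32(gray), blockSize=2, ksize=3, k=0.04)
    thresh = resp.mean() + 2.0 * resp.std()                 # assumed form of the adaptive threshold
    local_max = resp == cv2.dilate(resp, np.ones((block, block), np.uint8))
    ys, xs = np.where((resp > thresh) & local_max)
    pts = np.stack([xs, ys], axis=1).astype(np.float32)
    order = np.argsort(-resp[ys, xs])[:max_corners]         # keep the strongest responses
    return pts[order]


def stitch(img1, img2):
    g1 = cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY)
    g2 = cv2.cvtColor(img2, cv2.COLOR_BGR2GRAY)

    # Coarse alignment: global translation estimate from phase correlation.
    (dx, dy), _ = cv2.phaseCorrelate(np.float32(g1), np.float32(g2))
    coarse_mag = np.hypot(dx, dy)

    # Corner detection with adaptive thresholds and local-maximum selection.
    p1, p2 = detect_corners(g1), detect_corners(g2)

    # ORB descriptors stand in here for the paper's constrained template matching.
    orb = cv2.ORB_create()
    k1 = [cv2.KeyPoint(float(x), float(y), 7.0) for x, y in p1]
    k2 = [cv2.KeyPoint(float(x), float(y), 7.0) for x, y in p2]
    k1, d1 = orb.compute(g1, k1)
    k2, d2 = orb.compute(g2, k2)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)

    # Feature constraint: discard matches inconsistent with the coarse shift,
    # then refine with RANSAC to get the optimal transformation matrix.
    src, dst = [], []
    for m in matches:
        x1, y1 = k1[m.queryIdx].pt
        x2, y2 = k2[m.trainIdx].pt
        if abs(np.hypot(x2 - x1, y2 - y1) - coarse_mag) < 30:
            src.append((x1, y1))
            dst.append((x2, y2))
    H, _ = cv2.findHomography(np.float32(dst), np.float32(src), cv2.RANSAC, 3.0)

    # Fusion: warp the second image and blend the overlap with a linear weight ramp,
    # a simple stand-in for the improved weighted-average fusion.
    h, w = img1.shape[:2]
    canvas = cv2.warpPerspective(img2, H, (w * 2, h))
    out = canvas.copy()
    out[:, :w] = img1
    overlap = canvas[:, :w].sum(axis=2) > 0
    alpha = np.tile(np.linspace(1.0, 0.0, w), (h, 1))[..., None]
    blend = (alpha * img1 + (1 - alpha) * canvas[:, :w]).astype(np.uint8)
    out[:, :w][overlap] = blend[overlap]
    return out

As a usage example, calling stitch on a left and right aerial frame read with cv2.imread would return a single mosaic image in the left frame's coordinate system.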
Keywords: aerial image; corner detection; adaptive threshold; image registration; image fusion
Classification: TP317.4 [Automation and Computer Technology / Computer Software and Theory]