Affiliations: [1] ATR Laboratory, National University of Defense Technology, Changsha 410073, China; [2] Air Force Equipment Research Institute, Beijing 100085, China
Source: 《中国图象图形学报》 (Journal of Image and Graphics), 2009, Issue 11, pp. 2373-2377 (5 pages)
Funding: National Natural Science Foundation of China (60302007); National Defense Science and Technology Key Laboratory Foundation (914008004010611; 914008002010705)
Abstract: Accurate sub-pixel image registration is a key problem in image super-resolution reconstruction. Pixel-feature-based optical flow methods, which are widely used in super-resolution reconstruction, struggle to achieve sub-pixel registration accuracy for large motion fields. This paper presents a robust multi-frame image super-resolution reconstruction algorithm based on SIFT (scale-invariant feature transform) features. First, SIFT keypoints and their descriptors are extracted from the input low-resolution image pairs to be registered. Candidate keypoint matches are then selected, outliers are removed with the robust RANSAC (random sample consensus) method, and the translational registration parameters of each image pair are estimated under an assumed translational geometric constraint model. Next, the frame at the center of the field of view (or a designated frame) is chosen as the initial reference, and the final result is obtained with a conventional super-resolution reconstruction framework. Simulation results show that the proposed SIFT-based super-resolution scheme is effective: the reconstructed images outperform those of classical algorithms in both subjective evaluation and objective metrics.
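The registration step described in the abstract (RANSAC outlier rejection under a pure-translation model, followed by sub-pixel refinement on the inliers) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes SIFT matching has already produced corresponding keypoint coordinates `src` and `dst`, and the function name, tolerance, and iteration count are hypothetical choices.

```python
import numpy as np

def ransac_translation(src, dst, n_iters=200, tol=1.0, rng=None):
    """Estimate a pure-translation shift (dx, dy) between two sets of
    matched keypoints src and dst (each an N x 2 array), rejecting
    mismatches by RANSAC.  Returns the sub-pixel shift and inlier mask."""
    rng = np.random.default_rng(rng)
    diffs = dst - src                  # each match proposes a translation
    best_inliers = np.zeros(len(src), dtype=bool)
    for _ in range(n_iters):
        # Minimal sample for a translation model is a single match.
        cand = diffs[rng.integers(len(diffs))]
        inliers = np.linalg.norm(diffs - cand, axis=1) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Least-squares refinement over the inlier consensus set: for a
    # translation model this is just the mean displacement (sub-pixel).
    shift = diffs[best_inliers].mean(axis=0)
    return shift, best_inliers
```

The single-point minimal sample is what makes the translational constraint cheap compared with affine or homography models; the mean over inliers is what yields a sub-pixel estimate even though individual keypoint locations are noisy.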

