Affiliations: [1] School of Marine Science and Technology, Northwestern Polytechnical University, Xi'an 710072, Shaanxi, China; [2] AVIC Metal Materials Physical and Chemical Testing Technology Co., Ltd., Xi'an 710018, Shaanxi, China
Source: Journal of Northwestern Polytechnical University (《西北工业大学学报》), 2014, No. 4, pp. 569-575 (7 pages)
Funding: Supported by the Aerospace Science and Technology Innovation Fund (CASC201102)
Abstract: In precision-guided weapon systems, fused images obtained by traditional methods blur the infrared target, yielding low recognition rates and poor localization; they also fail to inherit the color characteristics of the visible image, which produces spectral distortion. We propose a fusion method for infrared and visible images based on region segmentation and the lifting wavelet transform. First, an image segmentation method combining region growing with edge extraction separates the background and target regions of the infrared image. Second, a pixel-neighborhood maximum-energy rule maps the infrared target region into the visible background. Finally, the fused image from the previous step and the original images undergo a lifting wavelet fusion that applies a weighted average to the low-frequency band and selects high-frequency coefficients by average gradient, preventing the loss of important information caused by stitching errors introduced during segmentation. Experimental results show that the fused image highlights the target against a natural background, enables accurate localization and rapid recognition, and offers useful guidance for detecting hidden targets.
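The abstract describes the pipeline only at a high level; the sketch below illustrates one plausible reading of its three steps in Python with NumPy/SciPy. It is not the paper's implementation: segmentation is reduced to a fixed intensity threshold instead of region growing with edge extraction, a one-level Haar lifting transform stands in for whatever lifting wavelet the authors used, the mapped image is fused against the original infrared image as an assumption, and all names and parameters (`fuse`, `haar_lift2`, `ir_thresh`, `w_low`, the window sizes) are illustrative.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def haar_lift2(a):
    """One-level 2-D Haar lifting transform; `a` needs even height and width.
    Returns the (LL, LH, HL, HH) subbands."""
    a = a.astype(np.float64)
    d = a[:, 1::2] - a[:, 0::2]          # predict step: horizontal detail
    s = a[:, 0::2] + d / 2.0             # update step: horizontal approximation
    def lift_cols(x):
        dd = x[1::2, :] - x[0::2, :]
        return x[0::2, :] + dd / 2.0, dd
    LL, LH = lift_cols(s)
    HL, HH = lift_cols(d)
    return LL, LH, HL, HH

def haar_unlift2(LL, LH, HL, HH):
    """Inverse of haar_lift2."""
    def unlift_cols(ss, dd):
        e = ss - dd / 2.0                # undo update
        o = dd + e                       # undo predict
        out = np.empty((2 * e.shape[0], e.shape[1]))
        out[0::2, :], out[1::2, :] = e, o
        return out
    s = unlift_cols(LL, LH)
    d = unlift_cols(HL, HH)
    e = s - d / 2.0
    o = d + e
    out = np.empty((e.shape[0], 2 * e.shape[1]))
    out[:, 0::2], out[:, 1::2] = e, o
    return out

def avg_gradient(x, size=3):
    """Local mean gradient magnitude, used to select high-frequency coefficients."""
    gy, gx = np.gradient(x)
    return uniform_filter(np.hypot(gx, gy), size=size)

def fuse(ir, vis, ir_thresh=0.6, w_low=0.5):
    """Fuse a co-registered infrared/visible pair (float arrays in [0, 1])."""
    # Step 1 (simplified): a fixed threshold stands in for the paper's
    # region-growing-plus-edge-extraction segmentation of the IR target.
    target = ir > ir_thresh

    # Step 2: pixel-neighborhood maximum-energy rule maps the IR target
    # region into the visible background.
    e_ir = uniform_filter(ir ** 2, size=5)
    e_vis = uniform_filter(vis ** 2, size=5)
    mapped = np.where(target & (e_ir > e_vis), ir, vis)

    # Step 3: lifting-wavelet fusion of the mapped image with the original
    # IR image: weighted average in the low-frequency band, larger local
    # average gradient wins in each high-frequency band.
    A = haar_lift2(mapped)
    B = haar_lift2(ir)
    LL = w_low * A[0] + (1.0 - w_low) * B[0]
    highs = [np.where(avg_gradient(a) >= avg_gradient(b), a, b)
             for a, b in zip(A[1:], B[1:])]
    return np.clip(haar_unlift2(LL, *highs), 0.0, 1.0)
```

The inputs are assumed to be single-channel images of equal, even dimensions; the average-gradient selection in step 3 mirrors the abstract's rationale that high-frequency coefficients with stronger local detail should survive, so a stitching error in one source does not wipe out detail present in the other.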
CLC Number: TP391 [Automation and Computer Technology - Computer Application Technology]