Affiliation: [1] Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, Changchun 130033, Jilin, China
Source: Infrared and Laser Engineering (《红外与激光工程》), 2014, No. 8, pp. 2757-2764 (8 pages)
Funding: National High Technology Research and Development Program of China (863 Program) (2007AA12Z113)
Abstract: To improve the quality of multispectral and panchromatic image fusion, a fusion algorithm based on region mutual information is proposed. First, the multispectral image is transformed into the HSV color space, and the V component is segmented by watershed segmentation followed by region merging, with the Euclidean spectral distance serving as the merging criterion, to produce a region segmentation map. Second, the V component of the multispectral image and the panchromatic image are decomposed at multiple resolutions with the nonsubsampled Contourlet transform (NSCT); the segmentation map is projected onto the panchromatic image, and the decomposition coefficients are fused by computing the mutual information between corresponding regions, yielding the coefficients of the fused image. Finally, the fused image is reconstructed by the inverse NSCT. Comparative experiments show that the proposed algorithm fully preserves the spectral information of the multispectral image while injecting as much detail from the panchromatic image as possible, effectively enhancing the edge features of the multispectral image.
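The following is a minimal Python sketch of the pipeline described in the abstract, not the authors' implementation. NSCT has no standard Python package, so PyWavelets' stationary wavelet transform (pywt.swt2 / pywt.iswt2) stands in for the multiresolution decomposition; region merging by Euclidean spectral distance is omitted; and the per-region soft weighting of detail coefficients by mutual information is one plausible reading of the fusion rule. The function names (fuse, region_mi) and the parameters markers, levels, and bins are illustrative assumptions.

import numpy as np
import pywt
from skimage import color, filters, segmentation
from sklearn.metrics import mutual_info_score

def region_mi(a, b, bins=32):
    """Histogram-based mutual information between two coefficient sets."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    # mutual_info_score accepts a precomputed contingency table
    return mutual_info_score(None, None, contingency=joint)

def fuse(ms_rgb, pan, levels=2, markers=300):
    """Fuse a multispectral RGB image with a co-registered panchromatic band.

    Image sides must be multiples of 2**levels (a requirement of pywt.swt2).
    """
    hsv = color.rgb2hsv(ms_rgb)
    v = hsv[..., 2]
    # Step 1: watershed segmentation of the V-channel gradient image (the
    # paper additionally merges regions by Euclidean spectral distance).
    labels = segmentation.watershed(filters.sobel(v), markers=markers)
    # Step 2: undecimated multiresolution decomposition of V and pan; the
    # subbands keep the full image size, so the region map applies directly.
    cv = pywt.swt2(v, "db2", level=levels)
    cp = pywt.swt2(pan, "db2", level=levels)
    fused = []
    for (a_v, d_v), (a_p, d_p) in zip(cv, cp):
        bands = []
        for bv, bp in zip(d_v, d_p):
            # Step 3: per region, inject panchromatic detail in proportion
            # to its mutual information with the V detail (placeholder rule).
            mi = {r: region_mi(bv[labels == r], bp[labels == r])
                  for r in np.unique(labels)}
            top = max(mi.values()) + 1e-12
            out = np.empty_like(bv)
            for r, m in mi.items():
                mask = labels == r
                w = m / top
                out[mask] = w * bp[mask] + (1.0 - w) * bv[mask]
            bands.append(out)
        # Keep the multispectral approximation to preserve spectral content.
        fused.append((a_v, tuple(bands)))
    # Step 4: inverse transform of the fused coefficients, then back to RGB.
    hsv[..., 2] = np.clip(pywt.iswt2(fused, "db2"), 0.0, 1.0)
    return color.hsv2rgb(hsv)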
Keywords: image fusion; image segmentation; nonsubsampled Contourlet transform; region mutual information
Classification code: TP391 [Automation and Computer Technology / Computer Application Technology]