Authors: 王慧赢 (WANG Huiying); 王春平 (WANG Chunping); 付强 (FU Qiang); 韩子硕 (HAN Zishuo); 张冬冬 (ZHANG Dongdong) (Department of Electronic and Optical Engineering, Shijiazhuang Campus, Army Engineering University of PLA, Shijiazhuang 050003, China; Unit 32356 of PLA, Xining 710003, China)
Affiliations: [1] Department of Electronic and Optical Engineering, Shijiazhuang Campus, Army Engineering University of PLA, Shijiazhuang 050003, Hebei, China; [2] Unit 32356 of PLA, Xining 710003, Qinghai, China
Source: Systems Engineering and Electronics (《系统工程与电子技术》), 2023, No. 8, pp. 2395-2404 (10 pages)
Funding: Supported by a military scientific research project (LJ20191A040155).
Abstract: Aiming at the problem that target and background information is difficult to analyze independently in scenes captured as infrared images and low-illumination images, an infrared and low-illumination image fusion algorithm based on image features is proposed. First, the infrared and low-illumination images are each preprocessed to optimize their respective characteristics. Second, the non-subsampled shearlet transform (NSST) is used to decompose both images into high- and low-frequency components. Third, an improved Laplacian weighted algorithm is used to fuse the low-frequency images, and an improved pulse coupled neural network (PCNN) is used to fuse the high-frequency images. Finally, the fused image is obtained by the inverse non-subsampled shearlet transform. Experimental results show that the proposed algorithm can effectively fuse infrared and low-illumination images, reduce the impact of noise on the fused image, and improve its clarity, while keeping the contrast moderate and consistent with human visual perception.
Keywords: low-illumination image; infrared image; non-subsampled shearlet transform; pulse coupled neural network; image fusion algorithm
Classification: TP391.41 [Automation and Computer Technology - Computer Application Technology]
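
The abstract describes a multi-step pipeline: decompose each source image into low- and high-frequency components, fuse the low-frequency parts with a Laplacian-based weighting, fuse the high-frequency parts with a PCNN rule, then reconstruct. The following is a minimal, hypothetical sketch of that pipeline structure only: it substitutes a simple Gaussian low-pass/high-pass split for the NSST and a local-energy "choose-max" rule for the improved PCNN, so it is not the authors' method. All function names, the filter size, and the sigma value are illustrative assumptions.

```python
# Simplified sketch of a two-band infrared / low-illumination fusion pipeline.
# NOTE: Gaussian band split stands in for NSST; local-energy choose-max stands
# in for the improved PCNN; Laplacian-energy weighting approximates the
# improved Laplacian weighted rule. Names and parameters are hypothetical.
import numpy as np
from scipy.ndimage import gaussian_filter, laplace, uniform_filter

def split_bands(img, sigma=2.0):
    """Split an image into a low-frequency base and a high-frequency detail layer."""
    low = gaussian_filter(img, sigma)
    return low, img - low

def fuse_low(low_ir, low_vis):
    """Weight the low-frequency layers by local Laplacian energy."""
    e_ir = uniform_filter(np.abs(laplace(low_ir)), size=7) + 1e-8
    e_vis = uniform_filter(np.abs(laplace(low_vis)), size=7) + 1e-8
    w = e_ir / (e_ir + e_vis)
    return w * low_ir + (1.0 - w) * low_vis

def fuse_high(high_ir, high_vis):
    """Keep the detail coefficient with larger local energy (choose-max rule)."""
    e_ir = uniform_filter(high_ir ** 2, size=7)
    e_vis = uniform_filter(high_vis ** 2, size=7)
    return np.where(e_ir >= e_vis, high_ir, high_vis)

def fuse(ir, vis, sigma=2.0):
    """Fuse a registered infrared / low-illumination pair (float arrays, same shape)."""
    low_ir, high_ir = split_bands(ir, sigma)
    low_vis, high_vis = split_bands(vis, sigma)
    return fuse_low(low_ir, low_vis) + fuse_high(high_ir, high_vis)
```

In this sketch the reconstruction step is simply the sum of the fused base and detail layers, which mirrors the role of the inverse NSST in the paper's pipeline; a faithful reproduction would require an NSST implementation and the improved PCNN firing model described in the article.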